vitallens.js is the official JavaScript client for the VitalLens API, a service for estimating physiological vital signs like heart rate, respiratory rate, and heart rate variability (HRV) from facial video.
Using a different language or platform? We also have a Python client and iOS app.
- **Cross-Platform Compatibility:** Use vitallens.js in the browser or Node.js.
- **Flexible Input Support:** Process video files or live streams from a webcam or any `MediaStream`.
- **Multiple Estimation Methods:** Choose the method that fits your needs:
  - `vitallens`: Provides heart rate, respiratory rate, and heart rate variability estimates. (Automatically selects the best available model for your plan. Requires an API key - get one for free on our website.)
  - `g`, `chrom`, `pos`: Offer less accurate heart rate estimates. (No API key required.)
- **Fast Face Detection & ROI Support:** Perform rapid face detection when required, or optionally pass a global region of interest (ROI) to skip detection for even faster processing.
- **Pre-Built Web Component Widgets:** In addition to the core API, vitallens.js provides ready-to-use web components. Try our simple vitals monitor widget (as seen in the above gif), or use an advanced widget showing vitals, video, and waveforms (supports both file and webcam modes).
Install vitallens.js via npm or yarn:

```bash
npm install vitallens
# or
yarn add vitallens
```

Then use it as follows:
```js
import { VitalLens } from 'vitallens';
const vl = new VitalLens({ method: 'vitallens', apiKey: 'YOUR_API_KEY' });
const result = await vl.processVideoFile(myVideoFile);
console.log(result);
```

For browser usage, you have two options: using the pre-built Web Components or importing the core `VitalLens` class directly.
Option 1: Using the Web Components (Recommended)
This is the easiest way to get started. Just add the module script from a CDN, and you can use the custom elements directly in your HTML.
```html
<script type="module" src="https://cdn.jsdelivr.net/npm/vitallens/dist/vitallens.browser.js"></script>

<!-- Easy-to-use vitals monitor with basic readings -->
<vitallens-vitals-monitor api-key="YOUR_API_KEY"></vitallens-vitals-monitor>

<!-- Advanced widget. Supports both file and webcam inputs, video and waveform views -->
<vitallens-widget api-key="YOUR_API_KEY"></vitallens-widget>
```

Option 2: Using the Core API
If you need more control, you can import the VitalLens class directly into your own JavaScript module.
```html
<video id="my-video" autoplay muted playsinline></video>

<script type="module">
  // Import latest version from jsDelivr (recommended)
  import { VitalLens } from 'https://cdn.jsdelivr.net/npm/vitallens';
  // Or pin a specific version:
  // import { VitalLens } from 'https://cdn.jsdelivr.net/npm/[email protected]';
  // Or use Skypack:
  // import { VitalLens } from 'https://cdn.skypack.dev/vitallens';

  (async () => {
    try {
      const videoElement = document.getElementById('my-video');
      const stream = await navigator.mediaDevices.getUserMedia({ video: true });
      videoElement.srcObject = stream;
      const vl = new VitalLens({ method: 'vitallens', apiKey: 'YOUR_API_KEY' });
      await vl.setVideoStream(stream, videoElement); // Use await here
      vl.addEventListener('vitals', (data) => console.log(data));
      vl.startVideoStream();
    } catch (err) {
      console.error("Failed to start VitalLens:", err);
    }
  })();
</script>
```

When creating a new VitalLens instance, you can configure various options:
| Parameter | Description | Default |
|---|---|---|
| `method` | Inference method: `vitallens`, `g`, `chrom`, or `pos`. | `vitallens` |
| `apiKey` | API key for the VitalLens API (required for method `vitallens`). | `null` |
| `globalRoi` | Optional region of interest for face detection (object with `{ x0, y0, x1, y1 }`). | `undefined` |
| `waveformMode` | Optional setting for how the waveform is returned: `incremental`, `windowed`, or `complete`. | (see below) |
| `overrideFpsTarget` | Override for the target frames per second (fps) used during inference. | `undefined` |
| `fDetFs` | Frequency (in Hz) at which face detection should be performed. | `1` |
The default value for `waveformMode` is `windowed` if a stream is being analyzed, and `complete` if a file is being processed.
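For illustration, a configuration that combines several of these options might look like the following sketch (the specific values shown are arbitrary placeholders, not recommendations):

```js
import { VitalLens } from 'vitallens';

// A configuration sketch using the parameters from the table above.
// The concrete values below are placeholders; adjust them to your setup.
const vl = new VitalLens({
  method: 'vitallens',       // use the VitalLens API
  apiKey: 'YOUR_API_KEY',    // required for method 'vitallens'
  waveformMode: 'windowed',  // default for streams; 'complete' is the default for files
  overrideFpsTarget: 30,     // optionally override the target fps used during inference
  fDetFs: 1,                 // run face detection once per second
  // Alternatively, pass a fixed region of interest to skip face detection:
  // globalRoi: { x0: 100, y0: 50, x1: 400, y1: 350 },
});
```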
You can choose from several rPPG methods:
- `vitallens`: The recommended method. Uses the VitalLens API and automatically selects the best model for your API key (e.g., VitalLens 2.0 with HRV support).
- `vitallens-2.0`: Forces the use of the VitalLens 2.0 model.
- `vitallens-1.0`: Forces the use of the VitalLens 1.0 model.
- `vitallens-1.1`: Forces the use of the VitalLens 1.1 model.
- `pos`, `chrom`, `g`: Classic rPPG algorithms that run locally and do not require an API key (see the sketch below).
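For example, running one of the classic methods locally requires no API key. A minimal sketch, where `myVideoFile` is a video `File` as in the quick-start above; expect less accurate estimates than with the API-backed methods:

```js
import { VitalLens } from 'vitallens';

// 'pos' runs locally and needs no API key.
const vl = new VitalLens({ method: 'pos' });
const result = await vl.processVideoFile(myVideoFile);
console.log(result.vital_signs.heart_rate);
```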
When analyzing a video stream, VitalLens returns estimation results continuously. Each returned estimation result contains the following vital signs:
| Name | Type | Based on / containing | Returned if |
|---|---|---|---|
| `ppg_waveform` | Continuous waveform | Depends on `waveformMode` | Always |
| `heart_rate` | Global value | Up to last 10 seconds | Face present for at least 5 seconds |
| `respiratory_waveform` | Continuous waveform | Depends on `waveformMode` | Using `vitallens`, `vitallens-1.0`, `vitallens-1.1`, or `vitallens-2.0` |
| `respiratory_rate` | Global value | Up to last 30 seconds | Face present for at least 10 seconds using `vitallens`, `vitallens-1.0`, `vitallens-1.1`, or `vitallens-2.0` |
| `hrv_sdnn` | Global value | Up to last 60 seconds | Face present for at least 20 seconds using `vitallens` or `vitallens-2.0` |
| `hrv_rmssd` | Global value | Up to last 60 seconds | Face present for at least 20 seconds using `vitallens` or `vitallens-2.0` |
| `hrv_lfhf` | Global value | Up to last 60 seconds | Face present for at least 55 seconds using `vitallens` or `vitallens-2.0` |
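Since the longer-window vitals only appear once enough face time has accumulated, it is safest to guard against missing fields when consuming streaming results. A minimal sketch, assuming the other global vitals share the `{ value, unit, confidence }` shape documented for `heart_rate` below:

```js
vl.addEventListener('vitals', (result) => {
  const vitals = result.vital_signs;
  // heart_rate appears after ~5 s of face time; HRV metrics take longer (see table above).
  if (vitals.heart_rate) {
    console.log(`Heart rate: ${vitals.heart_rate.value} ${vitals.heart_rate.unit}`);
  }
  if (vitals.respiratory_rate) {
    console.log(`Respiratory rate: ${vitals.respiratory_rate.value} ${vitals.respiratory_rate.unit}`);
  }
  if (vitals.hrv_sdnn) {
    console.log(`HRV (SDNN): ${vitals.hrv_sdnn.value} ${vitals.hrv_sdnn.unit}`);
  }
});
```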
When analyzing a video file, VitalLens returns one estimation result for the entire file, containing:
| Name | Type | Based on / containing | Returned if |
|---|---|---|---|
| `ppg_waveform` | Continuous waveform | Depends on `waveformMode` | Always |
| `heart_rate` | Global value | Entire video | Video is at least 5 seconds long |
| `respiratory_waveform` | Continuous waveform | Depends on `waveformMode` | Using `vitallens`, `vitallens-1.0`, `vitallens-1.1`, or `vitallens-2.0` |
| `respiratory_rate` | Global value | Entire video | Video is at least 10 seconds long using `vitallens`, `vitallens-1.0`, `vitallens-1.1`, or `vitallens-2.0` |
| `hrv_sdnn` | Global value | Entire video | Face present for at least 20 seconds using `vitallens` or `vitallens-2.0` |
| `hrv_rmssd` | Global value | Entire video | Face present for at least 20 seconds using `vitallens` or `vitallens-2.0` |
| `hrv_lfhf` | Global value | Entire video | Face present for at least 55 seconds using `vitallens` or `vitallens-2.0` |
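Putting this together, a file-processing sketch might look as follows (reusing `myVideoFile` from the quick-start, assuming the video is long enough for the respective vitals, and assuming the global vitals share the `{ value, unit }` shape documented for `heart_rate` in the result structure below):

```js
const vl = new VitalLens({ method: 'vitallens', apiKey: 'YOUR_API_KEY' });
const result = await vl.processVideoFile(myVideoFile);

// Global values computed over the entire video.
console.log('Heart rate:', result.vital_signs.heart_rate?.value, result.vital_signs.heart_rate?.unit);
console.log('Respiratory rate:', result.vital_signs.respiratory_rate?.value);

// One timestamp per processed frame; waveformMode defaults to 'complete' for files.
console.log('Processed frames:', result.time.length);
```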
The library returns vital sign estimates in a structured object. vitallens.js is designed to process only a single face — so you always receive a single result object with the following structure:
```ts
export interface VitalLensResult {
  face: {
    // Detected face coordinates for each frame, formatted as [x0, y0, x1, y1].
    coordinates: Array<[number, number, number, number]>;
    // Confidence values for the face per frame.
    confidence: number[];
    // An explanatory note regarding the face detection.
    note: string;
  };
  vital_signs: {
    // Estimated global heart rate.
    heart_rate: {
      // Estimated heart rate value.
      value: number;
      // Unit of the heart rate value.
      unit: string;
      // Overall confidence of the heart rate estimation.
      confidence: number;
      // An explanatory note regarding the estimation.
      note: string;
    };
    // Other vitals...
  };
  // A list of timestamps (one per processed frame).
  time: number[];
  // The frames per second (fps) of the input video.
  fps: number;
  // The effective fps used for inference.
  estFps: number;
  // A message providing additional information about the estimation.
  message: string;
}
```

When using the core `VitalLens` class, you are responsible for managing the instance's lifecycle. This involves controlling the stream and listening for events.
You can control a live video stream at any time using these methods:
- `vl.startVideoStream()`: Starts or resumes processing.
- `vl.pauseVideoStream()`: Pauses processing. The webcam stays on, but no new data is sent.
- `vl.stopVideoStream()`: Stops processing, stops the webcam, and clears all internal buffers.
```js
// Example: A simple pause/resume button
let isProcessing = true;
myButton.onclick = () => {
  if (isProcessing) {
    vl.pauseVideoStream();
    myButton.textContent = 'Resume';
  } else {
    vl.startVideoStream();
    myButton.textContent = 'Pause';
  }
  isProcessing = !isProcessing;
};
```

Your application should listen for events to receive data and handle errors.
- **`vitals`** (On Success): This event fires continuously during a stream whenever a new vital sign packet is ready.

  ```js
  vl.addEventListener('vitals', (result) => {
    console.log('Vitals:', result);
    // Update your UI here
  });
  ```
- **`streamReset`** (On Recoverable Error): This event fires if the network becomes too unstable. The library logs a `VitalLensAPIError` for debugging, stops the stream to prevent a crash, and emits this event. Your application should listen for this to handle the reset. **Best practice:** Listen for this event, notify the user, and automatically restart the stream.

  ```js
  vl.addEventListener('streamReset', (eventData) => {
    console.warn('Stream was reset:', eventData.message);

    // 1. Notify the user and stop the old stream
    showMyErrorPopup('Connection unstable. Reconnecting...');
    vl.stopVideoStream();

    // 2. Wait 3 seconds and restart
    setTimeout(async () => {
      try {
        const stream = await navigator.mediaDevices.getUserMedia({ video: true });
        videoElement.srcObject = stream;
        await vl.setVideoStream(stream, videoElement);
        vl.startVideoStream();
        hideMyErrorPopup();
      } catch (err) {
        console.error('Failed to restart stream:', err);
        showMyErrorPopup('Could not reconnect. Please try again manually.');
      }
    }, 3000);
  });
  ```

- **`fileProgress`** (File Processing): This event fires multiple times when calling `processVideoFile` to provide text updates on the processing stages.
  ```js
  vl.addEventListener('fileProgress', (message) => {
    console.log('File progress:', message); // e.g., "Detecting faces..."
    showLoadingSpinner(message);
  });
  ```

Before running any of the examples, make sure to build the project by executing:
```bash
npm run build
```

Also, note that each example requires an API key. Replace `YOUR_API_KEY` with your actual API key when running the examples.
- **Browser - Vitals Monitor:** `examples/browser/vitals_monitor.html`. To run this example, execute:
  ```bash
  API_KEY=YOUR_API_KEY npm run start:browser
  ```
- **Browser - Advanced Widget:** `examples/browser/widget.html`. To run this example, execute:
  ```bash
  API_KEY=YOUR_API_KEY npm run start:browser-widget
  ```
- **Browser - Minimal File Input:** `examples/browser/file_minimal.html`. To run this example, execute:
  ```bash
  API_KEY=YOUR_API_KEY npm run start:browser-file-minimal
  ```
- **Browser - Minimal Webcam Input:** `examples/browser/webcam_minimal.html`. To run this example, execute:
  ```bash
  API_KEY=YOUR_API_KEY npm run start:browser-webcam-minimal
  ```
- **Node - File Processing:** `examples/node/file.js`. To run this example, execute:
  ```bash
  API_KEY=YOUR_API_KEY npm run start:node-file
  ```
Try opening the HTML examples in your browser or running the Node script to see vitallens.js in action.
- **Error in Chrome:** `Refused to cross-origin redirects...`

  This error occurs in some browsers when the HTML file is opened directly from your computer, because browser security policies prevent advanced features from running in locally opened files. **Solution:** Serve your HTML file from a local web server. In your file's directory, run:

  ```bash
  npx serve
  ```
- **When to Use the Self-Contained Library:** Use the self-contained library if your app must run in an environment that blocks requests to public CDNs (like an offline app or behind a corporate firewall). **Warning:** This file is large and will significantly slow initial page load.

  ```html
  <script src="https://cdn.jsdelivr.net/npm/[email protected]/dist/vitallens.browser.selfcontained.js"></script>
  ```
For security reasons, we recommend that you do not expose your API key directly in client-side code. There are two primary approaches to secure your API key:
If you are building a server-side application using Node.js, your API key remains securely on your server. Simply call the API directly from your backend code without exposing your credentials.
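For instance, a server-side script might read the key from an environment variable rather than hardcoding it. This is only a sketch: whether the Node build of `processVideoFile` accepts a file path, a buffer, or another input type is an assumption here, so check `examples/node/file.js` for the exact usage.

```js
import { VitalLens } from 'vitallens';

// The API key stays on the server, e.g. in an environment variable.
const vl = new VitalLens({
  method: 'vitallens',
  apiKey: process.env.VITALLENS_API_KEY,
});

// Assumption: the Node build accepts a local video file reference here;
// see examples/node/file.js for the exact input type it expects.
const result = await vl.processVideoFile('./recording.mp4');
console.log(result.vital_signs.heart_rate);
```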
If you need to use vitallens.js in a browser, you can set up a proxy server. The proxy server receives requests from the client, attaches your API key (stored securely on the server), and forwards the request to the VitalLens API. This way, the API key is never exposed to the client.
Our client library supports this by accepting a `proxyUrl` option. For example:
```js
import { VitalLens } from 'vitallens';
const vl = new VitalLens({
  method: 'vitallens',
  proxyUrl: 'https://your-proxy-server.com/api' // URL to your deployed proxy server
});
```

Or when using one of our widgets:

```html
<vitallens-widget proxy-url="https://your-proxy-server.com/api"></vitallens-widget>
```

Below is a simple Node.js/Express proxy server implementation that you can use as a starting point:
```js
const express = require('express');
const bodyParser = require('body-parser');
const cors = require('cors');

const app = express();
const PORT = process.env.PORT || 3000;

// Securely store your API key in an environment variable
const API_KEY = process.env.VITALLENS_API_KEY;
const VITALLENS_ENDPOINT = 'https://api.rouast.com/vitallens-v3/file';

app.use(bodyParser.json({ limit: '10mb' }));

// Enable CORS for your allowed domain.
app.use(cors({
  origin: 'http://example.com', // Your allowed domain
  methods: ['GET', 'POST', 'OPTIONS'],
  allowedHeaders: ['Content-Type', 'Authorization']
}));

app.post('/', async (req, res) => {
  try {
    const response = await fetch(VITALLENS_ENDPOINT, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'x-api-key': API_KEY,
      },
      body: JSON.stringify(req.body),
    });
    const data = await response.text();
    res.status(response.status).send(data);
  } catch (error) {
    console.error('Proxy error:', error);
    res.status(500).send('Internal server error');
  }
});

app.listen(PORT, () => {
  console.log(`Proxy server listening on port ${PORT}`);
});
```

You can deploy this proxy server on any Node.js hosting platform (such as Heroku, Vercel, or your own server) and then set the URL as the `proxyUrl` in your VitalLens client configuration.
To build the project from source, run:
```bash
npm run build
```

This compiles the TypeScript source and bundles the output for both Node (ESM and CommonJS) and the browser.
Execute the test suite with:
```bash
npm run test
```

For environment-specific tests, you can use:

```bash
npm run test:browser
npm run test:node
npm run test:browser-integration
npm run test:node-integration
```

Run specific tests:
```bash
npx jest test/core/VitalLens.browser.test.ts
```

Lint the code using:
```bash
npm run lint
```

**Important:** vitallens.js provides vital sign estimates for general wellness purposes only. It is not intended for medical use. Always consult a healthcare professional for any medical concerns or precise clinical measurements.
Please review our Terms of Service and Privacy Policy for more details.
This project is licensed under the MIT License.
