What We're Building
By the end of this tutorial, you'll have a single-page web app that opens your webcam, captures a video frame, sends it to the Beam AI rPPG API, and displays your heart rate in BPM. The entire client is under 60 lines of JavaScript.
The underlying technology is remote photoplethysmography (rPPG) — detecting the subtle skin color changes caused by blood flow through facial capillaries. The Beam AI API handles all the hard parts: face detection, region-of-interest extraction, signal processing, and noise filtering. You just send a frame and get a number back.
Prerequisites
- A device with a webcam (laptop, phone, tablet)
- A Beam AI API key — get one here (free, takes 30 seconds)
- Basic JavaScript knowledge
Want to see the end result first? The live demo runs the full pipeline from your webcam — no signup required. Come back here when you're ready to build your own.
1. Set Up the HTML
Create an index.html file. We need three elements: a video feed for the webcam preview, a canvas (hidden) to capture frames, and a display area for the results.
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>Heart Rate Monitor</title>
  <style>
    body { font-family: sans-serif; text-align: center; padding: 2rem; }
    video { width: 320px; border-radius: 8px; }
    #bpm { font-size: 3rem; font-weight: bold; margin: 1rem 0; }
    button { padding: 12px 24px; font-size: 1rem; cursor: pointer; }
  </style>
</head>
<body>
  <h1>Webcam Heart Rate Monitor</h1>
  <video id="webcam" autoplay playsinline></video>
  <canvas id="frame" style="display:none"></canvas>
  <div id="bpm">-- BPM</div>
  <div id="confidence"></div>
  <button onclick="measureHeartRate()">Measure Heart Rate</button>
  <script>
    // We'll fill this in next
  </script>
</body>
</html>
2. Capture the Webcam Feed
Use getUserMedia to request camera access. We'll stream the video into the <video> element so the user gets a live preview of themselves.
const video = document.getElementById('webcam');
const canvas = document.getElementById('frame');
const ctx = canvas.getContext('2d');

async function startCamera() {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: 'user', width: 640, height: 480 }
  });
  video.srcObject = stream;
}

startCamera();
We request 640×480 resolution. Higher resolution doesn't improve accuracy for rPPG — the signal comes from average color values across skin regions, not pixel-level detail. 640×480 keeps the payload small and the API response fast.
3. Capture a Frame and Send It to the API
When the user clicks "Measure Heart Rate," we draw the current video frame onto the hidden canvas, export it as a base64 JPEG, and POST it to the Beam AI API.
const API_KEY = 'YOUR_API_KEY'; // from /keys

async function measureHeartRate() {
  // Capture current frame
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  ctx.drawImage(video, 0, 0);
  const frameData = canvas.toDataURL('image/jpeg', 0.8);

  document.getElementById('bpm').textContent = 'Measuring...';

  try {
    const response = await fetch('https://api.beamai.co/v1/analyze', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${API_KEY}`
      },
      body: JSON.stringify({ frame: frameData })
    });
    const data = await response.json();

    if (data.success) {
      const hr = data.vitals.heart_rate;
      const conf = data.vitals.confidence;
      document.getElementById('bpm').textContent = `${hr} BPM`;
      document.getElementById('confidence').textContent =
        `Confidence: ${Math.round(conf * 100)}%`;
    } else {
      document.getElementById('bpm').textContent = 'Try again';
    }
  } catch (err) {
    document.getElementById('bpm').textContent = 'Error';
    console.error('API call failed:', err);
  }
}
The API returns a JSON response with the same structure you see in the quickstart docs. The key fields:
- vitals.heart_rate — BPM as an integer (typical resting range: 60-100)
- vitals.confidence — 0.0 to 1.0 indicating measurement quality
- vitals.hrv — heart rate variability in milliseconds
- vitals.spo2 — estimated blood oxygen percentage
- vitals.stress_index — Baevsky stress index
- vitals.respiration_rate — breaths per minute
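Put together, a successful response has roughly this shape (the values below are made up for illustration; the quickstart docs are the authoritative schema):

```javascript
// Illustrative /v1/analyze response body (example values only)
const sample = {
  success: true,
  vitals: {
    heart_rate: 72,        // BPM, integer
    confidence: 0.93,      // 0.0 to 1.0
    hrv: 48,               // milliseconds
    spo2: 98,              // percent
    stress_index: 110,     // Baevsky stress index
    respiration_rate: 14   // breaths per minute
  }
};
```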
4. The Complete App
Here's everything assembled into one file. Copy this, replace YOUR_API_KEY, and open it in a browser.
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>Heart Rate Monitor</title>
  <style>
    body {
      font-family: system-ui, sans-serif;
      max-width: 480px; margin: 2rem auto;
      text-align: center; background: #111; color: #eee;
    }
    video { width: 100%; border-radius: 12px; margin: 1rem 0; }
    #bpm { font-size: 4rem; font-weight: 700; color: #00d4aa; }
    #confidence { color: #888; margin-bottom: 1rem; }
    button {
      padding: 14px 32px; font-size: 1rem;
      background: #00d4aa; color: #111; border: none;
      border-radius: 8px; cursor: pointer; font-weight: 600;
    }
    button:hover { opacity: 0.9; }
    button:disabled { opacity: 0.5; cursor: not-allowed; }
  </style>
</head>
<body>
  <h1>Heart Rate Monitor</h1>
  <video id="webcam" autoplay playsinline></video>
  <canvas id="frame" style="display:none"></canvas>
  <div id="bpm">-- BPM</div>
  <div id="confidence">Point your face at the camera</div>
  <button id="btn" onclick="measureHeartRate()">
    Measure Heart Rate
  </button>
  <script>
    const API_KEY = 'YOUR_API_KEY';
    const video = document.getElementById('webcam');
    const canvas = document.getElementById('frame');
    const ctx = canvas.getContext('2d');

    navigator.mediaDevices.getUserMedia({
      video: { facingMode: 'user', width: 640, height: 480 }
    }).then(stream => { video.srcObject = stream; })
      .catch(err => {
        document.getElementById('confidence').textContent =
          'Camera access failed: ' + err.message;
      });

    async function measureHeartRate() {
      const btn = document.getElementById('btn');
      btn.disabled = true;
      btn.textContent = 'Measuring...';
      document.getElementById('bpm').textContent = '...';
      document.getElementById('confidence').textContent = 'Hold still';

      canvas.width = video.videoWidth;
      canvas.height = video.videoHeight;
      ctx.drawImage(video, 0, 0);
      const frameData = canvas.toDataURL('image/jpeg', 0.8);

      try {
        const res = await fetch('https://api.beamai.co/v1/analyze', {
          method: 'POST',
          headers: {
            'Content-Type': 'application/json',
            'Authorization': `Bearer ${API_KEY}`
          },
          body: JSON.stringify({ frame: frameData })
        });
        const data = await res.json();

        if (data.success) {
          document.getElementById('bpm').textContent =
            data.vitals.heart_rate + ' BPM';
          document.getElementById('confidence').textContent =
            'Confidence: ' + Math.round(data.vitals.confidence * 100) + '%';
        } else {
          document.getElementById('bpm').textContent = 'No reading';
          document.getElementById('confidence').textContent =
            'Ensure good lighting and stay still';
        }
      } catch (e) {
        document.getElementById('bpm').textContent = 'Error';
        document.getElementById('confidence').textContent = e.message;
      }

      btn.disabled = false;
      btn.textContent = 'Measure Again';
    }
  </script>
</body>
</html>
Tips for Better Readings
How closely rPPG tracks contact sensors depends heavily on conditions. A few things that make a measurable difference:
- Lighting: Even, diffuse indoor lighting is ideal. Avoid backlighting, direct sunlight, and flickering fluorescents.
- Stillness: Head movement is the primary accuracy killer. Have the user sit still and look at the camera for 10-15 seconds before capturing.
- Framing: The face should fill at least 30% of the frame. Too far away reduces the signal-to-noise ratio.
- Confidence gating: Don't display readings with confidence below 0.80. Prompt the user to retry instead.
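The gating rule above fits in a few lines. Here's a minimal sketch; the helper name and message text are my own choices, not part of the API:

```javascript
// Only display a BPM value when the API's confidence score clears
// the 0.80 threshold suggested above; otherwise prompt a retry.
const MIN_CONFIDENCE = 0.8;

function readingText(heartRate, confidence) {
  if (confidence < MIN_CONFIDENCE) {
    return 'Signal too weak, please retry';
  }
  return `${heartRate} BPM`;
}
```

In the app, you would call this with `data.vitals.heart_rate` and `data.vitals.confidence` before writing to the `#bpm` element.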
Production tip: For real apps, capture multiple frames over 15-30 seconds instead of a single snapshot. Average the results and discard outliers. The quickstart guide covers the multi-frame pattern and error handling in detail.
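One way to sketch the averaging step: collect a list of per-frame BPM readings, drop outliers, and average the rest. The 1.5×IQR outlier rule here is one reasonable choice for this sketch, not something the API prescribes:

```javascript
// Average a set of BPM readings, discarding outliers outside
// 1.5x the interquartile range (a common robust-statistics rule).
function robustAverage(readings) {
  const sorted = [...readings].sort((a, b) => a - b);
  const q1 = sorted[Math.floor(sorted.length / 4)];
  const q3 = sorted[Math.floor((3 * sorted.length) / 4)];
  const iqr = q3 - q1;
  const kept = sorted.filter(r => r >= q1 - 1.5 * iqr && r <= q3 + 1.5 * iqr);
  return Math.round(kept.reduce((sum, r) => sum + r, 0) / kept.length);
}

// A motion-spike reading like 120 among resting values gets discarded:
// robustAverage([70, 71, 72, 73, 120]) → 72
```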
What's Next
You now have a working heart rate monitor in under 80 lines of HTML. From here, you can extend it in several directions:
- HRV tracking: Use vitals.hrv to build a stress/recovery dashboard. HRV is among the most informative metrics for autonomic nervous system state.
- Continuous monitoring: Call the API on an interval (every 30 seconds, say) to build a time-series chart of heart rate over a session.
- Stress detection: The vitals.stress_index (Baevsky stress index) provides a direct indicator of physiological stress — useful for meditation apps, workplace wellness tools, or gaming biofeedback.
- Multi-vital display: Show heart rate, SpO2, respiration rate, and HRV together for a complete vital signs dashboard.
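The continuous-monitoring idea boils down to appending confidence-gated readings to a series on a timer. A sketch, where the `recordReading` helper and its reading shape are assumptions for illustration, not API code:

```javascript
// Append one sample to a session time-series, skipping readings
// whose confidence falls below the gate.
function recordReading(series, reading, minConfidence = 0.8) {
  if (reading.confidence >= minConfidence) {
    series.push({ t: reading.t, bpm: reading.heart_rate });
  }
  return series;
}

// In the app, wire it to a timer, e.g.:
// const series = [];
// setInterval(async () => {
//   const data = await callAnalyzeApi(); // hypothetical wrapper around the fetch call
//   recordReading(series, { ...data.vitals, t: Date.now() });
// }, 30000);
```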
The API handles all the signal processing. Every feature above is an additional field in the same response you're already getting — no extra integration work.
Ready to Build?
See the API running live on your own face, or grab a key and start building. From zero to heart rate in five minutes.
Frequently Asked Questions
Can I build a heart rate monitor with just a webcam?
Yes. rPPG detects subtle skin color changes caused by blood flow using a standard webcam. The Beam AI API handles the signal processing — you send a video frame and get back a BPM reading. No special hardware required beyond a device with a camera.
How accurate is webcam-based heart rate detection?
Under good conditions (stable lighting, user seated and still), rPPG achieves 2-4 BPM mean absolute error compared to clinical pulse oximeters. Accuracy degrades with movement, poor lighting, or heavy video compression. The API returns a confidence score so you can gate readings below your quality threshold.
What does the API return besides heart rate?
Heart rate (BPM), heart rate variability (HRV in ms), blood oxygen estimation (SpO2), stress index, respiration rate, a confidence score, and processing metadata. All from a single API call with one video frame.
Do I need to process video on the client side?
No. You capture a single frame using a canvas element, convert it to base64, and send it to the API. All rPPG signal processing — face detection, ROI extraction, color channel analysis, and FFT — happens server-side. Your client code is under 50 lines.