
Live camera view

This example walks through fetching a user’s cameras via the API and embedding a live HLS stream.

```js
const TOKEN = 'YOUR_PERSONAL_ACCESS_TOKEN'; // or a Bearer token from OAuth2

async function getCameras() {
  const res = await fetch('https://api.angelcam.com/v1/cameras/', {
    headers: { Authorization: `PersonalAccessToken ${TOKEN}` },
  });
  const data = await res.json();
  return data.results;
}
```
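The `results` field suggests a paginated envelope, so accounts with many cameras may need to follow a `next` link. This is a sketch under that assumption (the `next` field name is not confirmed here; check a real response). The page fetcher is injected as a parameter so the walker is easy to exercise without a network:

```js
// Collect cameras across all pages of a paginated list endpoint.
// fetchPage(url) must resolve to an object like { results: [...], next: urlOrNull }.
// The envelope shape is an assumption based on the `results` field above.
async function getAllCameras(fetchPage) {
  const cameras = [];
  let url = 'https://api.angelcam.com/v1/cameras/';
  while (url) {
    const data = await fetchPage(url);
    cameras.push(...data.results);
    url = data.next; // assumed to be null on the last page
  }
  return cameras;
}
```

In production, `fetchPage` would wrap `fetch` with the `Authorization` header shown above.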

Each camera object contains an id, name, type (h264, h265, or mjpeg), and a snapshot URL for a preview image.
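An illustrative shape for such an object, plus a small helper that maps the camera type to a stream format. The exact field layout (and the values shown) are assumptions based on the description above; verify against a real response:

```js
// Illustrative camera object — field layout is an assumption, values are fake.
const exampleCamera = {
  id: 123,
  name: 'Front door',
  type: 'h264', // 'h264', 'h265', or 'mjpeg'
  snapshot: 'https://example.com/cameras/123/snapshot.jpg',
};

// H.264/H.265 cameras can serve HLS; MJPEG cameras can only serve MJPEG.
function pickStreamFormat(camera) {
  return camera.type === 'mjpeg' ? 'mjpeg' : 'hls';
}
```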

For H.264/H.265 cameras, request the HLS stream. For MJPEG cameras, an MJPEG stream is the only option.

```js
async function getLiveStream(cameraId) {
  const res = await fetch(`https://api.angelcam.com/v1/cameras/${cameraId}/streams/`, {
    headers: { Authorization: `PersonalAccessToken ${TOKEN}` },
  });
  const { streams } = await res.json();
  const hls = streams.find(s => s.format === 'hls');
  const mjpeg = streams.find(s => s.format === 'mjpeg');
  return { hls: hls?.url, mjpeg: mjpeg?.url };
}
```

Use hls.js for broad browser support.

```html
<video id="player" controls autoplay muted></video>
<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<script>
  async function startLiveView(cameraId) {
    const { hls: hlsUrl } = await getLiveStream(cameraId);
    const video = document.getElementById('player');
    if (Hls.isSupported()) {
      const hls = new Hls();
      hls.loadSource(hlsUrl);
      hls.attachMedia(video);
    } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
      // Native HLS (Safari)
      video.src = hlsUrl;
    }
  }
  startLiveView('CAMERA_ID');
</script>
```

MJPEG streams work as a plain `<img>` src — no player library needed.
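The `startMjpegView` helper below looks up an element by id, so the page needs a matching target, for example:

```html
<!-- Target element for the MJPEG stream; the id must match the one
     looked up in startMjpegView. -->
<img id="mjpeg-view" alt="Live MJPEG view">
```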

```js
async function startMjpegView(cameraId) {
  const { mjpeg: mjpegUrl } = await getLiveStream(cameraId);
  // Assumes an <img id="mjpeg-view"> element exists in the page.
  document.getElementById('mjpeg-view').src = mjpegUrl;
}
```
Putting it together: list the cameras and render a player for each, preferring HLS when available.

```js
async function init() {
  const cameras = await getCameras();
  for (const cam of cameras) {
    const { hls: hlsUrl, mjpeg: mjpegUrl } = await getLiveStream(cam.id);
    if (hlsUrl) {
      renderHlsPlayer(cam.name, hlsUrl);
    } else if (mjpegUrl) {
      renderMjpegPlayer(cam.name, mjpegUrl);
    }
  }
}

function renderHlsPlayer(name, url) {
  const video = document.createElement('video');
  video.controls = true;
  video.autoplay = true;
  video.muted = true;
  document.body.appendChild(Object.assign(document.createElement('h3'), { textContent: name }));
  document.body.appendChild(video);
  const hls = new Hls();
  hls.loadSource(url);
  hls.attachMedia(video);
}

function renderMjpegPlayer(name, url) {
  const img = document.createElement('img');
  img.src = url;
  img.alt = name;
  document.body.appendChild(Object.assign(document.createElement('h3'), { textContent: name }));
  document.body.appendChild(img);
}
```
  • Each camera supports up to 10 concurrent consumers. Your Angelcam web/mobile sessions count toward this limit.
  • Use snapshots for thumbnail previews — they don’t count toward the concurrency limit.
  • For public or high-concurrency use cases, look at the Broadcasting service instead.
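Since snapshots are free with respect to the concurrency limit, one way to stay under it is to show snapshot thumbnails for every camera and only open a live stream when the user clicks one. A minimal sketch of the data-shaping step, assuming the `snapshot` field on the camera object holds the preview URL (a hypothetical field name — check a real payload):

```js
// Build thumbnail descriptors from camera objects. Rendering these as
// <img src=previewUrl> consumes no stream slots; a live player would
// only be created on click.
function toThumbnails(cameras) {
  return cameras.map(cam => ({
    id: cam.id,
    label: cam.name,
    previewUrl: cam.snapshot, // assumed field name for the snapshot URL
  }));
}
```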