This is a video streaming backend built using Node.js, Express, and TypeScript. It processes uploaded videos by transcoding them into multiple resolutions and segmenting them for adaptive bitrate streaming (HLS). The processed files are stored in a Cloudflare R2 bucket, which is compatible with the AWS S3 API.
- The user uploads a video through an authenticated API request.
- The backend receives the file and uses FFmpeg to transcode it into multiple resolutions (for example, 720p, 480p, 360p).
- Each resolution is segmented into smaller `.ts` chunks using HLS (HTTP Live Streaming); a rough FFmpeg invocation is sketched after this list.
- A `manifest.m3u8` file is generated that lists the segment URLs.
- All files are uploaded to Cloudflare R2 storage.
- The manifest can be used by a video player to stream the video adaptively based on network speed.
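To make the processing step concrete, here is a minimal sketch of how the backend might drive FFmpeg from Node.js for a single rendition. The function name, output paths, segment duration, and encoder settings are illustrative assumptions, not the project's actual code.

```typescript
import { spawn } from "child_process";
import path from "path";

// Hypothetical helper: transcode one rendition (e.g. 720p) into HLS
// segments plus a per-rendition playlist. Paths and settings are examples.
function transcodeToHls(inputPath: string, outputDir: string, height: number): Promise<void> {
  return new Promise((resolve, reject) => {
    const args = [
      "-i", inputPath,
      "-vf", `scale=-2:${height}`,   // scale to the target height, keep aspect ratio
      "-c:v", "libx264",             // H.264 video
      "-c:a", "aac",                 // AAC audio
      "-f", "hls",
      "-hls_time", "6",              // ~6-second .ts segments
      "-hls_playlist_type", "vod",
      "-hls_segment_filename", path.join(outputDir, `${height}p_%03d.ts`),
      path.join(outputDir, `${height}p.m3u8`),
    ];
    const ffmpeg = spawn("ffmpeg", args);
    ffmpeg.on("error", reject);
    ffmpeg.on("close", (code) =>
      code === 0 ? resolve() : reject(new Error(`ffmpeg exited with code ${code}`))
    );
  });
}

// Example: produce the renditions mentioned above.
// await Promise.all([720, 480, 360].map((h) => transcodeToHls("upload.mp4", "out", h)));
```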
End-to-end, the upload flow with pre-signed URLs works like this:
- The user requests a video upload.
- The backend generates and returns a pre-signed URL for direct upload (see the sketch after this list).
- The user directly uploads the video file to Cloudflare R2 using the pre-signed URL.
- A Cloudflare Worker is triggered when the upload completes, and it notifies the backend.
- The backend processes the uploaded file using FFmpeg (transcoding + HLS segmentation).
- Processed files and the manifest are uploaded back to Cloudflare R2.
- The manifest URL is returned to the client for adaptive playback.
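A minimal sketch of the pre-signed URL step, using the AWS SDK v3 against R2's S3-compatible endpoint. The environment variable names, bucket name, content type, and expiry are assumptions for illustration, not necessarily what this project uses.

```typescript
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

// R2 speaks the S3 API, so the standard AWS SDK client works with a custom endpoint.
// Env variable names here are placeholders.
const r2 = new S3Client({
  region: "auto",
  endpoint: `https://${process.env.R2_ACCOUNT_ID}.r2.cloudflarestorage.com`,
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID!,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!,
  },
});

// Returns a URL the client can PUT the raw video to, valid for 15 minutes.
export async function createUploadUrl(key: string): Promise<string> {
  const command = new PutObjectCommand({
    Bucket: process.env.R2_BUCKET_NAME,
    Key: key,
    ContentType: "video/mp4",
  });
  return getSignedUrl(r2, command, { expiresIn: 900 });
}
```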
| Component | Description |
|---|---|
| Language | TypeScript |
| Framework | Express.js (Node.js) |
| ORM | Prisma |
| Database | PostgreSQL |
| Video Processing | FFmpeg |
| Cloud Storage | Cloudflare R2 (S3 compatible) |
| Authentication | JWT-based middleware |
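For the last row, here is a sketch of what a JWT-based Express middleware typically looks like. The `JWT_SECRET` variable, Bearer-token header, and payload shape are assumptions, not necessarily how this project implements authentication.

```typescript
import { Request, Response, NextFunction } from "express";
import jwt from "jsonwebtoken";

// Rejects requests without a valid Bearer token and attaches the decoded
// payload for downstream handlers. The payload shape is assumed.
export function requireAuth(req: Request, res: Response, next: NextFunction) {
  const header = req.headers.authorization;
  const token = header?.startsWith("Bearer ") ? header.slice(7) : undefined;
  if (!token) {
    return res.status(401).json({ error: "Missing token" });
  }
  try {
    const payload = jwt.verify(token, process.env.JWT_SECRET!) as { userId: string };
    (req as Request & { userId?: string }).userId = payload.userId;
    next();
  } catch {
    res.status(401).json({ error: "Invalid or expired token" });
  }
}
```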
HLS (HTTP Live Streaming):
HLS is a streaming protocol that splits video files into smaller segments and provides a manifest file (.m3u8) listing those segments. This enables adaptive bitrate streaming.
Adaptive Bitrate Streaming (ABR): ABR allows the video player to automatically switch between different quality levels (resolutions) depending on the user's internet speed.
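As a purely illustrative example (bandwidth and resolution values are made up), a master playlist tying the renditions together for ABR might look like this; each referenced per-rendition playlist in turn lists its `.ts` segments:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
720p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=854x480
480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
360p.m3u8
```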
FFmpeg: FFmpeg is a command-line tool used for video processing. In this project, it transcodes uploaded videos into multiple resolutions and segments them for HLS.
To run the server locally:
- Clone the repository: `git clone https://github.com/sanjanaynvsdl/videostreaming-api.git`
- Move into the server directory: `cd server`
- Install dependencies: `npm install`
- Build the TypeScript project: `npm run build`
- Start the server: `npm start`
Make sure you have FFmpeg installed and your .env file configured with database and Cloudflare R2 credentials.
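As a rough example of that configuration, a `.env` might contain entries like the following. The variable names here are placeholders, so check the project source for the exact keys it reads:

```
# Placeholder keys for illustration only
DATABASE_URL=postgresql://user:password@localhost:5432/videostreaming
JWT_SECRET=replace-with-a-long-random-string
R2_ACCOUNT_ID=your-cloudflare-account-id
R2_ACCESS_KEY_ID=your-r2-access-key
R2_SECRET_ACCESS_KEY=your-r2-secret-key
R2_BUCKET_NAME=your-bucket-name
```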
