Video Processor Project: Building a Media Processing Platform
Pulathisi Kariyawasam (@RandhanaK)
Introduction
The Video Processor project is a full-stack media-processing platform I built to handle video conversion, compression, trimming, audio extraction, and GIF generation. It consists of a Node.js backend with Express and Sequelize, and a React frontend served via Nginx. This post covers the architecture, features, endpoints, security considerations, and deployment workflow.
Project Structure
Repositories and Folders
- ffmpeg-api/ — Backend (Node.js + Express + Sequelize)
- video-processor-frontend/ — Frontend (React SPA)
- docker-compose.yml — Production-oriented compose file
High-Level Overview
The platform exposes HTTP APIs for:
- Video format conversion
- Compression
- Trimming
- Audio extraction
- GIF generation
The backend handles user management, API key authentication, usage tracking, request quotas, and optional 2FA via TOTP. The frontend provides a single-page app (SPA) for file uploads, conversion tools, and profile management.
Backend: ffmpeg-api
Key Components
- package.json — dependencies: express, sequelize, pg, ffmpeg, speakeasy, qrcode, jwt, multer, bcrypt
- app.js — entry point, route mounting, CORS setup, database sync (see the sketch below)
- controllers/ — convert, compress, trim, extractAudio, gif, metadata, 2FA
- routes/auth.js — user registration, login, account management, 2FA endpoints
- models/User.js, models/Usage.js — Sequelize models
- middleware/validateApiKey.js — API key validation
- utils/checkAndCreateUsage.js — daily request limit enforcement
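For orientation, here is a minimal sketch of what app.js does, assuming conventional Express wiring; the route module names, environment variables, and the models/index.js export are illustrative rather than taken from the repository:

```js
// app.js — minimal sketch: CORS, route mounting, and Sequelize sync (illustrative names)
const express = require('express');
const cors = require('cors');
const { sequelize } = require('./models');          // assumed models/index.js exporting the Sequelize instance
const authRoutes = require('./routes/auth');
const convertRoutes = require('./routes/convert');  // hypothetical route module

const app = express();
app.use(express.json());

// Allow the SPA origin and let it send the auth cookie
app.use(cors({ origin: process.env.FRONTEND_ORIGIN, credentials: true }));

app.use('/api/auth', authRoutes);
app.use('/api', convertRoutes);

// Sync models, then start listening
sequelize.sync().then(() => {
  app.listen(process.env.PORT || 3000, () => console.log('ffmpeg-api listening'));
});
```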
Features
- User registration, email confirmation
- API key generation and cookie-based authentication
- 2FA via TOTP (speakeasy) with QR code setup
- Media processing endpoints: /api/convert, /api/compress, /api/trim, /api/extract-audio, /api/gif, /api/metadata (see the controller sketch after this list)
- Usage tracking per user and per endpoint
- Tiered request limits (free, basic, premium)
- Pricing and account management endpoints
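To make the endpoint shape concrete, the sketch below shows what a conversion route could look like. It assumes multer for uploads and the fluent-ffmpeg wrapper (the repo's package.json lists an ffmpeg dependency; the exact wrapper, field names, and handler body here are assumptions):

```js
// routes/convert.js — illustrative sketch of POST /api/convert
const express = require('express');
const multer = require('multer');
const ffmpeg = require('fluent-ffmpeg');            // assumed wrapper around the ffmpeg binary
const validateApiKey = require('../middleware/validateApiKey');

const router = express.Router();
const upload = multer({ dest: 'uploads/' });        // temporary upload directory (assumption)

router.post('/convert', validateApiKey, upload.single('file'), (req, res) => {
  const targetFormat = req.body.format || 'mp4';
  const outputPath = `${req.file.path}.${targetFormat}`;

  ffmpeg(req.file.path)
    .toFormat(targetFormat)
    .on('end', () => res.download(outputPath))      // send the converted file back
    .on('error', (err) => res.status(500).json({ error: err.message }))
    .save(outputPath);
});

module.exports = router;
```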
Data Models
User Model
- Fields: id, email, name, api_key, tier, is_active, confirmed, confirmationToken, lastUsed, password, is2fa_enabled, totp_secret, is2fa_confirmed
- Passwords are hashed automatically; an API key is generated on create (see the sketch below)
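A condensed Sequelize sketch of how such a model could be defined; the column types and hook bodies are assumptions based on the behavior described above, not code copied from the repository:

```js
// models/User.js — condensed sketch; types and hook details are assumed
const { DataTypes } = require('sequelize');
const bcrypt = require('bcrypt');
const crypto = require('crypto');

module.exports = (sequelize) =>
  sequelize.define('User', {
    email:             { type: DataTypes.STRING, unique: true, allowNull: false },
    name:              DataTypes.STRING,
    password:          { type: DataTypes.STRING, allowNull: false },
    api_key:           DataTypes.STRING,
    tier:              { type: DataTypes.STRING, defaultValue: 'free' },
    is_active:         { type: DataTypes.BOOLEAN, defaultValue: true },
    confirmed:         { type: DataTypes.BOOLEAN, defaultValue: false },
    confirmationToken: DataTypes.STRING,
    lastUsed:          DataTypes.DATE,
    is2fa_enabled:     { type: DataTypes.BOOLEAN, defaultValue: false },
    is2fa_confirmed:   { type: DataTypes.BOOLEAN, defaultValue: false },
    totp_secret:       DataTypes.STRING,
  }, {
    hooks: {
      // Hash the password and mint an API key before the row is created
      beforeCreate: async (user) => {
        user.password = await bcrypt.hash(user.password, 10);
        user.api_key = crypto.randomBytes(24).toString('hex');
      },
    },
  });
```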
Usage Model
- Fields: id, user_id, endpoint, file_size, processing_time, status, error_message, ip_address, user_agent, last_reset, request_count
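These rows feed the daily limit check in utils/checkAndCreateUsage.js. The sketch below shows one plausible shape of that logic; the tier caps and reset behavior are assumptions, not the project's actual numbers:

```js
// utils/checkAndCreateUsage.js — sketch of daily quota enforcement (limits are illustrative)
const TIER_LIMITS = { free: 50, basic: 500, premium: 5000 };   // hypothetical daily caps

async function checkAndCreateUsage(user, endpoint, Usage) {
  const today = new Date();
  today.setHours(0, 0, 0, 0);

  // Find (or start) today's usage row for this user and endpoint
  let usage = await Usage.findOne({ where: { user_id: user.id, endpoint } });
  if (!usage || usage.last_reset < today) {
    usage = usage
      ? await usage.update({ request_count: 0, last_reset: new Date() })
      : await Usage.create({ user_id: user.id, endpoint, request_count: 0, last_reset: new Date() });
  }

  if (usage.request_count >= TIER_LIMITS[user.tier]) {
    return { allowed: false, remaining: 0 };
  }

  await usage.increment('request_count');
  return { allowed: true, remaining: TIER_LIMITS[user.tier] - usage.request_count - 1 };
}

module.exports = checkAndCreateUsage;
```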
Security Practices
- Password hashing with bcrypt
- API key authentication via headers, query params, or cookies (see the middleware sketch after this list)
- 2FA via TOTP, QR code generation; secrets stored in DB
- Cookies set as httpOnly and secure in production
- CORS configured for frontend origin
- Per-tier request limits enforced
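To illustrate the API-key checks from the list above, here is a hedged sketch of what middleware/validateApiKey.js might look like, assuming cookie-parser is mounted and the lookup order shown (header, then query param, then cookie):

```js
// middleware/validateApiKey.js — sketch: header, query param, or cookie lookup (order is assumed)
const { User } = require('../models');

module.exports = async function validateApiKey(req, res, next) {
  const apiKey =
    req.headers['x-api-key'] ||
    req.query.api_key ||
    (req.cookies && req.cookies.api_key);           // requires cookie-parser to be mounted

  if (!apiKey) return res.status(401).json({ error: 'API key required' });

  const user = await User.findOne({ where: { api_key: apiKey, is_active: true } });
  if (!user) return res.status(403).json({ error: 'Invalid API key' });

  req.user = user;                                   // downstream handlers read the tier from here
  next();
};
```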
Frontend: video-processor-frontend
Key Files
- package.json — React, axios, react-router, react-redux
- nginx.conf — serves the SPA, proxies /api/ to the backend
- Dockerfile — multi-stage build: Node build stage + Nginx serve stage
- src/services/apiConfig.js — axios instance, environment-based base URL (sketched below)
- src/pages/Profile/ProfilePage.jsx — profile management and 2FA UI
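A sketch of the shared axios instance pattern in src/services/apiConfig.js; the environment variable name, localStorage key, and interceptor are assumptions rather than verbatim code:

```js
// src/services/apiConfig.js — sketch of the shared axios instance (env var name is assumed)
import axios from 'axios';

const api = axios.create({
  // Falls back to the Nginx-proxied /api path when no explicit URL is injected
  baseURL: process.env.REACT_APP_API_URL || '/api',
  withCredentials: true,                        // send the httpOnly auth cookie
});

// Attach the API key header when one is stored client-side
api.interceptors.request.use((config) => {
  const key = localStorage.getItem('api_key');
  if (key) config.headers['x-api-key'] = key;
  return config;
});

export default api;
```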
Features
- Upload and process files via SPA
- Enable/disable 2FA, confirm TOTP on profile page
- Axios API integration with an x-api-key header (see the upload example after this list)
- Supports environment variable injection for the API base URL
- Nginx handles static caching and API proxying
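Putting those pieces together, an upload call from the SPA might look roughly like this; the helper name, form field names, and blob handling are illustrative:

```js
// Illustrative upload helper used by the SPA (names are hypothetical, not from the repo)
import api from '../services/apiConfig';

export async function convertVideo(file, format = 'mp4') {
  const form = new FormData();
  form.append('file', file);        // field name must match the backend's multer config
  form.append('format', format);

  // responseType 'blob' so the converted media can be offered as a download
  const res = await api.post('/convert', form, { responseType: 'blob' });
  return URL.createObjectURL(res.data);
}
```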
2FA Flow
The two-factor authentication (2FA) flow in the Video Processor platform follows these steps:
1. User requests 2FA setup
   - Endpoint: POST /api/auth/2fa/setup (user must be authenticated)
2. Backend generates TOTP secret
   - Stores the secret in the database
   - Sets flags: is2fa_enabled = true, is2fa_confirmed = false
3. QR code returned to client
   - Backend returns a QR code data URL for setup in an authenticator app
4. User scans QR and submits TOTP code
   - Endpoint: POST /api/auth/2fa/confirm
5. Backend verifies TOTP code
   - On success, sets is2fa_confirmed = true
   - 2FA is now active for the user
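Below is a hedged sketch of the two route handlers behind this flow, using speakeasy and qrcode as listed in package.json. The router and requireAuth middleware are assumed to exist elsewhere in routes/auth.js, and the handler bodies are illustrative:

```js
// routes/auth.js (excerpt) — sketch of the 2FA setup and confirm handlers
const speakeasy = require('speakeasy');
const QRCode = require('qrcode');

// POST /api/auth/2fa/setup — generate and store a TOTP secret, return a QR code
router.post('/2fa/setup', requireAuth, async (req, res) => {
  const secret = speakeasy.generateSecret({ name: `VideoProcessor (${req.user.email})` });

  await req.user.update({
    totp_secret: secret.base32,
    is2fa_enabled: true,
    is2fa_confirmed: false,
  });

  // Data-URL QR code the SPA renders for the authenticator app
  const qr = await QRCode.toDataURL(secret.otpauth_url);
  res.json({ qr });
});

// POST /api/auth/2fa/confirm — verify the first TOTP code and activate 2FA
router.post('/2fa/confirm', requireAuth, async (req, res) => {
  const valid = speakeasy.totp.verify({
    secret: req.user.totp_secret,
    encoding: 'base32',
    token: req.body.token,
    window: 1,                      // tolerate one 30-second step of clock drift
  });

  if (!valid) return res.status(400).json({ error: 'Invalid TOTP code' });

  await req.user.update({ is2fa_confirmed: true });
  res.json({ message: '2FA enabled' });
});
```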
Deployment and Docker
For the Video Processor project, the production deployment was set up using Docker and automated with GitHub Actions. The workflow ensured that the backend, frontend, and database migrations were properly deployed to the production server with minimal manual intervention.
Deployment Overview
- The latest code and database migrations were pushed to the GitHub repository.
- A GitHub Actions workflow automatically built Docker images for the backend and frontend.
- The workflow pulled the pre-built images and started the containers on the production server using docker-compose.yml.
- Database migrations were executed inside the backend container to ensure the schema was up to date.
- Nginx was used to serve the React frontend and proxy API requests to the backend container.
- SSL certificates and reverse proxy configuration were managed through dedicated volumes on the server.
- Logs and container status were monitored to verify that the deployment completed successfully.
- The process ensured that the production environment always had the latest features, security updates, and database consistency without manual intervention.