When I first built enjoymovie.live, I wasn’t trying to create some hyperscale system like Netflix. I just wanted a simple streaming site. But then traffic grew — and suddenly I was serving over 1,500 users per minute.
The crazy part? I pay $0 for infrastructure. No servers, no bandwidth bills, no scaling headaches. That, to me, is the real achievement. Not just handling traffic — but handling it for free.
The Philosophy: Remove Work Until Nothing Is Left
Most developers panic when traffic grows. They spin up servers, configure databases, add Kubernetes clusters, and watch costs spiral. I took the opposite approach: I stripped away everything unnecessary until there was nothing left to scale. The CDN does all the heavy lifting.
This is the mindset: don’t handle traffic yourself; make the edge handle it.
Step 1: My Origin Barely Exists
The “server” behind enjoymovie.live is just static files hosted on Cloudflare Pages. My build outputs:
- HTML files for every page
- Static JSON with movie metadata
- Hashed JS and CSS bundles
- Poster and thumbnail images
No PHP, no Node, no databases. In practice, 98% of requests never hit my origin because Cloudflare caches everything at the edge.
Step 2: Load Balancing Without Servers
I don’t run load balancers. Instead, traffic is automatically spread across Cloudflare’s 300+ global data centers. Users in Paris get content from Paris. Users in Delhi get it from Delhi. Los Angeles users are served in LA. That’s real load balancing — without me managing anything.
Example: On a Saturday evening spike, Cloudflare analytics showed:
- North America: 4,500 requests (served from 8 edge POPs)
- Europe: 3,000 requests (served from 6 edge POPs)
- Asia: 4,000 requests (served from 10 edge POPs)
- Origin: fewer than 200 requests total
This is what my architecture looks like in practice:
```
User ---> Nearest Cloudflare Edge ---> Cache Hit
                  |
                  v  (if cache miss)
         Cloudflare R2 (storage)
                  |
                  v
          Edge caches result
```
I don’t pay for servers, yet my site feels global.
Step 3: Workers Keep Traffic Smart
Cloudflare Workers act as traffic managers. They normalize requests and keep the cache clean. Example:

```
/movie/123?utm_source=facebook
/movie/123?utm_source=twitter
```

Both are the same movie, so my Worker strips out utm_source. Result: one cache object instead of many. Multiply that across thousands of users, and my origin load drops to near zero.
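A minimal sketch of that normalization (the tracking-parameter list here is illustrative; a real Worker would pass the cleaned URL on to the cache lookup):

```javascript
// Strip tracking parameters so every variant of a URL maps to one cache key.
// The parameter list is an example; extend it to whatever your links carry.
const TRACKING_PARAMS = ["utm_source", "utm_medium", "utm_campaign"];

function normalizeUrl(rawUrl) {
  const url = new URL(rawUrl);
  for (const param of TRACKING_PARAMS) {
    url.searchParams.delete(param);
  }
  return url.toString();
}
```

Inside a Worker, you would build a new Request from the normalized URL before hitting the cache, so both Facebook and Twitter visitors share one cached object.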
Step 4: Video Streaming for $0
Streaming usually kills projects because bandwidth costs explode. But I solved this by leaning entirely on the CDN:
- HLS segments cached at the edge: A movie is just .m3u8 playlists + .ts chunks. Once the first viewer loads them, they’re cached at the nearest edge. The next 500 viewers get them for free.
- Free storage with R2: Cloudflare R2 gives me free egress to Cloudflare’s CDN, meaning I pay nothing to serve videos worldwide.
- Embeds for some titles: For certain content, I serve only the player shell and let external infra handle the bandwidth.
Real Example: One movie got 500+ simultaneous viewers. The first requests pulled ~200MB from R2. After that, every segment was served directly from cache. My bill? $0.
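The caching policy behind this can be sketched as a simple rule: playlists may be rewritten, so they get a short TTL, while segments are immutable and can be cached aggressively. The exact TTL values below are illustrative, not my production settings:

```javascript
// Pick a Cache-Control header per HLS asset type. TTLs are illustrative;
// a Worker would attach this header to the Response it returns.
function cacheControlFor(pathname) {
  if (pathname.endsWith(".m3u8")) {
    // Playlists can change, so keep the edge TTL short.
    return "public, max-age=30";
  }
  if (pathname.endsWith(".ts")) {
    // Segments never change once written: cache them for a year.
    return "public, max-age=31536000, immutable";
  }
  return "public, max-age=3600";
}
```

With headers like these, only the first viewer’s requests ever reach R2; everyone after them is served from the edge.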
Step 5: No Databases, Just JSON
Things like search, trending, and catalogs feel dynamic but are actually static:
- A JSON index (movies.json) is generated at build time.
- Search happens in the browser (client-side).
- Trending counters use KV but are cached at the edge for 5 minutes.
No SQL, no MongoDB, nothing to scale or maintain.
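Client-side search over the prebuilt index can be as simple as a substring filter. A sketch, assuming each index entry exposes a title field:

```javascript
// Client-side search: filter the prebuilt JSON index in the browser.
// No server round-trip — the index was fetched once as a static file.
function searchMovies(index, query) {
  const q = query.trim().toLowerCase();
  if (!q) return [];
  return index.filter((movie) => movie.title.toLowerCase().includes(q));
}
```

In the browser this runs against the result of a single fetch of movies.json, so typing in the search box generates zero origin traffic.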
Step 6: Frontend Discipline Matters
The frontend helps scaling by using less bandwidth:
- Images are lazy-loaded (loading="lazy").
- I preload the first movie page for instant feel.
- The app is a SPA, so JS/CSS are fetched once.
- Responsive srcset ensures phones don’t fetch 4K posters.
Less waste per user = better scale.
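Generating the srcset variants is mechanical. A sketch with illustrative widths and URL pattern (not my exact poster pipeline):

```javascript
// Build a srcset string so browsers pick the smallest sufficient poster.
// Width list and URL pattern are illustrative examples.
const POSTER_WIDTHS = [320, 640, 1280];

function posterSrcset(movieId) {
  return POSTER_WIDTHS
    .map((w) => `/posters/${movieId}-${w}w.jpg ${w}w`)
    .join(", ");
}
```

Paired with loading="lazy" on the img tag, a phone fetches one small poster per visible card instead of every 4K original on the page.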
Step 7: The Real Achievement — Scaling for Free
Handling 1.5k users/minute is impressive. But the real achievement is doing it at zero cost. Here’s what my stack looks like:
- Hosting: Cloudflare Pages (free)
- CDN: Cloudflare global edge (free, unmetered caching)
- Workers: Millions of requests covered by free tier
- Storage: Cloudflare R2 with free egress
- Database needs: KV/D1 free tier
- Analytics: Cloudflare Analytics (free)
Now compare this to AWS/GCP pricing:
- AWS CloudFront + S3: ~$100+ for 1TB video streaming
- Google Cloud CDN + Storage: ~$80–120 for the same
- My stack: $0
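The back-of-envelope math behind that comparison, assuming a metered CDN rate of roughly $0.085/GB (a common published tier; check current pricing before relying on it):

```javascript
// Rough egress cost for 1 TB at an assumed metered-CDN rate.
// The rate is an assumption, not a quoted price.
const GB_PER_TB = 1024;
const ratePerGb = 0.085; // assumed $/GB
const costFor1Tb = GB_PER_TB * ratePerGb; // roughly $87 before request fees
```

Request fees and storage push the real bill higher, which is why 1 TB of streaming lands in the ~$100 range on metered clouds while free egress keeps it at $0 here.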
That’s the victory: I scaled like Netflix without paying Netflix’s bills.
Final Thoughts
Scalability isn’t always about adding servers. Sometimes it’s about removing work until there’s nothing left to scale. That’s how I run enjoymovie.live at 1,500 users per minute, with no infrastructure bills at all.