Deploy Next.js SSR to Cloudflare Workers with GitHub Actions

By Hamza Rahman · 5 mins read

I built a small nutrition app, FastFoodCalc, and hit a classic problem. I needed real SSR so search engines could index 2,000+ menu items, while keeping the app fast for users. Next.js made SSR easy. Deployment was… not.

I tried a few paths, tripped on the usual edges, and ended up with a simple, boring (in a good way) flow: build on CI with GitHub Actions and deploy to Cloudflare Workers.

Here’s what broke, what worked, and the setup I shipped.

The Goal: SSR Without the Headaches

My needs were simple:

  1. Real SSR: Serve real HTML for 2,000+ dynamic pages so search engines could see my content (a minimal sketch of what that looks like follows this list).
  2. Fast Performance: Users should get a snappy, client-side experience.
  3. Reliable Deploys: I wanted to git push and walk away, not futz with local environment issues.
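
For concreteness, "real SSR" here means a dynamic route that renders full HTML on the server for every menu item. A minimal sketch, assuming the App Router; the route path, the getMenuItem loader, and the data shape are illustrative, not FastFoodCalc's actual code:

// app/menu/[slug]/page.tsx
// Hypothetical loader; swap in your real data source (DB, JSON file, API).
async function getMenuItem(slug: string) {
  return { name: slug.replace(/-/g, " "), calories: 540 };
}

// Rendered on the server per request, so crawlers get complete HTML
// without executing any client-side JavaScript.
// (On Next.js 15+, `params` is a Promise and should be awaited.)
export default async function MenuItemPage({ params }: { params: { slug: string } }) {
  const item = await getMenuItem(params.slug);
  return (
    <main>
      <h1>{item.name}</h1>
      <p>{item.calories} kcal</p>
    </main>
  );
}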

The Traps: What I Tried First

Before I found the right flow, I fell into the two most common traps.

Trap 1: The Cloudflare Pages 404

Cloudflare Pages has a slick GitHub integration, so I tried it first. It felt right. I connected my repo, the build ran, and... 404.

Here's the "gotcha": OpenNext (the tool that packages Next.js for non-Vercel platforms) builds for Cloudflare Workers, not Pages. The build output was a worker.js file and an assets/ directory. Cloudflare Pages, however, expects either a static site or its own _worker.js Functions format. The result: a pretty dashboard and a broken site.
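
For reference, the OpenNext build lands in a .open-next/ directory shaped roughly like this (exact contents vary by adapter version; the two paths below match the wrangler.toml shown later):

.open-next/
  worker.js    # the Workers entry point Pages has no idea what to do with
  assets/      # static files (JS, CSS, images) served alongside the worker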

Trap 2: The "It Works on My Machine" Nightmare

Fine, I thought, I'll just deploy from my local machine. This is the path of pain.

On my Windows machine, I hit weird pathing and ESM loader errors. On a co-worker's Mac, it mostly worked. When I tried it in WSL (Windows Subsystem for Linux) from a mounted /mnt/d/ drive, the build would sometimes crash with a SIGBUS error.

This is the classic "works on my machine" problem. It's fragile, depends on your OS, and is impossible to debug. A deploy process you can't trust is worse than no deploy process at all.

The Winning Stack (And Why It Works)

I stepped back and defined what would work:

  1. A build target that natively understands the OpenNext output.
  2. A build environment that is consistent, clean, and OS-agnostic.

This led me to the final, simple stack.

1. Cloudflare Workers for SSR (The Target)

This was the "aha!" moment. OpenNext is designed to output a format that Cloudflare Workers understands perfectly.

My wrangler.toml (Cloudflare's config file) just needed to point to the OpenNext build output:

# wrangler.toml
name = "fastfoodcalc"
main = ".open-next/worker.js"            # OpenNext's Workers entry point
compatibility_date = "2024-09-23"        # pin a compatibility date (value here is illustrative)
compatibility_flags = ["nodejs_compat"]

[assets]
directory = ".open-next/assets"          # static files emitted by the OpenNext build
binding = "ASSETS"

That’s it. The worker entry and assets map 1:1 to OpenNext’s output. The nodejs_compat flag unlocks Node APIs Next.js relies on.
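
One assumption worth spelling out: the npm run deploy script used later in CI has to produce that .open-next/ output before handing off to Wrangler. A sketch of the relevant package.json scripts, assuming the @opennextjs/cloudflare adapter (the exact build command name depends on your adapter version):

{
  "scripts": {
    "dev": "next dev",
    "build": "next build",
    "deploy": "opennextjs-cloudflare build && wrangler deploy"
  }
}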

2. GitHub Actions for Builds (The Environment)

This solved the "works on my machine" problem. By building on ubuntu-latest, I get the same clean environment on every push.

  • No Windows path issues or WSL quirks
  • No local node_modules drift
  • Secrets live in GitHub, not on laptops
  • Anyone can deploy by pushing to main

The pipeline lives in .github/workflows/deploy.yml:

name: Deploy to Cloudflare Workers

on:
  push:
    branches: [ main ]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Setup Node.js v20
        uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      - name: Deploy to Cloudflare
        run: npm run deploy
        env:
          CLOUDFLARE_API_TOKEN: ${{ secrets.CLOUDFLARE_API_TOKEN }}
          CLOUDFLARE_ACCOUNT_ID: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}

Setup is three steps:

  1. Create a Cloudflare API token (the "Edit Cloudflare Workers" template works).
  2. Copy your Account ID from the Cloudflare dashboard.
  3. Add both as GitHub repo secrets.
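
If you'd rather stay in the terminal, the GitHub CLI can add both secrets (assuming gh is installed and authenticated for the repo):

# gh prompts you to paste each value
gh secret set CLOUDFLARE_API_TOKEN
gh secret set CLOUDFLARE_ACCOUNT_ID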

Deploy flow:

git add .
git commit -m "feat: update menu items"
git push origin main

Watch the workflow run; a couple of minutes later, it’s live.
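
If you have the GitHub CLI installed, you can also follow the run from the terminal instead of the Actions tab:

# select the in-progress run and stream its progress
gh run watch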

Why I Picked Cloudflare (not Vercel)

Vercel is excellent, but for this project Cloudflare fit better:

  • Generous free tier (≈100k requests/day) for a growing SSR app
  • Global edge Workers: fast cold starts, responses close to users
  • Clean SSR + assets mapping from OpenNext

If your app leans on Vercel-specific features, that’s a great path too. For FastFoodCalc, Cloudflare’s speed + free quota won.

OS‑Specific Note (Windows)

  • Windows paths can trigger ESM loader errors; WSL on mounted drives (e.g., /mnt/d/...) may throw SIGBUS.
  • Prefer CI builds. If building locally, use WSL’s native FS (e.g., ~/projects/...).
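
For example, a fresh checkout onto the Linux side (repo URL and paths are illustrative):

# clone into WSL's native filesystem instead of a mounted Windows drive
git clone https://github.com/<your-account>/fastfoodcalc.git ~/projects/fastfoodcalc
cd ~/projects/fastfoodcalc
npm ci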

The Result

FastFoodCalc now renders SSR at the edge and hydrates instantly on the client. Search engines see real HTML for all 2,000+ items, and users get a fast app.

The best part: the deploy flow is boring in the best way. Make a change, push, done. Less ceremony, more shipping.