Edge vs Node Runtimes - The Showdown Your Frontend Deserves
Picture this: It’s 2 AM, you’re sipping your fourth cup of coffee, and you’re deploying your shiny new React app. You hit deploy on Vercel, choose “Edge” because it sounds fast (and fast is good, right?), and boom - your app breaks in ways you didn’t know were possible.
Sound familiar? Yeah, we’ve all been there.
Let’s talk about Edge and Node runtimes - two different worlds that look similar but behave like chalk and cheese. By the end of this article, you’ll know exactly which one to choose and why your “it works on my machine” app might be failing in production.
The Analogy: The Restaurant Kitchen
Before we dive into the technical jargon, let’s use everyone’s favorite metaphor: food.
Imagine you’re running a restaurant:
- Node.js is like running a full kitchen with professional-grade equipment. You can cook anything - complex sauces, soufflés, the works. But it takes time to heat up, requires skilled chefs, and costs a pretty penny to maintain.
- Edge is like having a microwave in every customer’s car. Not as powerful, but blazing fast and right where you need it. You can’t cook a five-course meal, but you can reheat that pizza in seconds.
Both feed people. Both are valid. Just… differently.
What is Node.js Runtime?
Node.js is the OG of server-side JavaScript. It runs on traditional servers (or serverless functions) and uses the V8 engine (the same one that powers Chrome) outside the browser.
Where Node Lives
When you deploy to a traditional server or even most “serverless” platforms (like AWS Lambda), your code typically runs in Node.js. It has:
- Full access to the file system
- A full event loop
- The full standard library (`fs`, `crypto`, `net`, streams, and friends)
- Support for virtually any npm package
When to Use Node
Node is your go-to when you need:
- Heavy database operations
- Complex backend logic
- WebSockets (in traditional deployments)
- File system access
- Integration with legacy systems
Code Sample: Node.js API Route
Here’s a typical API route in Next.js (which runs on Node by default):
```javascript
// app/api/users/route.js (App Router - runs on Node.js by default)
import { db } from "@/lib/database";

export async function GET(request) {
  // This runs on the server - Node.js environment
  const users = await db.users.findMany();

  // You can do heavy processing here
  const processedUsers = users.map(user => ({
    ...user,
    // CPU-intensive operation
    score: calculateUserScore(user.activities),
  }));

  return Response.json({ users: processedUsers });
}

function calculateUserScore(activities) {
  // Complex algorithm that might take some time
  return activities.reduce((acc, act) => {
    return acc + act.value * act.multiplier;
  }, 0);
}
```
This works beautifully on Node because you’ve got all the time in the world (well, the server’s time) to crunch those numbers.
What is Edge Runtime?
Edge runtime is like Node’s lighter, speedier cousin who decided to live in 50 different cities simultaneously. Instead of your code running in one location, it runs on a distributed network of servers (the “edge”) closer to your users.
The Edge Players
Vercel Edge Functions
Vercel’s Edge runtime is built on V8 isolates - lightweight sandboxes that skip the overhead of a full Node.js process. This means:
- Smaller bundle sizes
- Faster cold starts (nearly instant)
- A more limited API (no `fs`, no `child_process`, etc.)
```javascript
// app/api/hello/route.js - Edge Runtime
export const runtime = "edge";

export async function GET(request) {
  // This runs at the edge - lightning fast!

  // Getting user location (built into Edge!)
  const country = request.geo?.country || "US";
  const city = request.geo?.city || "Unknown";

  return Response.json({
    message: `Hello from ${city}, ${country}!`,
    runtime: "edge",
  });
}
```
Wait, did you see that? request.geo - Edge runtimes know where your user is! More on this magical ability later.
Cloudflare Workers
Cloudflare Workers also run on V8 isolates - the same engine approach, just on Cloudflare’s massive global network: 300+ data centers in 100+ countries.
```javascript
// workers.js - Cloudflare Worker
export default {
  async fetch(request, env, ctx) {
    // Get user's country from the request.cf object
    const country = request.cf?.country || "US";
    const city = request.cf?.city || "Unknown";

    // A/B testing at the edge!
    const bucket = Math.random() < 0.5 ? "A" : "B";

    return new Response(
      JSON.stringify({
        message: `Hello from ${city}, ${country}!`,
        experimentBucket: bucket,
        runtime: "cloudflare-worker",
      }),
      {
        headers: { "content-type": "application/json" },
      }
    );
  },
};
```
The Architecture Behind the Magic
Traditional Server (Node)
```
User → Internet → Server (somewhere in us-east-1) → Database → Response
          ↓
  150ms latency (maybe more!)
```
Your user’s request travels all the way to your server, processes, and travels back. Simple, but slow if your user is in Tokyo and your server is in Virginia.
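A back-of-envelope check shows why that round trip hurts. Assuming light in fiber travels at roughly two-thirds its vacuum speed, and a very rough Tokyo-to-Virginia distance of 11,000 km (both my approximations, not measured figures):

```javascript
// Back-of-envelope: why a Tokyo user feels a Virginia server.
// Light in fiber covers roughly 200 km per millisecond.
const distanceKm = 11000;   // Tokyo -> Virginia, very roughly
const fiberKmPerMs = 200;   // ~2/3 the speed of light in vacuum
const roundTripMs = (2 * distanceKm) / fiberKmPerMs;
console.log(roundTripMs);   // 110 (ms) - before any processing at all
```

And that’s pure physics. Add TCP and TLS handshakes plus actual server work, and 150ms starts to look optimistic.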
Edge Runtime
```
User (Tokyo) → Edge Server (Tokyo) → Edge Server (Virginia) → Database
                      ↓
              Immediate Response!
                      ↓
       Data cached at edge for next user
```
The magic? Edge functions can:
- Cache responses globally
- Make smart routing decisions based on user location
- Run code closer to users (typically < 10ms latency)
- Spin up instantly (no cold starts like traditional serverless)
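Global caching, in particular, is mostly a matter of headers. Here’s a minimal sketch (the helper name and TTL values are mine, not a platform API) that tells a shared cache like Vercel’s or Cloudflare’s to hold a JSON response close to users:

```javascript
// Hypothetical helper: wrap a JSON payload in a Response with
// CDN-friendly cache headers. `s-maxage` targets shared caches (the
// edge) rather than the browser; `stale-while-revalidate` lets the
// edge serve slightly old data while it refreshes in the background.
function jsonWithEdgeCache(payload, maxAgeSeconds = 60) {
  return new Response(JSON.stringify(payload), {
    headers: {
      "content-type": "application/json",
      "cache-control": `public, s-maxage=${maxAgeSeconds}, stale-while-revalidate=300`,
    },
  });
}

// In an Edge route handler you would simply:
// export async function GET() {
//   return jsonWithEdgeCache({ message: "cached near you" });
// }
```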
How Vercel Edge Works
```
┌─────────────────────────────────────────────┐
│             Vercel Edge Network             │
├─────────────────────────────────────────────┤
│  ┌─────────┐   ┌─────────┐   ┌─────────┐    │
│  │  Tokyo  │   │ London  │   │   NYC   │    │
│  │  Edge   │   │  Edge   │   │  Edge   │    │
│  └────┬────┘   └────┬────┘   └────┬────┘    │
│       │             │             │         │
│       └─────────────┼─────────────┘         │
│                     ↓                       │
│             ┌───────────────┐               │
│             │  Global Cache │               │
│             │  (KV Store)   │               │
│             └───────┬───────┘               │
│                     ↓                       │
│           Your Backend/Database             │
└─────────────────────────────────────────────┘
```
How Cloudflare Workers Work
```
┌─────────────────────────────────────────────┐
│          Cloudflare Global Network          │
│               300+ locations                │
├─────────────────────────────────────────────┤
│  ┌─────────┐   ┌─────────┐   ┌──────────┐   │
│  │  Tokyo  │   │ Sydney  │   │Frankfurt │   │
│  │   PoP   │   │   PoP   │   │   PoP    │   │
│  └────┬────┘   └────┬────┘   └────┬─────┘   │
│       └─────────────┼─────────────┘         │
│                     ↓                       │
│             ┌───────────────┐               │
│             │    Durable    │               │
│             │    Objects    │               │
│             └───────┬───────┘               │
│                     ↓                       │
│             Workers KV / D1                 │
└─────────────────────────────────────────────┘
```
PoP = Point of Presence (where the magic happens)
Use Cases: When to Use What
Edge Runtime is Perfect For:
1. A/B Testing at Scale
```javascript
// middleware.js - Next.js Middleware runs on the Edge by default
// A/B testing without the lag
import { NextResponse } from "next/server";

export function middleware(request) {
  const response = NextResponse.next();

  // Assign user to a bucket (sticky via cookie)
  const bucket =
    request.cookies.get("ab-bucket")?.value ||
    (Math.random() < 0.5 ? "control" : "variant");

  response.cookies.set("ab-bucket", bucket);

  // Add header for downstream components
  response.headers.set("x-ab-bucket", bucket);

  return response;
}
```
No server round-trip needed. The edge handles it in milliseconds.
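One caveat: `Math.random()` only works above because the cookie makes the assignment sticky. If you have a stable user id, a hash-based bucket survives even a cleared cookie jar. A sketch, using the classic djb2 string hash (my choice of hash, not something the frameworks mandate):

```javascript
// Deterministic A/B bucketing: the same user id always lands in the
// same bucket, no cookie required. djb2 string hash, kept unsigned.
function bucketFor(userId, variants = ["control", "variant"]) {
  let hash = 5381;
  for (const ch of userId) {
    hash = ((hash << 5) + hash + ch.charCodeAt(0)) >>> 0; // hash * 33 + c
  }
  return variants[hash % variants.length];
}
```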
2. Personalization Based on Location
```javascript
// Showing different content for different regions
export async function GET(request) {
  const { geo } = request;
  const country = geo?.country || "US";

  const content = {
    US: {
      message: "Hey there, American friend! 🇺🇸",
      currency: "USD",
      showPromo: true,
    },
    GB: {
      message: "Cheerio, British buddy! 🇬🇧",
      currency: "GBP",
      showPromo: true,
    },
    JP: {
      message: "Konnichiwa, Japanese friend! 🇯🇵",
      currency: "JPY",
      showPromo: false,
    },
  }[country] || {
    message: "Hello, world traveler! 🌍",
    currency: "USD",
    showPromo: true,
  };

  return Response.json(content);
}
```

(Note the `GB`: geolocation headers use ISO 3166-1 country codes, and the United Kingdom is `GB`, not `UK`.)
3. API Request Proxying
```javascript
// Transform API responses at the edge
export async function GET(request) {
  // Fetch from your main API
  const response = await fetch("https://api.yourbackend.com/products");
  const data = await response.json();

  // Transform at the edge - reduce payload
  const minified = data.products.map(product => ({
    id: product.id,
    name: product.name,
    price: product.price,
  }));

  return Response.json(minified);
}
```
Node.js is Better For:
1. Complex Database Queries
```javascript
// Multiple database calls, transactions, etc.
export async function GET(request) {
  const userId = getUserId(request);

  // Transaction - requires a full database driver, i.e. Node.js
  const result = await db.$transaction(async tx => {
    const user = await tx.users.findUnique({
      where: { id: userId },
      include: { orders: true },
    });

    // Heavy computation
    const analytics = await calculateUserAnalytics(user);

    await tx.analytics.create({
      data: analytics,
    });

    return { user, analytics };
  });

  return Response.json(result);
}
```
2. File System Operations
```javascript
// Generate a PDF on the fly
import fs from "fs";
import PDFDocument from "pdfkit";

export async function POST(request) {
  const data = await request.json();

  // Needs file system access - Node.js only
  const doc = new PDFDocument();

  // Write to a temp file
  const filename = `/tmp/report-${Date.now()}.pdf`;
  const stream = fs.createWriteStream(filename);
  doc.pipe(stream);

  doc.fontSize(25).text("Your Report", 100, 100);
  doc.fontSize(12).text(JSON.stringify(data, null, 2));
  doc.end();

  // Wait for the write to complete
  await new Promise(resolve => stream.on("finish", resolve));

  // Read and return
  const file = fs.readFileSync(filename);
  return new Response(file, {
    headers: { "Content-Type": "application/pdf" },
  });
}
```
3. Third-party API Integration with Secrets
```javascript
// Using secrets that shouldn't be exposed to the client
import Stripe from "stripe";

const stripe = new Stripe(process.env.STRIPE_SECRET_KEY);

export async function POST(request) {
  const { email } = await request.json();

  // Stripe integration - needs the secret key
  const customer = await stripe.customers.create({
    email,
    name: request.headers.get("x-user-name"),
  });

  return Response.json({ customerId: customer.id });
}
```
The Cold Hard Truth: Limitations
Edge Runtime Can’t Do:
| Feature | Node.js | Edge |
|---|---|---|
| File system access | ✅ | ❌ |
| Child processes | ✅ | ❌ |
| `setTimeout` > 30s | ✅ | ❌ |
| Full npm packages | ✅ | ⚠️ Limited |
| WebSockets | ✅ | ⚠️ Cloudflare Workers only |
| Native modules | ✅ | ❌ |
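When a dependency mysteriously dies in production, it helps to confirm which runtime your code is actually in. A quick feature-sniffing sketch: `EdgeRuntime` is a global that Vercel’s Edge runtime exposes, while `process.versions.node` exists only in Node.

```javascript
// Detect the current runtime by feature-sniffing globals.
// `typeof` on an undeclared identifier is safe, so this never throws.
function detectRuntime() {
  if (typeof EdgeRuntime === "string") return "edge";
  if (typeof process !== "undefined" && process.versions?.node) return "node";
  return "unknown";
}
```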
The Bundle Size Problem
Edge functions have strict size limits:
- Vercel Edge Functions: 1-4MB compressed, depending on plan
- Cloudflare Workers: 1MB compressed on the free plan, 10MB on paid
That means you can’t just import your entire application:
```javascript
// DON'T do this in Edge ❌
import lodash from "lodash"; // 70KB+ minified!
import moment from "moment"; // Another 300KB with locales!

// DO this instead ✅
import { debounce, throttle } from "lodash-es"; // tree-shakeable
import dayjs from "dayjs"; // tiny alternative to moment
```
Real World Example: The E-commerce Scenario
Let’s say you’re building an e-commerce site. Here’s how different parts might run:
```javascript
// ─────────────────────────────────────────────
// EDGE: Product catalog (fast, cached)
// ─────────────────────────────────────────────
export const runtime = "edge";

export async function GET(request) {
  const products = await fetch("https://api.store.com/products").then(r =>
    r.json()
  );
  return Response.json(products);
}
```

```javascript
// ─────────────────────────────────────────────
// NODE: Checkout process (complex, secure)
// ─────────────────────────────────────────────
// No runtime specified = Node.js default
export async function POST(request) {
  const cart = await request.json();

  // Validate inventory
  const inventory = await db.$transaction(
    cart.items.map(item =>
      db.inventory.findUnique({ where: { sku: item.sku } })
    )
  );

  // Payment processing
  const payment = await stripe.paymentIntents.create({
    amount: cart.total,
    currency: "usd",
  });

  // Create order
  const order = await db.order.create({
    /* ... */
  });

  return Response.json({
    orderId: order.id,
    clientSecret: payment.client_secret,
  });
}
```
The Verdict
Here’s a quick decision guide:
| Scenario | Choose |
|---|---|
| Simple API with geo-personalization | Edge |
| A/B testing | Edge |
| Cached content delivery | Edge |
| Complex database queries | Node.js |
| File generation | Node.js |
| WebSocket connections | Node.js |
| Heavy computation | Node.js |
| Static content with light dynamic bits | Edge |
The Future is Hybrid
Here’s the beautiful part: you don’t have to choose one. Modern frameworks like Next.js 14+ let you mix and match:
```javascript
// Mix and match based on needs, per route
export const runtime = "edge"; // Fast path
// or
export const runtime = "nodejs"; // Full power (the default)
```
Your frontend can have:
- Edge API routes for personalization and caching
- Node API routes for complex operations
- Edge Middleware for routing and A/B testing
- Client-side code for interactivity
It’s like having both a sports car and an SUV in your garage. Different tools for different jobs.
So the next time you’re about to hit “Deploy,” pause and ask yourself: “Does my code really need a full kitchen, or would a microwave do?”
The answer might just make your users (and your wallet) very happy.
Happy coding, and may your cold starts be ever zero.