Start with edge runtimes (minimize tail latency)
If your functions need to run as close to the user as possible, cutting round-trip latency from roughly 100 ms to under 10 ms, Cloudflare Workers runs your logic at 200+ edge locations worldwide without cold starts. Workers is the right choice for request manipulation, authentication at the edge, A/B testing, and personalization, where the latency difference is user-visible. The constraint is the V8 isolate runtime: Workers does not run full Node.js, so packages that rely on native modules or Node.js built-ins need alternatives.
- Recommendation: Cloudflare Workers, Fastly Compute
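The A/B testing use case above can be sketched as a Worker that buckets each visitor deterministically, so no state needs to be stored at the edge. This is a minimal sketch, not a canonical Cloudflare pattern: `assignVariant` and the cookie name are invented for illustration, and the `fetch` handler omits the `env` and `ctx` parameters a real Worker would receive. The `cf-connecting-ip` header is a real Cloudflare-added request header.

```typescript
// Hypothetical cookie name for persisting the assigned variant.
const VARIANT_COOKIE = "ab-variant";

// FNV-1a 32-bit hash: cheap, deterministic, and adequate for 50/50 bucketing.
function fnv1a(input: string): number {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193);
  }
  return hash >>> 0; // coerce to unsigned 32-bit
}

// The same visitor ID always lands in the same bucket, with no lookup table.
export function assignVariant(visitorId: string): "a" | "b" {
  return fnv1a(visitorId) % 2 === 0 ? "a" : "b";
}

// Workers-style module handler (simplified signature for this sketch).
export default {
  async fetch(request: Request): Promise<Response> {
    const visitorId = request.headers.get("cf-connecting-ip") ?? "anonymous";
    const variant = assignVariant(visitorId);
    return new Response(`variant ${variant}`, {
      headers: { "set-cookie": `${VARIANT_COOKIE}=${variant}; Path=/` },
    });
  },
};
```

Because bucketing is a pure function of the visitor ID, every edge location computes the same answer independently, which is what makes this pattern a fit for a runtime with no shared local state between locations.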
Recommended starting points
Based on your constraints, these products typically fit best. Read each decision brief to confirm that pricing behavior and limits match your actual workload.
Cloudflare Workers
Edge-first serverless runtime optimized for low-latency request/response compute near users, commonly used for middleware and edge API logic.
Fastly Compute
Edge compute runtime designed for performance-sensitive request handling and programmable networking patterns near users.
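The Node.js constraint called out for Workers is partly addressable in configuration: Workers exposes a `nodejs_compat` compatibility flag that polyfills a subset of Node built-ins. A minimal `wrangler.toml` sketch follows; the project name, entry point, and date are placeholder values, and coverage of Node APIs under the flag should be checked against Cloudflare's documentation for your specific dependencies.

```toml
# wrangler.toml — minimal sketch with placeholder values
name = "edge-ab-test"
main = "src/index.ts"
compatibility_date = "2024-09-23"

# Opts in to the Node.js compatibility layer, which provides a subset of
# Node built-ins (e.g. node:buffer). Packages that require native addons
# still will not run in the V8 isolate runtime.
compatibility_flags = ["nodejs_compat"]
```

If a dependency still fails under the flag, the usual options are swapping in a Web-API-based alternative or moving that code path off the edge.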