Cloudflare recently released Dynamic Workers into open beta, available to all paid Workers users. The API allows a Cloudflare Worker to instantiate a new Worker at runtime with dynamically specified code, each running in its own isolated sandbox. The feature is aimed squarely at the growing need to safely execute AI-generated code, a challenge that most teams currently address with containers.

The core pitch is performance. Rather than Linux containers, Dynamic Workers use V8 isolates, the lightweight sandboxes provided by the JavaScript engine that powers Google Chrome and has underpinned the Cloudflare Workers platform for the past eight years. According to Cloudflare, isolates start in a few milliseconds and use a few megabytes of memory, making them roughly 100x faster to boot and 10-100x more memory efficient than typical containers. Kenton Varda, Sunil Pai, and Ketan Gupta wrote in the announcement blog post:

That means that if you want to start a new isolate for every user request, on-demand, to run one snippet of code, then throw it away, you can.

The feature builds on Cloudflare’s Code Mode concept, introduced in September 2025, which proposes that AI agents should perform tasks by writing and executing code against typed APIs rather than making sequential tool calls. Cloudflare has previously demonstrated that converting an MCP server into a TypeScript API and having agents write code against it can reduce token usage by 81% compared to traditional tool-calling patterns. Their own Cloudflare MCP server uses this approach to expose the entire Cloudflare API through just two tools in under 1,000 tokens.
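The token savings come from keeping intermediate data inside the sandbox: instead of routing every tool result back through the model, the agent emits one script and only the final answer returns. The sketch below illustrates the pattern; the `CloudflareApi` surface here is an assumption for illustration, not Cloudflare's actual binding.

```typescript
type Zone = { id: string; name: string; paused: boolean };

// Assumed typed API surface of the kind Code Mode exposes to agent code;
// the real Cloudflare MCP binding differs.
interface CloudflareApi {
  zones: { list(): Promise<Zone[]> };
}

// The kind of snippet an agent might emit: one script replaces what would
// otherwise be a list call, a filter step, and a summarize step, each
// round-tripping through the model as a separate tool call.
async function agentScript(api: CloudflareApi): Promise<string[]> {
  const zones = await api.zones.list();
  // Intermediate data never reaches the model; only this result does.
  return zones.filter((z) => !z.paused).map((z) => z.name);
}
```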

A notable design choice is using TypeScript interfaces rather than OpenAPI specifications to define the APIs available to agent-generated code. The blog post includes a side-by-side comparison: a chat room API expressed as a TypeScript interface takes roughly 15 lines, while the equivalent OpenAPI spec runs to over 60 lines of YAML. Cloudflare argues that TypeScript interfaces are more token-efficient for LLM consumption and easier to reason about for both agents and developers.
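A sketch of what that comparison looks like on the TypeScript side, with illustrative names rather than the blog post's exact example: the whole contract fits in a handful of lines, where the OpenAPI equivalent would need paths, parameter declarations, and response schemas for each operation.

```typescript
interface ChatMessage {
  user: string;
  text: string;
  timestamp: number;
}

// A chat room API expressed as a TypeScript interface, of the kind the
// post compares against an OpenAPI spec. Names are illustrative.
interface ChatRoomApi {
  // Post a message and return its server-assigned ID.
  sendMessage(user: string, text: string): Promise<string>;
  // Return up to `limit` most recent messages, newest first.
  listMessages(limit: number): Promise<ChatMessage[]>;
}
```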

Dynamic Workers connect to host APIs through Cap’n Web RPC bridges that operate transparently across the security boundary. The sandbox can also intercept outbound HTTP requests for credential injection, adding auth tokens on the way out so the agent code never sees secret credentials directly.
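The credential-injection idea can be modeled as a wrapped fetch handed to the sandbox, although the real mechanism is a platform-level hook rather than application code. In this sketch the token lives in a closure on the host side of the boundary, so the agent code only ever holds the wrapper.

```typescript
// Illustrative model of outbound credential injection, not Cloudflare's
// actual API: the host wraps fetch so auth headers are added on the way
// out and the sandboxed code never sees the secret.
function makeSandboxFetch(
  secretToken: string,
  fetchImpl: typeof fetch = fetch,
): (url: string, init?: RequestInit) => Promise<Response> {
  return async (url, init = {}) => {
    // The token is closed over here, outside the sandbox; agent code
    // receives only this function.
    const headers = new Headers(init.headers);
    headers.set("Authorization", `Bearer ${secretToken}`);
    return fetchImpl(url, { ...init, headers });
  };
}
```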

The ephemeral nature of isolates also carries a security advantage over warm container pools. Teams that keep containers alive to avoid cold-start delays often end up reusing them across multiple tasks, weakening the isolation between agent executions. Because isolates are cheap enough to create and destroy per request, that temptation disappears.

Dynamic Workers support two loading modes: load() for one-time execution of agent-generated code, and get(), which caches a Worker by ID so it can stay warm across requests, making the feature applicable to longer-lived application workloads as well.
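The distinction between the two modes can be shown with a toy in-memory loader. This is purely a model of the described semantics, not the Dynamic Workers binding itself: load() evaluates fresh code and retains nothing, while get() caches by ID so repeat requests hit a warm instance.

```typescript
type WorkerModule = { fetch: (req: string) => string };

// Toy model of the two loading modes; names mirror the article's
// description but the real binding's signatures differ.
class ToyLoader {
  private cache = new Map<string, WorkerModule>();

  // One-shot execution: compile a fresh instance, keep no reference.
  load(code: string): WorkerModule {
    return new Function(`return (${code})`)() as WorkerModule;
  }

  // Cached by ID: the same instance stays warm across requests.
  get(id: string, code: string): WorkerModule {
    let mod = this.cache.get(id);
    if (!mod) {
      mod = this.load(code);
      this.cache.set(id, mod);
    }
    return mod;
  }
}
```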

Cloudflare acknowledges, however, that isolate-based sandboxing presents a more complex attack surface than hardware virtual machines, noting that V8 security bugs are more common than hypervisor vulnerabilities. Their mitigation strategy includes automatic deployment of V8 security patches to production within hours, a custom second-layer sandbox with dynamic risk-based tenant cordoning, hardware-level protections using MPK (Memory Protection Keys), and novel Spectre defenses developed in collaboration with academic researchers.

Alongside the open beta, Cloudflare released several supporting libraries. @cloudflare/codemode simplifies running model-generated code against AI tools using Dynamic Workers. @cloudflare/worker-bundler handles npm dependency resolution and bundling at runtime. @cloudflare/shell provides a virtual filesystem with transactional batch writes, persistent storage backed by SQLite and R2, and coarse-grained operations designed to minimize RPC round-trips from agent code.
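The rationale for coarse-grained operations is that every call from sandboxed agent code crosses an RPC boundary, so batching many file writes into one call amortizes the round-trip. The interface below is a hypothetical shape suggested by that description, not @cloudflare/shell's actual API, with a minimal in-memory stand-in to make it concrete.

```typescript
type WriteOp = { path: string; data: string };

// Hypothetical shape of a coarse-grained virtual filesystem: one RPC
// carries many operations instead of one round-trip per file.
interface VirtualFs {
  // Apply all writes together in a single call.
  batchWrite(ops: WriteOp[]): Promise<void>;
  read(path: string): Promise<string | undefined>;
}

// Minimal in-memory stand-in so the sketch is self-contained.
class MemoryFs implements VirtualFs {
  private files = new Map<string, string>();
  async batchWrite(ops: WriteOp[]): Promise<void> {
    for (const op of ops) this.files.set(op.path, op.data);
  }
  async read(path: string): Promise<string | undefined> {
    return this.files.get(path);
  }
}
```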

Zite, an app platform where users build CRUD applications through a chat interface, is already using Dynamic Workers in production, reporting millions of daily execution requests.

The launch positions Cloudflare on one side of an emerging architectural divide in AI agent infrastructure. Some platforms are investing in long-lived agent environments with persistent memory and heavier runtimes, while Cloudflare is betting that a large class of agent workloads, particularly high-volume, web-facing systems, are better served by execution layers as ephemeral as the requests themselves. Whether that split hardens into distinct market segments or converges remains an open question.

Dynamic Workers are priced at $0.002 per unique Worker loaded per day, on top of the standard Workers CPU and invocation pricing. The per-load charge is waived during the beta period. The primary constraint compared to containers is language support: while Workers technically support Python and WebAssembly, JavaScript is the practical choice for on-demand agent-generated code due to faster load times. Cloudflare frames this as a non-issue, arguing that LLMs are fluent in JavaScript and that the language’s web-native sandboxing design makes it the right fit for the job.

Dynamic Workers are available now to all users on the Workers Paid plan.