How Dynamic Workflows Bring Durable Execution to Multi-Tenant Platforms

From Static Deployments to Dynamic, Per-Tenant Workflows

When Cloudflare first launched Workers eight years ago, it was a direct-to-developer platform. Over time, the ecosystem expanded so that platforms could not only build on Workers but also enable their customers to ship code through multi-tenant applications. Today, we see AI applications where users describe what they want and the AI writes the implementation, multi-tenant SaaS where every customer’s business logic is runtime TypeScript the platform has never seen before, agents that write and run their own tools, and CI/CD products where every repository defines its own pipeline.

Source: blog.cloudflare.com

This shift exposed a critical gap: traditional durable execution engines assume the workflow code is part of a single deployment. For example, Cloudflare Workflows requires that you declare a single class bound in your wrangler.jsonc—one binding, one class, per deploy. That works fine if you own all the code, but it breaks the moment you want to let your customer ship their own workflow. Whether you’re building an AI app platform that generates TypeScript per tenant, a CI/CD system with unique pipelines per repo, or an agent SDK where each agent writes its own durable plan, the workflow is different for every tenant, agent, or request. There is no single class to bind.
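For context, the static model looks roughly like this in wrangler.jsonc (abbreviated; field names per Cloudflare's Workflows configuration docs): a single `class_name` is fixed at deploy time, so every tenant shares the same workflow code.

```jsonc
{
  "name": "my-worker",
  "main": "src/index.ts",
  "workflows": [
    {
      // One binding, one class, baked in at deploy time —
      // there is no way to point this at per-tenant code.
      "name": "my-workflow",
      "binding": "MY_WORKFLOW",
      "class_name": "MyWorkflow"
    }
  ]
}
```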

This is the same shape of problem that Dynamic Workers solved for compute and that Durable Object Facets solved for storage. Now, with Dynamic Workflows, we bridge durable execution and dynamic deployment.

Last month, we shipped the Dynamic Workers open beta, giving platforms a clean primitive for compute: hand the Workers runtime some code at runtime, and get back an isolated, sandboxed Worker on the same machine in single-digit milliseconds. Durable Object Facets extended the same idea to storage—each dynamically-loaded app can have its own SQLite database, spun up on demand, with the platform sitting in front as a supervisor. Artifacts provided a Git-native, versioned filesystem you can create by the tens of millions—one per agent, one per session, one per tenant.

We already had dynamic deployment for compute, storage, and source control. The missing piece was durable execution that follows the tenant.

Bridging the Gap: Dynamic Workflows

Cloudflare Workflows is our durable execution engine. It turns a run(event, step) function into a program where every step survives failures, can sleep for hours or days, can wait for external events, and resumes exactly where it left off when the isolate is recycled. It’s the right primitive for anything that has to “keep going” past a single request: onboarding flows, video transcoding pipelines, multi-stage billing, long-running agent loops, and—as of Workflows V2—up to 50,000 concurrent instances and 300 new instances per second per account, redesigned for the agentic era.
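The replay property is the heart of durable execution: completed step results are persisted, so when a workflow restarts, it skips finished steps instead of redoing them. Here is a toy model of that mechanic (illustrative only, not the Workflows runtime — `Step`, `StepLog`, and `demo` are names invented for this sketch):

```typescript
// Toy model of durable replay: each step's result is persisted in a log,
// so re-running the same workflow after a crash re-executes nothing that
// already completed.
type StepLog = Map<string, unknown>;

class Step {
  constructor(private log: StepLog, public executed: string[] = []) {}

  // Return the persisted result if this step already ran;
  // otherwise run fn, persist its result, and return it.
  async do<T>(name: string, fn: () => Promise<T>): Promise<T> {
    if (this.log.has(name)) return this.log.get(name) as T;
    const result = await fn();
    this.log.set(name, result);
    this.executed.push(name);
    return result;
  }
}

// A workflow body in the run(event, step) shape.
async function run(event: { userId: string }, step: Step): Promise<string> {
  const profile = await step.do("fetch-profile", async () => `profile:${event.userId}`);
  return step.do("send-email", async () => `emailed:${profile}`);
}

// First run executes both steps; a "restarted" run against the same log
// replays entirely from persisted results.
async function demo(): Promise<[string[], string[]]> {
  const log: StepLog = new Map();
  const first = new Step(log);
  await run({ userId: "42" }, first);
  const second = new Step(log); // simulated restart after a crash
  await run({ userId: "42" }, second);
  return [first.executed, second.executed];
}
```

In the real engine the log lives in durable storage rather than a Map, which is what lets a workflow sleep for days or survive an isolate being recycled.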

But until now, Workflows assumed the workflow code was part of your deployment. Dynamic Workflows removes that assumption. It allows you to hand the runtime a workflow definition at runtime—per tenant, per session, per agent—and receive a fully isolated, durable execution environment backed by its own SQLite database and filesystem.


How It Works

When a platform calls Dynamic Workflows, it passes TypeScript code that defines the workflow—including steps, timeouts, and event handlers. The system creates a new workflow instance bound to a unique tenant or session. That instance gets its own isolated compute (via Dynamic Workers), its own SQLite database (via Durable Object Facets), and its own versioned filesystem (via Artifacts). The result is a fully self-contained, durable execution unit that follows the tenant across failures and restarts.

Because the worker and storage are dynamically provisioned, there is no class to pre-deploy. The platform simply supplies the workflow code at runtime, and Cloudflare handles the isolation, scaling, and durability.
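The "code arrives at runtime" flow can be sketched in miniature. This is not the Dynamic Workflows API (all names here are hypothetical), and `new Function` merely stands in for handing source to the runtime — the real system compiles tenant code into a sandboxed V8 isolate with its own storage:

```typescript
// Toy model: per-tenant workflow source is received at runtime,
// compiled into a callable, and executed against a per-tenant step
// context. NOT the real API — illustrative names throughout.
type StepFn = <T>(name: string, fn: () => T) => T;

function makeStep(log: Map<string, unknown>): StepFn {
  // Memoize step results per tenant, mirroring durable replay.
  return <T>(name: string, fn: () => T): T => {
    if (!log.has(name)) log.set(name, fn());
    return log.get(name) as T;
  };
}

function loadTenantWorkflow(source: string): (step: StepFn) => unknown {
  // Stand-in for dynamic loading; the real runtime isolates this code.
  return new Function("step", source) as (step: StepFn) => unknown;
}

function runDynamic(tenantSource: string): unknown {
  const log = new Map<string, unknown>(); // per-tenant state
  return loadTenantWorkflow(tenantSource)(makeStep(log));
}
```

Each call to `runDynamic` gets its own log, which is the toy analogue of every dynamic workflow receiving its own SQLite database and filesystem.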

Key Use Cases

The pattern shows up anywhere tenant code must outlive a single request: AI app platforms that generate TypeScript per tenant and need each generated app's background jobs to survive failures; CI/CD products where every repository defines its own multi-stage pipeline; agent SDKs where each agent writes and runs its own durable plan; and multi-tenant SaaS where every customer's business logic is runtime code the platform has never seen before.

The Impact on Platform Builders

Dynamic Workflows completes the trio of dynamic primitives: compute (Dynamic Workers), storage (Durable Object Facets), and source control (Artifacts). Now, durable execution—the ability to run long-lived, fault-tolerant processes—is also per-tenant. This means platforms can offer their customers the full power of Cloudflare’s distributed edge without requiring them to manage infrastructure or pre-deploy code.

For platform builders, this unlocks new business models: charge per workflow instance, per tenant, or per step. Because everything is isolated, billing and metering become straightforward. And because the workflow engine is built into the same runtime that handles compute and storage, latency remains low.
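Because each tenant's workflow is isolated, metering can be as simple as counting step executions per tenant. A minimal sketch of per-step billing (all names are invented for illustration, not a Cloudflare API):

```typescript
// Per-tenant step metering: record each step execution, then bill at a
// flat rate per step. Illustrative only.
class Meter {
  private counts = new Map<string, number>();

  // Called once per executed step for the given tenant.
  record(tenant: string): void {
    this.counts.set(tenant, (this.counts.get(tenant) ?? 0) + 1);
  }

  stepsFor(tenant: string): number {
    return this.counts.get(tenant) ?? 0;
  }

  // e.g. a flat price per step, in cents.
  billFor(tenant: string, centsPerStep: number): number {
    return this.stepsFor(tenant) * centsPerStep;
  }
}
```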

Dynamic Workflows is currently in open beta. You can try it from the Cloudflare Workers dashboard or via the API, and we look forward to seeing what platforms build with this new capability.
