
Dynamic Workflows: Durable Execution Customized Per Tenant

Last updated: 2026-05-03 00:12:52 · Cloud Computing

Cloudflare's platform has evolved from a direct-to-developer service to a multi-tenant ecosystem where platforms can let their own customers run code seamlessly. With the introduction of Dynamic Workers, Durable Object Facets, and Artifacts, Cloudflare provided dynamic compute, storage, and source control. Now, Dynamic Workflows bridges the gap between durable execution and dynamic deployment, enabling each tenant to have their own long-running workflow without sharing code. This Q&A explores how Dynamic Workflows works, why it was needed, and how it empowers platforms to offer customizable, resilient processes to their users.

What are Dynamic Workflows and why were they introduced?

Dynamic Workflows extends Cloudflare's durable execution engine to support per-tenant workflow definitions. Previously, Workflows required binding a fixed class in your deployment, meaning all workflows in an application ran the same code. But platforms such as AI-driven apps or CI/CD systems need each tenant to have its own workflow logic. Dynamic Workflows solves this by letting you hand the engine a different function for each tenant at runtime, ensuring every workflow is isolated, sandboxed, and durable. This bridges the gap between dynamic deployment (where code is unknown at deploy time) and durable execution (where steps survive failures and long waits), making it possible to offer customizable, resilient processes to every user.
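This excerpt does not spell out the exact Dynamic Workflows API, so the sketch below models the core idea with placeholder names (`runTenantWorkflow`, `Step`, `TenantWorkflow` are illustrative, not Cloudflare's actual exports): the workflow body is a value chosen per tenant at runtime, rather than a class fixed at deploy time.

```typescript
// Conceptual sketch only: `Step` and `runTenantWorkflow` are placeholder
// names, not Cloudflare's real API. The point is that the workflow body
// is selected per tenant at runtime, not pre-bound at deploy time.
interface Step {
  do<T>(name: string, fn: () => Promise<T>): Promise<T>;
}

type TenantWorkflow = (event: { payload: unknown }, step: Step) => Promise<string>;

// A trivially simple in-process "engine" that runs whatever function it is handed.
async function runTenantWorkflow(wf: TenantWorkflow, payload: unknown): Promise<string> {
  const step: Step = { do: async (_name, fn) => fn() };
  return wf({ payload }, step);
}

// Two tenants, two completely different workflow bodies.
const tenantA: TenantWorkflow = async (event, step) =>
  step.do("greet", async () => `hello ${event.payload}`);
const tenantB: TenantWorkflow = async (event, step) =>
  step.do("shout", async () => String(event.payload).toUpperCase());
```

In the real engine, each such function would additionally run in its own sandboxed Worker with durable step state; here the mock only demonstrates the dispatch shape.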

Dynamic Workflows: Durable Execution Customized Per Tenant
Source: blog.cloudflare.com

How did Cloudflare's platform evolve to require dynamic deployment?

When Workers launched eight years ago, it was a direct-to-developer platform. As the ecosystem grew, Cloudflare enabled platforms to build on Workers and then allow their own customers to ship code—creating multi-tenant applications. Use cases include AI agents that write and run their own tools, SaaS products where each customer's logic is runtime TypeScript, and CI/CD pipelines defined per repository. This shift meant that code could no longer be fixed at deploy time; it had to be dynamic. The Dynamic Workers open beta gave these platforms a primitive for compute: passing code at runtime to get an isolated Worker instantly. Durable Object Facets extended that to storage, and Artifacts to source control. Dynamic Workflows now completes the picture for durable execution.

What limitations did the original Workflows have for multi-tenant scenarios?

The original Cloudflare Workflows assumed that workflow code is part of your deployment—a single class bound in wrangler.jsonc. That works well when you own all the code, but breaks down when you want to let your customers ship their own workflows. For example, an app platform where AI writes TypeScript for each tenant, or a CI/CD product where every repository has a unique pipeline, cannot have one fixed class. The code is different for every tenant, agent, or session. This is exactly the problem that Dynamic Workers solved for compute, and Dynamic Workflows now solves for durable execution—allowing each instance to use a dynamically provided function instead of a pre-bound class.
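For contrast, this is roughly what the fixed, deploy-time model looks like. The binding shape follows Cloudflare's documented Workflows setup (a class name referenced from wrangler.jsonc), but the types below are minimal local stand-ins rather than the real `cloudflare:workers` imports, so treat it as an illustrative sketch:

```typescript
// Minimal stand-ins for the types exported by "cloudflare:workers",
// kept local so this sketch is self-contained and runnable.
interface WorkflowEvent<P> { payload: P }
interface WorkflowStep {
  do<T>(name: string, fn: () => Promise<T>): Promise<T>;
}

// The pre-dynamic model: one class, fixed at deploy time, referenced
// from wrangler.jsonc roughly like:
//   "workflows": [{ "name": "orders",
//                   "binding": "ORDER_WORKFLOW",
//                   "class_name": "OrderWorkflow" }]
class OrderWorkflow {
  async run(event: WorkflowEvent<{ orderId: string }>, step: WorkflowStep): Promise<string> {
    // Every instance of this workflow runs this exact same code.
    return step.do("charge", async () => `charged:${event.payload.orderId}`);
  }
}
```

One class means one behavior for every instance, which is precisely the constraint Dynamic Workflows removes.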

Dynamic Workers provide the compute primitive: hand the Workers runtime some TypeScript at runtime and get an isolated, sandboxed Worker in milliseconds. Durable Object Facets extend that idea to storage, giving each dynamically-loaded app its own SQLite database on demand, with the platform acting as a supervisor. Artifacts offer a Git-native, versioned filesystem that can be created by the tens of millions per agent, session, or tenant. Dynamic Workflows joins this trio by adding durable execution—long-running processes that survive failures, sleep for hours, wait for events, and resume exactly where they left off. Together, these four components allow platforms to deploy full, dynamic applications with compute, storage, version control, and workflows all per tenant.


What are the key technical capabilities of Dynamic Workflows?

Dynamic Workflows inherits the durability of the original Workflows engine: a run(event, step) function becomes a program in which every step survives failures, can sleep for hours or days, can wait for external events, and resumes exactly where it left off even if the isolate is recycled. Key improvements include support for up to 50,000 concurrent instances and 300 new instances per second per account, redesigned for agentic-era workloads. The dynamic aspect means the workflow code is not fixed at deploy time but is provided per invocation. This lets each tenant have its own unique business logic while still benefiting from durable execution, automatic retries, and state management.
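The durability guarantee—completed steps are not re-executed when a workflow resumes—can be modeled with a checkpoint store. This is a toy in-memory sketch of the idea, not the engine's real persistence layer (which stores checkpoints durably, not in a `Map`):

```typescript
// Toy model of step-level durability: results of completed steps are
// checkpointed, so re-running the same workflow after a "crash" skips
// work that already finished. A Map stands in for durable storage.
type Checkpoints = Map<string, unknown>;

function makeStep(checkpoints: Checkpoints, log: string[]) {
  return {
    async do<T>(name: string, fn: () => Promise<T>): Promise<T> {
      if (checkpoints.has(name)) return checkpoints.get(name) as T; // replay: skip
      const result = await fn();
      checkpoints.set(name, result); // checkpoint before moving on
      log.push(name);                // record that the body actually ran
      return result;
    },
  };
}

async function runOnce(checkpoints: Checkpoints, log: string[], crash: boolean) {
  const step = makeStep(checkpoints, log);
  const a = await step.do("fetch-input", async () => 2);
  if (crash && !checkpoints.has("compute")) throw new Error("simulated crash");
  return step.do("compute", async () => a * 21);
}
```

Running `runOnce` once with `crash` set and again without it demonstrates the resume behavior: the second run replays "fetch-input" from the checkpoint and only executes "compute", so each step body runs exactly once across both attempts.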

What real-world use cases can Dynamic Workflows enable?

Dynamic Workflows is ideal for any multi-tenant platform that needs per-user long-running processes. Examples include:

- AI application platforms, where an AI writes TypeScript code for each tenant and that code defines the workflow logic.
- CI/CD products, where each repository defines its own pipeline; each pipeline becomes a durable workflow that can run builds, tests, and deployments.
- Agent SDKs, where each agent creates its own durable plan of actions.
- Multi-stage billing systems that vary per customer.
- Video transcoding pipelines with different configurations per user.

In all these cases, the workflow code differs per entity, and Dynamic Workflows ensures that each one is isolated, durable, and can run for extended periods without shared-state conflicts.

How does Dynamic Workflows ensure isolation and security for different tenants?

Dynamic Workflows builds on the isolation guarantees of Dynamic Workers. Each workflow instance runs in its own sandboxed Worker, on the same machine, with single-digit-millisecond startup time. The workflow code is separate from the platform's own code and from other tenants' workflows. Storage is also isolated via Durable Object Facets, giving each tenant its own SQLite database. The platform sits in front as a supervisor, enforcing resource limits and policy. Additionally, because code is provided at runtime and not shared, one tenant's workflow cannot interfere with another's. This design ensures strong multi-tenant security while maintaining the performance and durability of the underlying engine.