diff --git a/src/content/docs/agents/model-context-protocol/mcp-servers-for-cloudflare.mdx b/src/content/docs/agents/model-context-protocol/mcp-servers-for-cloudflare.mdx index 4f8bfe21a89fb08..d3f611a495d6ffd 100644 --- a/src/content/docs/agents/model-context-protocol/mcp-servers-for-cloudflare.mdx +++ b/src/content/docs/agents/model-context-protocol/mcp-servers-for-cloudflare.mdx @@ -7,13 +7,86 @@ sidebar: order: 8 --- -Cloudflare runs a catalog of managed remote MCP Servers which you can connect to using OAuth on clients like [Claude](https://modelcontextprotocol.io/quickstart/user), [Windsurf](https://docs.windsurf.com/windsurf/cascade/mcp), our own [AI Playground](https://playground.ai.cloudflare.com/) or any [SDK that supports MCP](/agents/api-reference/mcp-client-api/). +Cloudflare runs a catalog of managed remote MCP servers which you can connect to using OAuth on clients like [Claude](https://modelcontextprotocol.io/quickstart/user), [Windsurf](https://docs.windsurf.com/windsurf/cascade/mcp), our own [AI Playground](https://playground.ai.cloudflare.com/) or any [SDK that supports MCP](https://github.com/cloudflare/agents/tree/main/packages/agents/src/mcp). -These MCP servers allow your MCP Client to read configurations from your account, process information, make suggestions based on data, and even make those suggested changes for you. All of these actions can happen across Cloudflare's many services including application development, security and performance. They support both the `streamable-http` transport via `/mcp` and the `sse` transport (deprecated) via `/sse`. +These MCP servers allow your MCP client to read configurations from your account, process information, make suggestions based on data, and even make those suggested changes for you. All of these actions can happen across Cloudflare's many services including application development, security and performance. 
They support both the `streamable-http` transport via `/mcp` and the `sse` transport (deprecated) via `/sse`. + +## Cloudflare API MCP server + +The [Cloudflare API MCP server](https://github.com/cloudflare/mcp) provides access to the entire [Cloudflare API](/api/) — over 2,500 endpoints across DNS, Workers, R2, Zero Trust, and every other product — through just two tools: `search()` and `execute()`. + +It uses [Codemode](/agents/api-reference/codemode/), a technique where the model writes JavaScript against a typed representation of the OpenAPI spec and the Cloudflare API client, rather than loading individual tool definitions for each endpoint. The generated code runs inside an isolated [Dynamic Worker](/workers/runtime-apis/bindings/worker-loader/) sandbox. + +This approach uses approximately 1,000 tokens regardless of how many API endpoints exist. An equivalent MCP server that exposed every endpoint as a native tool would consume over 1 million tokens — more than the entire context window of most foundation models. + +| Approach | Tools | Token cost | +| --------------------------------- | ----- | ---------- | +| Native MCP (full schemas) | 2,594 | ~1,170,000 | +| Native MCP (required params only) | 2,594 | ~244,000 | +| Codemode | 2 | ~1,000 | + +### Connect to the Cloudflare API MCP server + +Add the following configuration to your MCP client: + +```json +{ + "mcpServers": { + "cloudflare-api": { + "url": "https://mcp.cloudflare.com/mcp" + } + } +} +``` + +When you connect, you will be redirected to Cloudflare to authorize via OAuth and select the permissions to grant to your agent. + +For CI/CD or automation, you can create a [Cloudflare API token](https://dash.cloudflare.com/profile/api-tokens) with the permissions you need and pass it as a bearer token in the `Authorization` header. Both user tokens and account tokens are supported. + +For more information, refer to the [Cloudflare MCP repository](https://github.com/cloudflare/mcp). 
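
For a sense of what the token-authenticated flow looks like on the wire, here is a minimal sketch. The `buildMcpRequest` and `listTools` helpers are illustrative, not part of any Cloudflare SDK, and a production client would first perform the MCP `initialize` handshake before calling `tools/list`:

```js
// Build a JSON-RPC 2.0 request envelope, as used by the MCP
// streamable-http transport.
function buildMcpRequest(method, params, id = 1) {
	return { jsonrpc: "2.0", id, method, params };
}

// POST a single request to the server's /mcp endpoint, authenticating
// with a Cloudflare API token instead of OAuth.
async function listTools(apiToken) {
	const response = await fetch("https://mcp.cloudflare.com/mcp", {
		method: "POST",
		headers: {
			"Content-Type": "application/json",
			// Streamable HTTP servers may reply with plain JSON or an SSE stream.
			Accept: "application/json, text/event-stream",
			Authorization: `Bearer ${apiToken}`,
		},
		body: JSON.stringify(buildMcpRequest("tools/list", {})),
	});
	return response.json();
}
```

For the Cloudflare API MCP server, the returned tool list should contain just the `search()` and `execute()` tools described above.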
+
+### Install via agent and IDE plugins
+
+You can install the [Cloudflare Skills plugin](https://github.com/cloudflare/skills), which bundles the Cloudflare MCP servers alongside contextual skills and slash commands for building on Cloudflare. The plugin works with any agent that supports the Agent Skills standard, including Claude Code, OpenCode, OpenAI Codex, and Pi.
+
+#### Claude Code
+
+Install using the [plugin marketplace](https://code.claude.com/docs/en/discover-plugins#add-from-github):
+
+```txt
+/plugin marketplace add cloudflare/skills
+```
+
+#### Cursor
+
+Install from the **Cursor Marketplace**, or add manually via **Settings** > **Rules** > **Add Rule** > **Remote Rule (GitHub)** with `cloudflare/skills`.
+
+#### npx skills
+
+Install using the [`npx skills`](https://skills.sh) CLI:
+
+```sh
+npx skills add https://github.com/cloudflare/skills
+```
+
+#### Clone or copy
+
+Clone the [cloudflare/skills](https://github.com/cloudflare/skills) repository and copy the skill folders into the appropriate directory for your agent:
+
+| Agent        | Skill directory              | Docs                                                                                                 |
+| ------------ | ---------------------------- | ---------------------------------------------------------------------------------------------------- |
+| Claude Code  | `~/.claude/skills/`          | [Claude Code skills](https://code.claude.com/docs/en/skills)                                         |
+| Cursor       | `~/.cursor/skills/`          | [Cursor skills](https://cursor.com/docs/context/skills)                                              |
+| OpenCode     | `~/.config/opencode/skills/` | [OpenCode skills](https://opencode.ai/docs/skills/)                                                  |
+| OpenAI Codex | `~/.codex/skills/`           | [OpenAI Codex skills](https://developers.openai.com/codex/skills/)                                   |
+| Pi           | `~/.pi/agent/skills/`        | [Pi coding agent skills](https://github.com/badlogic/pi-mono/tree/main/packages/coding-agent#skills) |
+
+## Product-specific MCP servers
+
+In addition to the Cloudflare API MCP server, Cloudflare provides product-specific MCP servers for targeted use cases:

| Server Name                                                                                                             | Description                                                                                      | Server URL                                     |
| 
----------------------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------- | ---------------------------------------------- | -| [Cloudflare API server](https://github.com/cloudflare/mcp) | Access the full Cloudflare API (2500+ endpoints) via Code Mode | `https://mcp.cloudflare.com/mcp` | | [Documentation server](https://github.com/cloudflare/mcp-server-cloudflare/tree/main/apps/docs-vectorize) | Get up to date reference information on Cloudflare | `https://docs.mcp.cloudflare.com/mcp` | | [Workers Bindings server](https://github.com/cloudflare/mcp-server-cloudflare/tree/main/apps/workers-bindings) | Build Workers applications with storage, AI, and compute primitives | `https://bindings.mcp.cloudflare.com/mcp` | | [Workers Builds server](https://github.com/cloudflare/mcp-server-cloudflare/tree/main/apps/workers-builds) | Get insights and manage your Cloudflare Workers Builds | `https://builds.mcp.cloudflare.com/mcp` | @@ -31,4 +104,4 @@ These MCP servers allow your MCP Client to read configurations from your account | [GraphQL server](https://github.com/cloudflare/mcp-server-cloudflare/tree/main/apps/graphql/) | Get analytics data using Cloudflare's GraphQL API | `https://graphql.mcp.cloudflare.com/mcp` | | [Agents SDK Documentation server](https://github.com/cloudflare/agents/tree/main/site/agents) | Token-efficient search of the Cloudflare Agents SDK documentation | `https://agents.cloudflare.com/mcp` | -Check our [GitHub page](https://github.com/cloudflare/mcp-server-cloudflare) to know how to use Cloudflare's remote MCP servers with different MCP clients. +Check the [GitHub page](https://github.com/cloudflare/mcp-server-cloudflare) to learn how to use Cloudflare's remote MCP servers with different MCP clients. 
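
These product-specific servers use the same client configuration shape as the `cloudflare-api` entry shown earlier. For example, to connect to the Documentation server (the `cloudflare-docs` key name is arbitrary):

```json
{
	"mcpServers": {
		"cloudflare-docs": {
			"url": "https://docs.mcp.cloudflare.com/mcp"
		}
	}
}
```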
diff --git a/src/content/docs/workers/runtime-apis/bindings/worker-loader.mdx b/src/content/docs/workers/runtime-apis/bindings/worker-loader.mdx index 265efe7e7c1443a..e584215d05e5007 100644 --- a/src/content/docs/workers/runtime-apis/bindings/worker-loader.mdx +++ b/src/content/docs/workers/runtime-apis/bindings/worker-loader.mdx @@ -3,14 +3,9 @@ pcx_content_type: configuration title: Dynamic Worker Loaders head: [] description: The Dynamic Worker Loader API, which allows dynamically spawning isolates that run arbitrary code. - --- -import { - Type, - MetaInfo, - WranglerConfig -} from "~/components"; +import { Type, MetaInfo, WranglerConfig } from "~/components"; :::note[Dynamic Worker Loading is in closed beta] @@ -25,11 +20,20 @@ An isolate is like a lightweight container. [The Workers platform uses isolates Isolates are much cheaper than containers. You can start an isolate in milliseconds, and it's fine to start one just to run a snippet of code and immediately throw away. There's no need to worry about pooling isolates or trying to reuse already-warm isolates, as you would need to do with containers. Worker Loaders also enable **sandboxing** of code, meaning that you can strictly limit what the code is allowed to do. In particular: -* You can arrange to intercept or simply block all network requests made by the Worker within. -* You can supply the sandboxed Worker with custom bindings to represent specific resources which it should be allowed to access. + +- You can arrange to intercept or simply block all network requests made by the Worker within. +- You can supply the sandboxed Worker with custom bindings to represent specific resources which it should be allowed to access. With proper sandboxing configured, you can safely run code you do not trust in a dynamic isolate. +## Codemode + +A primary use case for Dynamic Worker Loaders is [Codemode](/agents/api-reference/codemode/) in the [Agents SDK](/agents/). 
Codemode converts your tools into typed TypeScript APIs and gives the LLM a single "write code" tool. The generated code runs in an isolated Worker sandbox, which lets AI agents chain multiple tool calls in one execution and reduces round-trips through the model. + +Codemode works with both standard AI SDK tools and [MCP](/agents/model-context-protocol/) tools. + +## Basic usage + A Worker Loader is a binding with just one method, `get()`, which loads an isolate. Example usage: ```js @@ -37,35 +41,35 @@ let id = "foo"; // Get the isolate with the given ID, creating it if no such isolate exists yet. let worker = env.LOADER.get(id, async () => { - // If the isolate does not already exist, this callback is invoked to fetch - // the isolate's Worker code. - - return { - compatibilityDate: "2025-06-01", - - // Specify the worker's code (module files). - mainModule: "foo.js", - modules: { - "foo.js": - "export default {\n" + - " fetch(req, env, ctx) { return new Response('Hello'); }\n" + - "}\n", - }, - - // Specify the dynamic Worker's environment (`env`). This is specified - // as a JavaScript object, exactly as you want it to appear to the - // child Worker. It can contain basic serializable types as well as - // Service Bindings (see below). - env: { - SOME_ENV_VAR: 123 - }, - - // To block the worker from talking to the internet using `fetch()` or - // `connect()`, set `globalOutbound` to `null`. You can also set this - // to any service binding, to have calls be intercepted and redirected - // to that binding. - globalOutbound: null, - }; + // If the isolate does not already exist, this callback is invoked to fetch + // the isolate's Worker code. + + return { + compatibilityDate: "2025-06-01", + + // Specify the worker's code (module files). + mainModule: "foo.js", + modules: { + "foo.js": + "export default {\n" + + " fetch(req, env, ctx) { return new Response('Hello'); }\n" + + "}\n", + }, + + // Specify the dynamic Worker's environment (`env`). 
This is specified + // as a JavaScript object, exactly as you want it to appear to the + // child Worker. It can contain basic serializable types as well as + // Service Bindings (see below). + env: { + SOME_ENV_VAR: 123, + }, + + // To block the worker from talking to the internet using `fetch()` or + // `connect()`, set `globalOutbound` to `null`. You can also set this + // to any service binding, to have calls be intercepted and redirected + // to that binding. + globalOutbound: null, + }; }); // Now you can get the Worker's entrypoint and send requests to it. @@ -75,7 +79,7 @@ await defaultEntrypoint.fetch("http://example.com"); // You can get non-default entrypoints as well, and specify the // `ctx.props` value to be delivered to the entrypoint. let someEntrypoint = worker.getEntrypoint("SomeEntrypointClass", { - props: {someProp: 123} + props: { someProp: 123 }, }); ``` @@ -89,9 +93,9 @@ To add a dynamic worker loader binding to your worker, add it to your Wrangler c { "worker_loaders": [ { - "binding": "LOADER" - } - ] + "binding": "LOADER", + }, + ], } ``` @@ -101,7 +105,11 @@ To add a dynamic worker loader binding to your worker, add it to your Wrangler c ### `get` -get(id , getCodeCallback ): + + get(id , getCodeCallback{" "} + + ): + Loads a Worker with the given ID, returning a `WorkerStub` which may be used to invoke the Worker. @@ -143,12 +151,12 @@ A dictionary object mapping module names to their string contents. If the module A module's content can also be specified as an object, in order to specify its type independent from the name. The allowed objects are: -* `{js: string}`: A JavaScript module, using ES modules syntax for imports and exports. -* `{cjs: string}`: A CommonJS module, using `require()` syntax for imports. -* `{py: string}`: A [Python module](/workers/languages/python/), but see the warning below. -* `{text: string}`: An importable string value. -* `{data: ArrayBuffer}`: An importable `ArrayBuffer` value. 
-* `{json: object}`: An importable object. The value must be JSON-serializable. However, note that value is provided as a parsed object, and is delivered as a parsed object; neither side actually sees the JSON serialization.
+- `{js: string}`: A JavaScript module, using ES modules syntax for imports and exports.
+- `{cjs: string}`: A CommonJS module, using `require()` syntax for imports.
+- `{py: string}`: A [Python module](/workers/languages/python/), but see the warning below.
+- `{text: string}`: An importable string value.
+- `{data: ArrayBuffer}`: An importable `ArrayBuffer` value.
+- `{json: object}`: An importable object. The value must be JSON-serializable. However, note that the value is provided as a parsed object and delivered as a parsed object; neither side actually sees the JSON serialization.

 :::caution[Warning]

@@ -174,27 +182,27 @@ For example:

 import { WorkerEntrypoint } from "cloudflare:workers";

 export class Greeter extends WorkerEntrypoint {
-	fetch(request) {
-		return new Response(`Hello, ${this.ctx.props.name}!`);
-	}
+  fetch(request) {
+    return new Response(`Hello, ${this.ctx.props.name}!`);
+  }
 }

 export default {
-	async fetch(request, env, ctx) {
-		let worker = env.LOADER.get("alice", () => {
-			return {
-				// Redirect the worker's global outbound to send all requests
-				// to the `Greeter` class, filling in `ctx.props.name` with
-				// the name "Alice", so that it always responds "Hello, Alice!".
-				globalOutbound: ctx.exports.Greeter({props: {name: "Alice"}}),
-
-				// ... code ...
-			}
-		});
-
-		return worker.getEntrypoint().fetch(request);
-	}
-}
+  async fetch(request, env, ctx) {
+    let worker = env.LOADER.get("alice", () => {
+      return {
+        // Redirect the worker's global outbound to send all requests
+        // to the `Greeter` class, filling in `ctx.props.name` with
+        // the name "Alice", so that it always responds "Hello, Alice!".
+        globalOutbound: ctx.exports.Greeter({ props: { name: "Alice" } }),
+
+        // ... code ...
+ }; + }); + + return worker.getEntrypoint().fetch(request); + }, +}; ``` #### env @@ -205,8 +213,8 @@ Using this, you can provide custom bindings to the Worker. `env` is serialized and transferred into the dynamic Worker, where it is used directly as the value of `env` there. It may contain: -* [Structured clonable types](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Structured_clone_algorithm). -* [Service Bindings](/workers/runtime-apis/bindings/service-bindings), including [loopback bindings from `ctx.exports`](/workers/runtime-apis/context/#exports). +- [Structured clonable types](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Structured_clone_algorithm). +- [Service Bindings](/workers/runtime-apis/bindings/service-bindings), including [loopback bindings from `ctx.exports`](/workers/runtime-apis/context/#exports). The second point is the key to creating custom bindings: you can define a binding with any arbitrary API, by defining a [`WorkerEntrypoint` class](/workers/runtime-apis/bindings/service-bindings/rpc) implementing an RPC API, and then giving it to the dynamic Worker as a Service Binding. @@ -217,28 +225,28 @@ import { WorkerEntrypoint } from "cloudflare:workers"; // Implement a binding which can be called by the dynamic Worker. export class Greeter extends WorkerEntrypoint { - greet() { - return `Hello, ${this.ctx.props.name}!`; - } + greet() { + return `Hello, ${this.ctx.props.name}!`; + } } export default { - async fetch(request, env, ctx) { - let worker = env.LOADER.get("alice", () => { - return { - env: { - // Provide a binding which has a method greet() which can be called - // to receive a greeting. The binding knows the Worker's name. - GREETER: ctx.exports.Greeter({props: {name: "Alice"}}) - }, - - // ... code ... 
- } - }); - - return worker.getEntrypoint().fetch(request); - } -} + async fetch(request, env, ctx) { + let worker = env.LOADER.get("alice", () => { + return { + env: { + // Provide a binding which has a method greet() which can be called + // to receive a greeting. The binding knows the Worker's name. + GREETER: ctx.exports.Greeter({ props: { name: "Alice" } }), + }, + + // ... code ... + }; + }); + + return worker.getEntrypoint().fetch(request); + }, +}; ``` #### tails Optional @@ -249,35 +257,35 @@ You may specify one or more [Tail Workers](/workers/observability/logs/tail-work import { WorkerEntrypoint } from "cloudflare:workers"; export default { - async fetch(request, env, ctx) { - let worker = env.LOADER.get("alice", () => { - return { - // Send logs, errors, etc. to `LogTailer`. We pass `name` in the - // `ctx.props` so that `LogTailer` knows what generated the logs. - // (You can pass anything you want in `props`.) - tails: [ ctx.exports.LogTailer({props: {name: "alice"}}) ], - - // ... code ... - } - }); - - return worker.getEntrypoint().fetch(request); - } -} + async fetch(request, env, ctx) { + let worker = env.LOADER.get("alice", () => { + return { + // Send logs, errors, etc. to `LogTailer`. We pass `name` in the + // `ctx.props` so that `LogTailer` knows what generated the logs. + // (You can pass anything you want in `props`.) + tails: [ctx.exports.LogTailer({ props: { name: "alice" } })], + + // ... code ... + }; + }); + + return worker.getEntrypoint().fetch(request); + }, +}; export class LogTailer extends WorkerEntrypoint { - async tail(events) { - let name = this.ctx.props.name; - - // Send the logs off to our log endpoint, specifying the worker name in - // the URL. - // - // Note that `events` will always be an array of size 1 in this scenario, - // describing the event delivered to the dynamically-loaded Worker. 
- await fetch(`https://example.com/submit-logs/${name}`, { - method: "POST", - body: JSON.stringify(events), - }); - } + async tail(events) { + let name = this.ctx.props.name; + + // Send the logs off to our log endpoint, specifying the worker name in + // the URL. + // + // Note that `events` will always be an array of size 1 in this scenario, + // describing the event delivered to the dynamically-loaded Worker. + await fetch(`https://example.com/submit-logs/${name}`, { + method: "POST", + body: JSON.stringify(events), + }); + } } ```
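
As a closing sketch, the typed module objects described under `modules` can be mixed in a single loader callback. All names and contents below are illustrative, and exact module-resolution behavior may differ:

```js
// A loader callback mixing the typed module objects documented above.
function exampleWorkerCode() {
	return {
		compatibilityDate: "2025-06-01",
		mainModule: "main.js",
		modules: {
			// ES module that imports the other module types by name.
			"main.js": {
				js:
					'import config from "config.json";\n' +
					'import banner from "banner.txt";\n' +
					"export default {\n" +
					"  fetch() { return new Response(banner + config.greeting); },\n" +
					"};\n",
			},
			"banner.txt": { text: "greeting: " },
			"config.json": { json: { greeting: "hello" } },
		},
		// Keep the sandbox cut off from the network.
		globalOutbound: null,
	};
}

// In a Worker with a `LOADER` binding, pass the callback to `get()`:
// let worker = env.LOADER.get("example", exampleWorkerCode);
```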