From 6bca67da6ad353eaf4862323e67f513c623cdac4 Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Fri, 17 Apr 2026 11:32:01 +0200 Subject: [PATCH] docs: add agent-skills guide and SEO frontmatter Add a new getting-started/agent-skills.md guide explaining how to use @tanstack/intent install to wire the bundled Agent Skills from @tanstack/ai and @tanstack/ai-code-mode into Claude Code, Cursor, GitHub Copilot, and other AI coding assistants, register it in docs/config.json, and add a disambiguation callout on code-mode-with-skills.md so readers searching for "skills" land on the correct page. Add description and keywords frontmatter to all 64 hand-authored docs for search-engine discoverability. Auto-generated TypeDoc reference pages under docs/reference/ are left untouched since they would be overwritten on the next regeneration. A follow-up PR on tanstack.com will wire these fields into the document head. --- docs/adapters/anthropic.md | 9 ++ docs/adapters/elevenlabs.md | 9 ++ docs/adapters/fal.md | 10 ++ docs/adapters/gemini.md | 10 ++ docs/adapters/grok.md | 9 ++ docs/adapters/groq.md | 9 ++ docs/adapters/ollama.md | 10 ++ docs/adapters/openai.md | 11 ++ docs/adapters/openrouter.md | 9 ++ docs/advanced/extend-adapter.md | 9 ++ docs/advanced/middleware.md | 10 ++ docs/advanced/multimodal-content.md | 11 ++ docs/advanced/observability.md | 9 ++ docs/advanced/per-model-type-safety.md | 9 ++ docs/advanced/runtime-adapter-switching.md | 8 ++ docs/advanced/tree-shaking.md | 8 ++ docs/api/ai-client.md | 9 ++ docs/api/ai-preact.md | 8 ++ docs/api/ai-react.md | 8 ++ docs/api/ai-solid.md | 9 ++ docs/api/ai-svelte.md | 9 ++ docs/api/ai-vue.md | 9 ++ docs/api/ai.md | 9 ++ docs/architecture/approval-flow-processing.md | 15 +++ docs/chat/agentic-cycle.md | 8 ++ docs/chat/connection-adapters.md | 9 ++ docs/chat/streaming.md | 9 ++ docs/chat/structured-outputs.md | 10 ++ docs/chat/thinking-content.md | 10 ++ docs/code-mode/client-integration.md | 9 ++ 
docs/code-mode/code-mode-isolates.md | 10 ++ docs/code-mode/code-mode-with-skills.md | 12 ++ docs/code-mode/code-mode.md | 9 ++ docs/community-adapters/cencori.md | 9 ++ docs/community-adapters/cloudflare.md | 10 ++ docs/community-adapters/decart.md | 7 ++ docs/community-adapters/guide.md | 11 +- docs/community-adapters/mynth.md | 15 +++ docs/community-adapters/soniox.md | 8 ++ docs/comparison/vercel-ai-sdk.md | 10 ++ docs/config.json | 4 + docs/getting-started/agent-skills.md | 109 ++++++++++++++++++ docs/getting-started/devtools.md | 9 ++ docs/getting-started/overview.md | 10 ++ docs/getting-started/quick-start-server.md | 10 ++ docs/getting-started/quick-start-svelte.md | 10 ++ docs/getting-started/quick-start-vue.md | 10 ++ docs/getting-started/quick-start.md | 10 ++ docs/media/generation-hooks.md | 10 ++ docs/media/generations.md | 10 ++ docs/media/image-generation.md | 10 ++ docs/media/realtime-chat.md | 11 ++ docs/media/text-to-speech.md | 9 ++ docs/media/transcription.md | 9 ++ docs/media/video-generation.md | 9 ++ docs/migration/migration.md | 9 ++ docs/protocol/chunk-definitions.md | 9 ++ docs/protocol/http-stream-protocol.md | 8 ++ docs/protocol/sse-protocol.md | 8 ++ docs/tools/client-tools.md | 9 ++ docs/tools/lazy-tool-discovery.md | 9 ++ docs/tools/server-tools.md | 8 ++ docs/tools/tool-approval.md | 9 ++ docs/tools/tool-architecture.md | 9 ++ docs/tools/tools.md | 10 ++ 65 files changed, 709 insertions(+), 1 deletion(-) create mode 100644 docs/getting-started/agent-skills.md diff --git a/docs/adapters/anthropic.md b/docs/adapters/anthropic.md index 2e8f36b2f..1c1a63b9d 100644 --- a/docs/adapters/anthropic.md +++ b/docs/adapters/anthropic.md @@ -2,6 +2,15 @@ title: Anthropic id: anthropic-adapter order: 2 +description: "Use Anthropic Claude models with TanStack AI — Claude Sonnet 4.5, Claude Opus, and more via the @tanstack/ai-anthropic adapter." 
+keywords: + - tanstack ai + - anthropic + - claude + - claude sonnet 4.5 + - claude opus + - adapter + - llm --- The Anthropic adapter provides access to Claude models, including Claude Sonnet 4.5, Claude Opus 4.5, and more. diff --git a/docs/adapters/elevenlabs.md b/docs/adapters/elevenlabs.md index 88590aaf3..83a108ef9 100644 --- a/docs/adapters/elevenlabs.md +++ b/docs/adapters/elevenlabs.md @@ -2,6 +2,15 @@ title: ElevenLabs id: elevenlabs-adapter order: 9 +description: "Build realtime voice-to-voice conversational AI with ElevenLabs agents in TanStack AI via the @tanstack/ai-elevenlabs adapter." +keywords: + - tanstack ai + - elevenlabs + - realtime voice ai + - conversational ai + - voice chat + - voice agents + - adapter --- The ElevenLabs adapter provides realtime conversational voice AI for TanStack AI. Unlike text-focused adapters, the ElevenLabs adapter is **voice-focused** -- it integrates with TanStack AI's realtime system to enable voice-to-voice conversations. It does not support `chat()`, `embedding()`, or `summarize()`. diff --git a/docs/adapters/fal.md b/docs/adapters/fal.md index 5be52698e..22e4c4e2b 100644 --- a/docs/adapters/fal.md +++ b/docs/adapters/fal.md @@ -1,6 +1,16 @@ --- title: fal.ai id: fal-adapter +description: "Generate images and videos with 600+ models on fal.ai using TanStack AI — Nano Banana Pro, FLUX, and more via the @tanstack/ai-fal adapter." +keywords: + - tanstack ai + - fal.ai + - fal + - image generation + - video generation + - flux + - nano banana + - adapter --- The fal.ai adapter provides access to 600+ models on the fal.ai platform for image generation and video generation. Unlike text-focused adapters, the fal adapter is **media-focused** — it supports `generateImage()` and `generateVideo()` but does not support `chat()` or tools. Audio and speech support are coming soon. 
diff --git a/docs/adapters/gemini.md b/docs/adapters/gemini.md index 84b5e9515..5c3925e61 100644 --- a/docs/adapters/gemini.md +++ b/docs/adapters/gemini.md @@ -2,6 +2,16 @@ title: Google Gemini id: gemini-adapter order: 3 +description: "Use Google Gemini with TanStack AI — text, image generation via Imagen and Gemini native (NanoBanana), and experimental TTS via @tanstack/ai-gemini." +keywords: + - tanstack ai + - gemini + - google gemini + - imagen + - nano banana + - image generation + - adapter + - google ai --- The Google Gemini adapter provides access to Google's Gemini models, including text generation, image generation with both Imagen and Gemini native image models (NanoBanana), and experimental text-to-speech. diff --git a/docs/adapters/grok.md b/docs/adapters/grok.md index b73662b9c..e38480896 100644 --- a/docs/adapters/grok.md +++ b/docs/adapters/grok.md @@ -2,6 +2,15 @@ title: Grok (xAI) id: grok-adapter order: 5 +description: "Use xAI Grok models with TanStack AI — Grok 4.1, Grok 4, Grok 3, and Grok 2 Image generation via @tanstack/ai-grok." +keywords: + - tanstack ai + - grok + - xai + - grok 4 + - grok 4.1 + - image generation + - adapter --- The Grok adapter provides access to xAI's Grok models, including Grok 4.1, Grok 4, Grok 3, and image generation with Grok 2 Image. diff --git a/docs/adapters/groq.md b/docs/adapters/groq.md index b6fcf4991..40788d5f1 100644 --- a/docs/adapters/groq.md +++ b/docs/adapters/groq.md @@ -2,6 +2,15 @@ title: Groq id: groq-adapter order: 6 +description: "Use Groq's fast inference API with TanStack AI for low-latency LLM responses — Llama and other open-weight models via @tanstack/ai-groq." +keywords: + - tanstack ai + - groq + - fast inference + - llama + - low latency + - adapter + - llm --- The Groq adapter provides access to Groq's fast inference API, featuring the world's fastest LLM inference. 
diff --git a/docs/adapters/ollama.md b/docs/adapters/ollama.md index 03bad0052..0a83335a4 100644 --- a/docs/adapters/ollama.md +++ b/docs/adapters/ollama.md @@ -2,6 +2,16 @@ title: Ollama id: ollama-adapter order: 4 +description: "Run local LLMs with Ollama in TanStack AI for private, no-cost AI on your own hardware via the @tanstack/ai-ollama adapter." +keywords: + - tanstack ai + - ollama + - local llm + - self-hosted + - privacy + - llama + - offline ai + - adapter --- The Ollama adapter provides access to local models running via Ollama, allowing you to run AI models on your own infrastructure with full privacy and no API costs. diff --git a/docs/adapters/openai.md b/docs/adapters/openai.md index eda51463d..7fa6fd126 100644 --- a/docs/adapters/openai.md +++ b/docs/adapters/openai.md @@ -2,6 +2,17 @@ title: OpenAI id: openai-adapter order: 1 +description: "Use OpenAI models with TanStack AI — GPT-4o, GPT-5, DALL-E image generation, TTS, and Whisper transcription via @tanstack/ai-openai." +keywords: + - tanstack ai + - openai + - gpt-4o + - gpt-5 + - dall-e + - whisper + - openai tts + - adapter + - chatgpt --- The OpenAI adapter provides access to OpenAI's models, including GPT-4o, GPT-5, image generation (DALL-E), text-to-speech (TTS), and audio transcription (Whisper). diff --git a/docs/adapters/openrouter.md b/docs/adapters/openrouter.md index b51c18ca4..07459cdb0 100644 --- a/docs/adapters/openrouter.md +++ b/docs/adapters/openrouter.md @@ -1,6 +1,15 @@ --- title: OpenRouter Adapter id: openrouter-adapter +description: "Access 300+ LLMs from OpenAI, Anthropic, Google, Meta, Mistral, and more through a single API with OpenRouter in TanStack AI." +keywords: + - tanstack ai + - openrouter + - multi-provider + - unified api + - llm gateway + - 300 models + - adapter --- OpenRouter is TanStack AI's first official AI partner and the recommended starting point for most projects. 
It provides access to 300+ models from OpenAI, Anthropic, Google, Meta, Mistral, and many more — all through a single API key and unified interface. diff --git a/docs/advanced/extend-adapter.md b/docs/advanced/extend-adapter.md index 5197e7476..a1317323d 100644 --- a/docs/advanced/extend-adapter.md +++ b/docs/advanced/extend-adapter.md @@ -2,6 +2,15 @@ title: Extend Adapter id: extend-adapter order: 7 +description: "Extend TanStack AI adapter factories with custom model IDs and fine-tuned models while keeping full type safety for input modalities and provider options." +keywords: + - tanstack ai + - extendAdapter + - custom models + - fine-tuned models + - createModel + - type safety + - adapter factory --- # Extending Adapters with Custom Models diff --git a/docs/advanced/middleware.md b/docs/advanced/middleware.md index 63a11898a..1e568f3a7 100644 --- a/docs/advanced/middleware.md +++ b/docs/advanced/middleware.md @@ -2,6 +2,16 @@ title: Middleware id: middleware order: 1 +description: "Hook into every stage of TanStack AI's chat() lifecycle with middleware — logging, analytics, stream transforms, tool interception, and side effects." +keywords: + - tanstack ai + - middleware + - chat middleware + - lifecycle hooks + - observability + - logging + - tool interception + - stream transform --- Middleware lets you hook into every stage of the `chat()` lifecycle — from configuration to streaming, tool execution, usage tracking, and completion. You can observe, transform, or short-circuit behavior at each stage without modifying your adapter or tool implementations. 
diff --git a/docs/advanced/multimodal-content.md b/docs/advanced/multimodal-content.md index 6e9e62652..5aa10b81a 100644 --- a/docs/advanced/multimodal-content.md +++ b/docs/advanced/multimodal-content.md @@ -2,6 +2,17 @@ title: Multimodal Content id: multimodal-content order: 3 +description: "Send images, audio, video, and documents alongside text in TanStack AI messages with typed ContentPart primitives for multimodal models." +keywords: + - tanstack ai + - multimodal + - vision + - images + - audio + - video + - documents + - ContentPart + - ImagePart --- TanStack AI supports multimodal content in messages, allowing you to send images, audio, video, and documents alongside text to AI models that support these modalities. diff --git a/docs/advanced/observability.md b/docs/advanced/observability.md index 95ebd79b0..b838dd973 100644 --- a/docs/advanced/observability.md +++ b/docs/advanced/observability.md @@ -2,6 +2,15 @@ title: Observability id: observability order: 2 +description: "Subscribe to TanStack AI events for observability and debugging — tool calls, streaming chunks, usage, and errors via the type-safe event client." +keywords: + - tanstack ai + - observability + - event client + - telemetry + - debugging + - tracing + - devtools --- # Event client diff --git a/docs/advanced/per-model-type-safety.md b/docs/advanced/per-model-type-safety.md index 5e1857183..b667ea0a9 100644 --- a/docs/advanced/per-model-type-safety.md +++ b/docs/advanced/per-model-type-safety.md @@ -2,6 +2,15 @@ title: Per-Model Type Safety id: per-model-type-safety order: 4 +description: "TanStack AI narrows modelOptions and content types to the specific model you select, enforcing capabilities at compile time." +keywords: + - tanstack ai + - type safety + - per-model types + - modelOptions + - typescript + - autocomplete + - compile-time --- The AI SDK provides **model-specific type safety** for `modelOptions`. 
Each model's capabilities determine which model options are allowed, and TypeScript will enforce this at compile time. diff --git a/docs/advanced/runtime-adapter-switching.md b/docs/advanced/runtime-adapter-switching.md index ece168460..a23d33f65 100644 --- a/docs/advanced/runtime-adapter-switching.md +++ b/docs/advanced/runtime-adapter-switching.md @@ -2,6 +2,14 @@ title: Runtime Adapter Switching id: runtime-adapter-switching order: 5 +description: "Let users switch between LLM providers at runtime in TanStack AI while keeping full TypeScript type safety for each adapter's model options." +keywords: + - tanstack ai + - runtime switching + - multi-provider + - adapter factory + - type safety + - dynamic adapter --- # Runtime Adapter Switching with Type Safety diff --git a/docs/advanced/tree-shaking.md b/docs/advanced/tree-shaking.md index d8aca8d81..c0ec0a89e 100644 --- a/docs/advanced/tree-shaking.md +++ b/docs/advanced/tree-shaking.md @@ -2,6 +2,14 @@ title: Tree-Shaking id: tree-shaking order: 6 +description: "TanStack AI's tree-shakeable architecture — import only the activities and adapters you use for minimal bundle size across chat, image, and speech." +keywords: + - tanstack ai + - tree-shaking + - bundle size + - modular imports + - performance + - tree-shakeable --- # Tree-Shaking & Bundle Optimization diff --git a/docs/api/ai-client.md b/docs/api/ai-client.md index f1a259bab..379e58589 100644 --- a/docs/api/ai-client.md +++ b/docs/api/ai-client.md @@ -2,6 +2,15 @@ title: "@tanstack/ai-client" slug: /api/ai-client order: 2 +description: "API reference for @tanstack/ai-client — the framework-agnostic headless client for managing chat state and streaming transports." +keywords: + - tanstack ai + - "@tanstack/ai-client" + - headless client + - ChatClient + - chat state + - connection adapters + - api reference --- Framework-agnostic headless client for managing chat state and streaming. 
diff --git a/docs/api/ai-preact.md b/docs/api/ai-preact.md index 2dd31863c..91ae5b7f5 100644 --- a/docs/api/ai-preact.md +++ b/docs/api/ai-preact.md @@ -2,6 +2,14 @@ title: "@tanstack/ai-preact" slug: /api/ai-preact order: 5 +description: "API reference for @tanstack/ai-preact — Preact hooks including useChat for streaming chat with full type safety in Preact apps." +keywords: + - tanstack ai + - "@tanstack/ai-preact" + - preact + - useChat + - preact hooks + - api reference --- Preact hooks for TanStack AI, providing convenient Preact bindings for the headless client. diff --git a/docs/api/ai-react.md b/docs/api/ai-react.md index e4b04c55f..c736e207a 100644 --- a/docs/api/ai-react.md +++ b/docs/api/ai-react.md @@ -2,6 +2,14 @@ title: "@tanstack/ai-react" slug: /api/ai-react order: 3 +description: "API reference for @tanstack/ai-react — React hooks including useChat for streaming chat with full type safety in React apps." +keywords: + - tanstack ai + - "@tanstack/ai-react" + - react + - useChat + - react hooks + - api reference --- React hooks for TanStack AI, providing convenient React bindings for the headless client. diff --git a/docs/api/ai-solid.md b/docs/api/ai-solid.md index 12c5ece16..daaade764 100644 --- a/docs/api/ai-solid.md +++ b/docs/api/ai-solid.md @@ -2,6 +2,15 @@ title: "@tanstack/ai-solid" slug: /api/ai-solid order: 4 +description: "API reference for @tanstack/ai-solid — SolidJS primitives including useChat for streaming chat with full type safety." +keywords: + - tanstack ai + - "@tanstack/ai-solid" + - solidjs + - solid + - useChat + - solid primitives + - api reference --- SolidJS primitives for TanStack AI, providing convenient SolidJS bindings for the headless client. 
diff --git a/docs/api/ai-svelte.md b/docs/api/ai-svelte.md index f67a3c1e3..09e434032 100644 --- a/docs/api/ai-svelte.md +++ b/docs/api/ai-svelte.md @@ -2,6 +2,15 @@ title: "@tanstack/ai-svelte" id: ai-svelte order: 6 +description: "API reference for @tanstack/ai-svelte — Svelte 5 reactive factory functions for streaming chat built on runes." +keywords: + - tanstack ai + - "@tanstack/ai-svelte" + - svelte + - svelte 5 + - createChat + - runes + - api reference --- Svelte 5 bindings for TanStack AI, providing reactive factory functions for the headless client using Svelte runes. diff --git a/docs/api/ai-vue.md b/docs/api/ai-vue.md index 68e566aea..be3d6f681 100644 --- a/docs/api/ai-vue.md +++ b/docs/api/ai-vue.md @@ -2,6 +2,15 @@ title: "@tanstack/ai-vue" id: ai-vue order: 5 +description: "API reference for @tanstack/ai-vue — Vue 3 composables including useChat for streaming chat with full type safety." +keywords: + - tanstack ai + - "@tanstack/ai-vue" + - vue + - vue 3 + - useChat + - composables + - api reference --- Vue composables for TanStack AI, providing convenient Vue 3 bindings for the headless client. diff --git a/docs/api/ai.md b/docs/api/ai.md index f98010cd0..da0970d14 100644 --- a/docs/api/ai.md +++ b/docs/api/ai.md @@ -2,6 +2,15 @@ title: "@tanstack/ai" id: tanstack-ai-api order: 1 +description: "API reference for @tanstack/ai — the core TanStack AI library providing chat(), generateImage(), toolDefinition(), and streaming utilities." +keywords: + - tanstack ai + - "@tanstack/ai" + - api reference + - chat + - toolDefinition + - generateImage + - core library --- The core AI library for TanStack AI. 
diff --git a/docs/architecture/approval-flow-processing.md b/docs/architecture/approval-flow-processing.md index 5f033a406..0441ea416 100644 --- a/docs/architecture/approval-flow-processing.md +++ b/docs/architecture/approval-flow-processing.md @@ -1,3 +1,18 @@ +--- +title: Approval Flow Processing Architecture +id: approval-flow-processing +description: "Internal architecture of TanStack AI's tool approval system — state machine, streaming protocol, concurrency control, and chained approval mechanics." +keywords: + - tanstack ai + - approval flow + - tool approval + - architecture + - state machine + - streaming protocol + - internals + - concurrency +--- + # Approval Flow Processing Architecture > Internal architecture reference for the tool approval system in TanStack AI. diff --git a/docs/chat/agentic-cycle.md b/docs/chat/agentic-cycle.md index d0627ba3f..bdabba19f 100644 --- a/docs/chat/agentic-cycle.md +++ b/docs/chat/agentic-cycle.md @@ -2,6 +2,14 @@ title: Agentic Cycle id: agentic-cycle order: 1 +description: "The agentic cycle in TanStack AI — how the LLM loops through tool calls, results, and reasoning until it produces a final answer." +keywords: + - tanstack ai + - agentic cycle + - agent loop + - tool calling + - multi-step reasoning + - ai agents --- The agentic cycle is the pattern where the LLM repeatedly calls tools, receives results, and continues reasoning until it can provide a final answer. This enables complex multi-step operations. diff --git a/docs/chat/connection-adapters.md b/docs/chat/connection-adapters.md index 70329125c..0c4460b2b 100644 --- a/docs/chat/connection-adapters.md +++ b/docs/chat/connection-adapters.md @@ -2,6 +2,15 @@ title: Connection Adapters id: connection-adapters order: 3 +description: "Connection adapters in TanStack AI bridge client and server for streaming chat responses — SSE, HTTP stream, direct async iterables, and custom transports." 
+keywords: + - tanstack ai + - connection adapters + - sse + - server-sent events + - http stream + - streaming transport + - fetchServerSentEvents --- diff --git a/docs/chat/streaming.md b/docs/chat/streaming.md index 2c799a772..a11bd2ca2 100644 --- a/docs/chat/streaming.md +++ b/docs/chat/streaming.md @@ -2,6 +2,15 @@ title: Streaming id: streaming-responses order: 2 +description: "Stream AI responses in real time with TanStack AI — async iterable chunks, chunk strategies, and partial JSON for responsive chat UIs." +keywords: + - tanstack ai + - streaming + - streaming responses + - real-time ai + - async iterable + - chunks + - partial json --- TanStack AI supports streaming responses for real-time chat experiences. Streaming allows you to display responses as they're generated, rather than waiting for the complete response. diff --git a/docs/chat/structured-outputs.md b/docs/chat/structured-outputs.md index fa25babc7..6bcd9c7a7 100644 --- a/docs/chat/structured-outputs.md +++ b/docs/chat/structured-outputs.md @@ -2,6 +2,16 @@ title: Structured Outputs id: structured-outputs order: 4 +description: "Constrain TanStack AI responses to a JSON Schema for typed, predictable structured output using Zod, Valibot, or any Standard Schema library." +keywords: + - tanstack ai + - structured outputs + - json schema + - zod + - valibot + - standard schema + - type-safe llm + - outputSchema --- Structured outputs allow you to constrain AI model responses to match a specific JSON schema, ensuring consistent and type-safe data extraction. TanStack AI uses the [Standard JSON Schema](https://standardschema.dev/) specification, allowing you to use any compatible schema library. 
diff --git a/docs/chat/thinking-content.md b/docs/chat/thinking-content.md index 72c1cd515..831f2ce4e 100644 --- a/docs/chat/thinking-content.md +++ b/docs/chat/thinking-content.md @@ -2,6 +2,16 @@ title: Thinking & Reasoning id: thinking-content order: 5 +description: "Render reasoning tokens from thinking models (Claude extended thinking, OpenAI o-series) as streamed ThinkingPart in TanStack AI chat UIs." +keywords: + - tanstack ai + - thinking + - reasoning + - extended thinking + - claude thinking + - o-series + - chain of thought + - ThinkingPart --- Some models expose their internal reasoning as "thinking" content -- Claude with extended thinking, OpenAI o-series models with reasoning, and others. TanStack AI captures this as `ThinkingPart` in messages, streamed to your UI in real-time alongside text and tool calls. diff --git a/docs/code-mode/client-integration.md b/docs/code-mode/client-integration.md index e91ad77aa..1a4a69c15 100644 --- a/docs/code-mode/client-integration.md +++ b/docs/code-mode/client-integration.md @@ -2,6 +2,15 @@ title: Showing Code Mode in the UI id: code-mode-client-integration order: 2 +description: "Stream Code Mode execution events to your React app — console output, external calls, and results as they happen, via onCustomEvent." +keywords: + - tanstack ai + - code mode + - react ui + - custom events + - onCustomEvent + - streaming ui + - execution progress --- You have [Code Mode](./code-mode) working on your server — the LLM writes and executes TypeScript, and you get results back. But your users see nothing while the sandbox runs. By the end of this guide, your React app will show real-time execution progress: console output, external function calls, and final results as they stream in. 
diff --git a/docs/code-mode/code-mode-isolates.md b/docs/code-mode/code-mode-isolates.md index ed1d44eb8..81c447fcd 100644 --- a/docs/code-mode/code-mode-isolates.md +++ b/docs/code-mode/code-mode-isolates.md @@ -2,6 +2,16 @@ title: Code Mode Isolate Drivers id: code-mode-isolates order: 4 +description: "Compare Code Mode sandbox drivers — Node isolated-vm, QuickJS WASM, and Cloudflare Workers — and choose the right runtime for your deployment." +keywords: + - tanstack ai + - code mode + - isolate driver + - isolated-vm + - quickjs + - cloudflare workers + - sandbox + - secure execution --- Isolate drivers provide the secure sandbox runtimes that [Code Mode](./code-mode.md) uses to execute generated TypeScript. All drivers implement the same `IsolateDriver` interface, so you can swap them without changing any other code. diff --git a/docs/code-mode/code-mode-with-skills.md b/docs/code-mode/code-mode-with-skills.md index 59c1e1f81..a77a0b736 100644 --- a/docs/code-mode/code-mode-with-skills.md +++ b/docs/code-mode/code-mode-with-skills.md @@ -2,10 +2,22 @@ title: Code Mode with Skills id: code-mode-with-skills order: 3 +description: "Teach Code Mode to save and reuse working code as named skills backed by persistent storage — faster follow-up requests and composable agent memory." +keywords: + - tanstack ai + - code mode + - skills + - skill library + - register_skill + - reusable snippets + - agent memory + - skill storage --- Skills extend [Code Mode](./code-mode.md) with a persistent library of reusable TypeScript snippets. When the LLM writes a useful piece of code — say, a function that fetches and ranks NPM packages — it can save that code as a _skill_. On future requests, relevant skills are loaded from storage and made available as first-class tools the LLM can call without re-writing the logic. +> **Different from agent-authoring skills.** The skills on this page are _runtime_ snippets the chat LLM saves and reuses. 
If you're looking to teach your coding assistant (Claude Code, Cursor, etc.) how TanStack AI itself works, see [Agent Skills (TanStack Intent)](../getting-started/agent-skills). + ## Overview The skills system has two integration paths: diff --git a/docs/code-mode/code-mode.md b/docs/code-mode/code-mode.md index 52a922866..aa2f9ddbe 100644 --- a/docs/code-mode/code-mode.md +++ b/docs/code-mode/code-mode.md @@ -2,6 +2,15 @@ title: Code Mode id: code-mode order: 1 +description: "Let LLMs write and execute TypeScript programs that orchestrate tools in a secure sandbox with TanStack AI Code Mode — fewer loops, richer logic." +keywords: + - tanstack ai + - code mode + - sandbox + - typescript execution + - tool orchestration + - execute_typescript + - ai agents --- Code Mode lets an LLM write and execute TypeScript programs inside a secure sandbox. Instead of making one tool call at a time, the model writes a short script that orchestrates multiple tools with loops, conditionals, `Promise.all`, and data transformations — then returns a single result. diff --git a/docs/community-adapters/cencori.md b/docs/community-adapters/cencori.md index f75b12f46..f1de31217 100644 --- a/docs/community-adapters/cencori.md +++ b/docs/community-adapters/cencori.md @@ -2,6 +2,15 @@ title: Cencori id: cencori-adapter order: 3 +description: "Access 14+ AI providers (OpenAI, Anthropic, Google, xAI, and more) through Cencori's unified interface with built-in security, observability, and cost tracking in TanStack AI." +keywords: + - tanstack ai + - cencori + - multi-provider + - observability + - cost tracking + - security + - community adapter --- The Cencori adapter provides access to 14+ AI providers (OpenAI, Anthropic, Google, xAI, and more) through a unified interface with built-in security, observability, and cost tracking. 
diff --git a/docs/community-adapters/cloudflare.md b/docs/community-adapters/cloudflare.md index 3c9001543..a34e54860 100644 --- a/docs/community-adapters/cloudflare.md +++ b/docs/community-adapters/cloudflare.md @@ -2,6 +2,16 @@ title: Cloudflare id: cloudflare-adapter order: 3 +description: "Use Cloudflare Workers AI and AI Gateway with TanStack AI for edge inference, caching, rate limiting, and unified billing across providers." +keywords: + - tanstack ai + - cloudflare + - workers ai + - ai gateway + - edge inference + - caching + - rate limiting + - community adapter --- The Cloudflare adapter provides access to [Cloudflare Workers AI](https://developers.cloudflare.com/workers-ai/) models and [AI Gateway](https://developers.cloudflare.com/ai-gateway/) for routing requests to OpenAI, Anthropic, Gemini, Grok, and OpenRouter with caching, rate limiting, and unified billing. diff --git a/docs/community-adapters/decart.md b/docs/community-adapters/decart.md index be61d401b..d7bd46321 100644 --- a/docs/community-adapters/decart.md +++ b/docs/community-adapters/decart.md @@ -2,6 +2,13 @@ title: Decart id: decart-adapter order: 2 +description: "Generate images and videos with Decart's AI models in TanStack AI via the Decart community adapter." +keywords: + - tanstack ai + - decart + - image generation + - video generation + - community adapter --- The Decart adapter provides access to Decart's image and video generation models. diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md index 221685649..3c85c3fae 100644 --- a/docs/community-adapters/guide.md +++ b/docs/community-adapters/guide.md @@ -1,7 +1,16 @@ ---- +--- title: "Community Adapters Guide" slug: /community-adapters/guide order: 1 +description: "Build and publish a community adapter for TanStack AI — package conventions, implementing the adapter interface, and publishing to npm." 
+keywords: + - tanstack ai + - community adapters + - build adapter + - custom adapter + - provider integration + - adapter authoring + - contribute --- # Community Adapters Guide diff --git a/docs/community-adapters/mynth.md b/docs/community-adapters/mynth.md index 6f2183a1d..d9a68b045 100644 --- a/docs/community-adapters/mynth.md +++ b/docs/community-adapters/mynth.md @@ -1,3 +1,18 @@ +--- +title: Mynth +id: mynth-adapter +description: "Generate images with Mynth models — Flux, Recraft, Gemini, Qwen, Seedream, Wan, and Grok Imagine — in TanStack AI via the Mynth community adapter." +keywords: + - tanstack ai + - mynth + - image generation + - flux + - recraft + - qwen + - seedream + - community adapter +--- + # Mynth > **Alpha:** Mynth is currently in public alpha. We are publishing TanStack AI adapters early to gather feedback on the API, supported models, and integration experience while the platform is still evolving. diff --git a/docs/community-adapters/soniox.md b/docs/community-adapters/soniox.md index e941c3e0b..77abce377 100644 --- a/docs/community-adapters/soniox.md +++ b/docs/community-adapters/soniox.md @@ -2,6 +2,14 @@ title: Soniox id: soniox-adapter order: 3 +description: "Transcribe audio with Soniox speech-to-text models in TanStack AI via the Soniox community adapter." +keywords: + - tanstack ai + - soniox + - transcription + - speech-to-text + - asr + - community adapter --- The Soniox adapter provides access to Soniox transcription models. diff --git a/docs/comparison/vercel-ai-sdk.md b/docs/comparison/vercel-ai-sdk.md index 02ce2b888..415a4731c 100644 --- a/docs/comparison/vercel-ai-sdk.md +++ b/docs/comparison/vercel-ai-sdk.md @@ -2,6 +2,16 @@ title: TanStack AI vs Vercel AI SDK id: vercel-ai-sdk order: 1 +description: "How TanStack AI compares to the Vercel AI SDK — feature matrix, philosophy, type safety, tool calling, streaming, and framework support." 
+keywords: + - tanstack ai + - vercel ai sdk + - comparison + - ai sdk + - alternatives + - typescript ai sdk + - tool calling + - llm --- Both TanStack AI and Vercel AI SDK are open-source TypeScript toolkits for building AI-powered applications. They share common ground - streaming chat, tool calling, multi-provider support, and deploy-anywhere flexibility - but they approach the problem from fundamentally different directions. diff --git a/docs/config.json b/docs/config.json index a4377a9af..698d18545 100644 --- a/docs/config.json +++ b/docs/config.json @@ -32,6 +32,10 @@ { "label": "Quick Start: Server Only", "to": "getting-started/quick-start-server" + }, + { + "label": "Agent Skills (TanStack Intent)", + "to": "getting-started/agent-skills" } ] }, diff --git a/docs/getting-started/agent-skills.md b/docs/getting-started/agent-skills.md new file mode 100644 index 000000000..a3441133d --- /dev/null +++ b/docs/getting-started/agent-skills.md @@ -0,0 +1,109 @@ +--- +title: Agent Skills (TanStack Intent) +id: agent-skills +order: 6 +description: "Use TanStack Intent to wire TanStack AI's bundled Agent Skills into Claude Code, Cursor, GitHub Copilot, and other AI coding assistants." +keywords: + - tanstack ai + - tanstack intent + - agent skills + - claude code + - cursor + - github copilot + - ai coding agents + - SKILL.md + - AGENTS.md +--- + +You're building with TanStack AI and using an AI coding agent — Claude Code, Cursor, GitHub Copilot, or similar. The agent keeps suggesting Vercel-AI-SDK patterns like `streamText()` or `createOpenAI()`, or it wires streams manually instead of using `toServerSentEventsResponse()`. By the end of this guide, your agent will load TanStack AI's bundled skills automatically whenever you work on AI code — and those skills will stay in sync with whichever `@tanstack/ai` version your project installs. 
+ +> **Looking for runtime skills inside Code Mode?** Those are a different feature — see [Code Mode with Skills](../code-mode/code-mode-with-skills). This page is about _agent-authoring_ skills: markdown files that teach your coding assistant how TanStack AI works. + +## What are Agent Skills? + +Agent Skills are markdown documents (`SKILL.md`) that ship inside npm packages and tell AI coding agents how to use a library correctly — which functions to use, which patterns to avoid, and when to reach for which module. The format is an open standard supported by Claude Code, Cursor, GitHub Copilot, Codex, and others. + +TanStack AI publishes skills inside its packages so the guidance travels with `npm update` instead of being pinned in a model's training data or copy-pasted into `CLAUDE.md` manually. + +## Skills Shipped by TanStack AI + +| Package | Skill | What it teaches | +|---------|-------|-----------------| +| `@tanstack/ai` | `ai-core` | Chat experience, tool calling, adapters, middleware, structured outputs, media generation, AG-UI protocol, custom backends | +| `@tanstack/ai-code-mode` | `ai-code-mode` | Setting up Code Mode with a sandbox driver and registering server tools | + +Each skill lives under `node_modules/<package>/skills/<skill>/SKILL.md` once the package is installed. + +## Step 1: Install TanStack AI + +If you haven't already, install `@tanstack/ai` plus any adapter packages you need. See the [Quick Start](./quick-start) for a full walkthrough. + +```bash +pnpm add @tanstack/ai @tanstack/ai-openai +``` + +## Step 2: Run `intent install` + +From the root of your project, run: + +```bash +npx @tanstack/intent@latest install +``` + +The CLI walks your agent through the setup. It scans `node_modules` for every package that ships skills (any package with the `tanstack-intent` keyword), asks your agent to propose task-to-skill mappings that match your codebase, and writes them into your agent's config file. + +By default the mappings land in `AGENTS.md`.
The CLI can also target: + +- `CLAUDE.md` — Claude Code +- `.cursorrules` — Cursor +- any other agent config file you point it at + +## Step 3: Review the Generated Mappings + +The install command appends (or creates) an `intent-skills` block that looks like this: + +```yaml + +# Skill mappings — when working in these areas, load the linked skill file into context. +skills: + - task: "Building chat, tool calling, adapters, or streaming with TanStack AI" + load: "node_modules/@tanstack/ai/skills/ai-core/SKILL.md" + - task: "Setting up Code Mode with TanStack AI" + load: "node_modules/@tanstack/ai-code-mode/skills/ai-code-mode/SKILL.md" + +``` + +Check that the `task:` descriptions match areas you actually work in. Tighten or reword them if needed — they're how your agent decides when to pull the skill into context. + +## Step 4: Confirm It's Wired Up + +Open a fresh session in your coding agent and ask it to build something with TanStack AI — for example: _"Add a streaming chat endpoint using `@tanstack/ai` and the OpenAI adapter."_ + +You should see: + +- The agent uses `chat()`, not `streamText()`. +- The adapter is imported as `openaiText()` from `@tanstack/ai-openai`, not `createOpenAI()`. +- The response is wrapped with `toServerSentEventsResponse()` instead of manual SSE wiring. +- Middleware is used for lifecycle events (no `onFinish` callback on `chat()`). + +If the agent still falls back to other-SDK patterns, re-open its config file and confirm the `intent-skills` block is present and the `task:` descriptions clearly cover the area you're asking about. + +## Keeping Skills Current + +Skills are versioned with the package. When you bump `@tanstack/ai`, the `SKILL.md` files under `node_modules` update with it — no CLI re-run needed. Re-run `npx @tanstack/intent@latest install` only when you _add_ a new intent-enabled package (for example, adding `@tanstack/ai-code-mode` later) or want to refresh the task mappings. 
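+ +Because the mappings point at literal `node_modules` paths, a quick sanity check after a dependency change can confirm the referenced skill files still exist. A minimal sketch — the `skillPath` helper is hypothetical and not part of TanStack Intent; it just encodes the `node_modules/<package>/skills/<skill>/SKILL.md` convention:

```typescript
import { existsSync } from "node:fs";

// Skill files follow the convention:
//   node_modules/<package>/skills/<skill>/SKILL.md
function skillPath(pkg: string, skill: string): string {
  return `node_modules/${pkg}/skills/${skill}/SKILL.md`;
}

// The two skills TanStack AI currently ships.
const expected = [
  skillPath("@tanstack/ai", "ai-core"),
  skillPath("@tanstack/ai-code-mode", "ai-code-mode"),
];

for (const path of expected) {
  // Warn if a mapping in AGENTS.md points at a file that no longer exists.
  if (!existsSync(path)) {
    console.warn(`Skill file missing: ${path}`);
  }
}
```

Running a check like this in CI keeps stale mappings from silently degrading your agent's output after packages are added or removed.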
+ +## Using Skills Without the CLI + +If you'd rather wire skills in yourself, you can reference them directly from `node_modules` in any agent config file. The minimum your agent needs is a pointer to the file: + +```markdown +When working on TanStack AI code, read and follow: +node_modules/@tanstack/ai/skills/ai-core/SKILL.md +``` + +The CLI is recommended because it discovers packages automatically and stays consistent with the agent-skills standard, but the underlying file paths are stable. + +## Learn More + +- [TanStack Intent documentation](https://tanstack.com/intent/latest/docs/overview) — the CLI's full reference, including `scaffold`, `validate`, and CI setup for library maintainers. +- [Agent Skills registry](https://tanstack.com/intent/registry) — browse other intent-enabled packages. diff --git a/docs/getting-started/devtools.md b/docs/getting-started/devtools.md index 83d7b10c6..5178dbef0 100644 --- a/docs/getting-started/devtools.md +++ b/docs/getting-started/devtools.md @@ -2,6 +2,15 @@ title: Devtools id: devtools order: 3 +description: "Inspect and debug TanStack AI apps with the TanStack Devtools panel — live chat messages, tool call inputs and outputs, state, and errors." +keywords: + - tanstack ai + - devtools + - debugging + - tool inspection + - chat inspector + - react devtools + - observability --- TanStack Devtools is a unified devtools panel for inspecting and debugging TanStack libraries, including TanStack AI. It provides real-time insights into AI interactions, tool calls, and state changes, making it easier to develop and troubleshoot AI-powered applications. 
diff --git a/docs/getting-started/overview.md b/docs/getting-started/overview.md index 84447eba8..0523af39b 100644 --- a/docs/getting-started/overview.md +++ b/docs/getting-started/overview.md @@ -2,6 +2,16 @@ title: Overview id: overview order: 1 +description: "TanStack AI is a type-safe, provider-agnostic TypeScript SDK for building streaming chat, tool calling, and AI features that work across any framework." +keywords: + - tanstack ai + - ai sdk + - typescript ai + - streaming chat + - tool calling + - isomorphic tools + - framework agnostic + - llm sdk --- TanStack AI is a lightweight, type-safe SDK for building production-ready AI experiences. Its framework-agnostic core provides type-safe tool/function calling, streaming responses, and first-class React and Solid integrations, with adapters for multiple LLM providers — enabling predictable, composable, and testable AI features across any stack. diff --git a/docs/getting-started/quick-start-server.md b/docs/getting-started/quick-start-server.md index 96bd272f6..6ef617671 100644 --- a/docs/getting-started/quick-start-server.md +++ b/docs/getting-started/quick-start-server.md @@ -2,6 +2,16 @@ title: "Quick Start: Server Only" id: quick-start-server order: 5 +description: "Add a streaming AI chat endpoint to a Node.js backend with TanStack AI — no UI framework required." +keywords: + - tanstack ai + - node.js + - server + - backend + - quick start + - streaming chat + - openai + - sse --- You have a Node.js backend and want to add AI capabilities. By the end of this guide, you'll have a working chat endpoint powered by TanStack AI and OpenAI -- no UI framework required. 
diff --git a/docs/getting-started/quick-start-svelte.md b/docs/getting-started/quick-start-svelte.md index 7e2e06df7..0d3b6b14d 100644 --- a/docs/getting-started/quick-start-svelte.md +++ b/docs/getting-started/quick-start-svelte.md @@ -2,6 +2,16 @@ title: "Quick Start: Svelte" id: quick-start-svelte order: 4 +description: "Add a streaming TanStack AI chat component to a SvelteKit app using Svelte 5 runes and the OpenAI adapter." +keywords: + - tanstack ai + - svelte + - sveltekit + - svelte 5 + - quick start + - streaming chat + - openai + - runes --- You have a SvelteKit app and want to add AI chat. By the end of this guide, you'll have a streaming chat component powered by TanStack AI and OpenAI. diff --git a/docs/getting-started/quick-start-vue.md b/docs/getting-started/quick-start-vue.md index dc092045b..23547fbcc 100644 --- a/docs/getting-started/quick-start-vue.md +++ b/docs/getting-started/quick-start-vue.md @@ -2,6 +2,16 @@ title: "Quick Start: Vue" id: quick-start-vue order: 3 +description: "Build a streaming TanStack AI chat component in a Vue 3 app using the useChat composable and the OpenAI adapter." +keywords: + - tanstack ai + - vue + - vue 3 + - quick start + - useChat + - streaming chat + - openai + - composable --- You have a Vue 3 app and want to add AI chat. By the end of this guide, you'll have a streaming chat component powered by TanStack AI and OpenAI. diff --git a/docs/getting-started/quick-start.md b/docs/getting-started/quick-start.md index 02fc29aa3..fb5e92577 100644 --- a/docs/getting-started/quick-start.md +++ b/docs/getting-started/quick-start.md @@ -2,6 +2,16 @@ title: "Quick Start: React" id: quick-start order: 2 +description: "Add a streaming TanStack AI chat to a React app in minutes using the useChat hook and the OpenAI adapter." +keywords: + - tanstack ai + - react + - quick start + - useChat + - streaming chat + - openai + - tutorial + - ai chatbot --- Get started with TanStack AI in minutes. 
This guide will walk you through creating a simple chat application using the React integration and OpenAI adapter. diff --git a/docs/media/generation-hooks.md b/docs/media/generation-hooks.md index e4252a676..81d1b4cc6 100644 --- a/docs/media/generation-hooks.md +++ b/docs/media/generation-hooks.md @@ -2,6 +2,16 @@ title: Generation Hooks id: generation-hooks order: 7 +description: "Framework hooks for every TanStack AI media generation type — useGenerateImage, useGenerateSpeech, useTranscription, useSummarize, useGenerateVideo." +keywords: + - tanstack ai + - generation hooks + - useGenerateImage + - useGenerateSpeech + - useTranscription + - useSummarize + - useGenerateVideo + - react hooks --- # Generation Hooks diff --git a/docs/media/generations.md b/docs/media/generations.md index ffeebe6ac..cc94a28db 100644 --- a/docs/media/generations.md +++ b/docs/media/generations.md @@ -2,6 +2,16 @@ title: Generations id: generations order: 1 +description: "The unified pattern for non-chat activities in TanStack AI — image generation, text-to-speech, transcription, summarization, and video." +keywords: + - tanstack ai + - generations + - media generation + - image generation + - transcription + - tts + - summarization + - video generation --- # Generations diff --git a/docs/media/image-generation.md b/docs/media/image-generation.md index 00b99cefa..6c3c6e115 100644 --- a/docs/media/image-generation.md +++ b/docs/media/image-generation.md @@ -2,6 +2,16 @@ title: Image Generation id: image-generation order: 5 +description: "Generate images with OpenAI DALL-E, Gemini NanoBanana and Imagen, and fal.ai models via TanStack AI's unified generateImage() API." 
+keywords: + - tanstack ai + - image generation + - generateImage + - dall-e + - imagen + - nano banana + - flux + - fal.ai --- # Image Generation diff --git a/docs/media/realtime-chat.md b/docs/media/realtime-chat.md index 51039c66f..d67eee05e 100644 --- a/docs/media/realtime-chat.md +++ b/docs/media/realtime-chat.md @@ -2,6 +2,17 @@ title: Realtime Voice Chat id: realtime-chat order: 2 +description: "Build realtime voice-to-voice AI chat with TanStack AI — WebRTC and WebSocket, voice activity detection, interruptions, and multimodal input." +keywords: + - tanstack ai + - realtime voice + - voice chat + - webrtc + - websocket + - vad + - voice ai + - multimodal + - useRealtimeChat --- TanStack AI provides a complete realtime voice chat system for building voice-to-voice AI interactions. The realtime API supports multiple providers (OpenAI, ElevenLabs), automatic tool execution, audio visualization, and multimodal input including images. diff --git a/docs/media/text-to-speech.md b/docs/media/text-to-speech.md index c90265444..18040e963 100644 --- a/docs/media/text-to-speech.md +++ b/docs/media/text-to-speech.md @@ -2,6 +2,15 @@ title: Text-to-Speech id: text-to-speech order: 3 +description: "Convert text to spoken audio with OpenAI TTS and Gemini voice models via TanStack AI's generateSpeech() API." +keywords: + - tanstack ai + - text-to-speech + - tts + - generateSpeech + - openai tts + - voice synthesis + - speech generation --- # Text-to-Speech (TTS) diff --git a/docs/media/transcription.md b/docs/media/transcription.md index a4cf0ad7e..3ef7cfe06 100644 --- a/docs/media/transcription.md +++ b/docs/media/transcription.md @@ -2,6 +2,15 @@ title: Transcription id: transcription order: 4 +description: "Transcribe audio to text with OpenAI Whisper and GPT-4o-transcribe via TanStack AI's generateTranscription() API." 
+keywords: + - tanstack ai + - transcription + - speech-to-text + - asr + - whisper + - generateTranscription + - openai --- # Audio Transcription diff --git a/docs/media/video-generation.md b/docs/media/video-generation.md index 4088121a6..b42e88b6b 100644 --- a/docs/media/video-generation.md +++ b/docs/media/video-generation.md @@ -2,6 +2,15 @@ title: Video Generation id: video-generation order: 6 +description: "Generate video from text prompts with OpenAI Sora using TanStack AI's experimental generateVideo() jobs/polling API." +keywords: + - tanstack ai + - video generation + - sora + - generateVideo + - jobs api + - experimental + - text-to-video --- # Video Generation (Experimental) diff --git a/docs/migration/migration.md b/docs/migration/migration.md index 2a1093b47..2c8db8391 100644 --- a/docs/migration/migration.md +++ b/docs/migration/migration.md @@ -2,6 +2,15 @@ title: Migration Guide id: migration order: 1 +description: "Migrate existing TanStack AI code to the latest version — adapter function splits, flattened options, renamed modelOptions, and removed embeddings." +keywords: + - tanstack ai + - migration + - upgrade + - breaking changes + - tree-shaking + - modelOptions + - toServerSentEventsStream --- # Migration Guide diff --git a/docs/protocol/chunk-definitions.md b/docs/protocol/chunk-definitions.md index 4c9ccdb30..3b24b9207 100644 --- a/docs/protocol/chunk-definitions.md +++ b/docs/protocol/chunk-definitions.md @@ -1,6 +1,15 @@ --- title: AG-UI Event Definitions id: chunk-definitions +description: "TanStack AI implements the AG-UI protocol — full event definitions, types, and streaming semantics for agent-to-UI communication." 
+keywords: + - tanstack ai + - ag-ui + - ag-ui protocol + - events + - stream chunks + - streaming protocol + - agent protocol --- TanStack AI implements the [AG-UI (Agent-User Interaction) Protocol](https://docs.ag-ui.com/introduction), an open, lightweight, event-based protocol that standardizes how AI agents connect to user-facing applications. diff --git a/docs/protocol/http-stream-protocol.md b/docs/protocol/http-stream-protocol.md index b632c503a..0318557bb 100644 --- a/docs/protocol/http-stream-protocol.md +++ b/docs/protocol/http-stream-protocol.md @@ -1,6 +1,14 @@ --- title: HTTP Stream Protocol id: http-stream-protocol +description: "TanStack AI's HTTP streaming protocol spec using newline-delimited JSON (NDJSON) — an alternative to SSE for simpler line-based transport." +keywords: + - tanstack ai + - http stream + - ndjson + - newline-delimited json + - streaming protocol + - protocol spec --- HTTP streaming with newline-delimited JSON (NDJSON) is a simpler protocol than SSE that sends one JSON object per line. It's useful when: diff --git a/docs/protocol/sse-protocol.md b/docs/protocol/sse-protocol.md index 52b57beeb..1fec1b74a 100644 --- a/docs/protocol/sse-protocol.md +++ b/docs/protocol/sse-protocol.md @@ -1,6 +1,14 @@ --- title: Server-Sent Events (SSE) Protocol id: sse-protocol +description: "TanStack AI's Server-Sent Events protocol spec — the recommended streaming transport for chat and media generations, with auto-reconnection." +keywords: + - tanstack ai + - sse + - server-sent events + - streaming protocol + - protocol spec + - eventsource --- Server-Sent Events (SSE) is a standard HTTP-based protocol for server-to-client streaming. 
It provides: diff --git a/docs/tools/client-tools.md b/docs/tools/client-tools.md index 64f572b3f..f65cde7b6 100644 --- a/docs/tools/client-tools.md +++ b/docs/tools/client-tools.md @@ -2,6 +2,15 @@ title: Client Tools id: client-tools order: 4 +description: "Client tools in TanStack AI run in the browser for UI updates, localStorage, and browser API access with type-safe onToolCall handling." +keywords: + - tanstack ai + - client tools + - browser tools + - ui tools + - onToolCall + - clientTools + - localStorage --- Client tools execute in the browser, enabling UI updates, local storage access, and browser API interactions. Unlike server tools, client tools don't have an `execute` function in their server definition. diff --git a/docs/tools/lazy-tool-discovery.md b/docs/tools/lazy-tool-discovery.md index 55b339a5c..70625dfbc 100644 --- a/docs/tools/lazy-tool-discovery.md +++ b/docs/tools/lazy-tool-discovery.md @@ -2,6 +2,15 @@ title: Lazy Tool Discovery id: lazy-tool-discovery order: 6 +description: "Reduce token cost in tool-heavy TanStack AI apps with lazy tool discovery — the LLM discovers only the tools it needs for the current task." +keywords: + - tanstack ai + - lazy tools + - tool discovery + - token optimization + - context optimization + - performance + - large tool sets --- When an application has many tools, sending all tool definitions to the LLM on every request wastes tokens and can degrade response quality. Lazy tool discovery lets the LLM selectively discover only the tools it needs for the current task. diff --git a/docs/tools/server-tools.md b/docs/tools/server-tools.md index bcae69ecf..69bf1552d 100644 --- a/docs/tools/server-tools.md +++ b/docs/tools/server-tools.md @@ -2,6 +2,14 @@ title: Server Tools id: server-tools order: 3 +description: "Server tools in TanStack AI execute automatically with full access to databases, APIs, and environment variables. Patterns, examples, and security." 
+keywords: + - tanstack ai + - server tools + - function calling + - backend tools + - tool execute + - database access --- Server tools execute automatically when called by the LLM. They have full access to server resources like databases, APIs, and environment variables. diff --git a/docs/tools/tool-approval.md b/docs/tools/tool-approval.md index 28a234706..67c597fa4 100644 --- a/docs/tools/tool-approval.md +++ b/docs/tools/tool-approval.md @@ -2,6 +2,15 @@ title: Tool Approval Flow id: tool-approval-flow order: 5 +description: "Require user approval before executing sensitive tools in TanStack AI — approval states, deny flows, and batched approvals with needsApproval." +keywords: + - tanstack ai + - tool approval + - needsApproval + - user consent + - sensitive tools + - approval flow + - human-in-the-loop --- The tool approval flow allows you to require user approval before executing sensitive tools, giving users control over actions like sending emails, making purchases, or deleting data. Tools go through these states during approval: diff --git a/docs/tools/tool-architecture.md b/docs/tools/tool-architecture.md index 175637d26..3e7dbf0a7 100644 --- a/docs/tools/tool-architecture.md +++ b/docs/tools/tool-architecture.md @@ -2,6 +2,15 @@ title: Tool Architecture id: tool-architecture order: 2 +description: "The architecture behind TanStack AI's tool system — server tools, client tools, call states, approval flow, and the agentic cycle." 
+keywords: + - tanstack ai + - tool architecture + - server tools + - client tools + - call states + - approval flow + - agentic cycle --- The TanStack AI tool system provides a powerful, flexible architecture for enabling AI agents to interact with external systems: diff --git a/docs/tools/tools.md b/docs/tools/tools.md index c0a651a95..cba08c5ad 100644 --- a/docs/tools/tools.md +++ b/docs/tools/tools.md @@ -2,6 +2,16 @@ title: Tools id: tools order: 1 +description: "Define isomorphic AI tools in TanStack AI with toolDefinition() for type-safe server- and client-side function calling across any framework." +keywords: + - tanstack ai + - tools + - function calling + - toolDefinition + - isomorphic tools + - server tools + - client tools + - type safety --- Tools (also called "function calling") allow AI models to interact with external systems, APIs, or perform computations. TanStack AI provides an isomorphic tool system that enables type-safe, framework-agnostic tool definitions that work on both server and client.