Merged

49 commits
72c117d
refactor: migrate ai-groq + ai-openrouter onto @tanstack/openai-base …
tombeckenham May 11, 2026
1c8e1f4
ci: apply automated fixes
autofix-ci[bot] May 11, 2026
b320df5
fix(openai-base, ai-openrouter, ai): silent failures in chat-completi…
tombeckenham May 11, 2026
d9a74c4
feat(ai-openrouter, openai-base): OpenRouter Responses (beta) adapter
tombeckenham May 12, 2026
d741f2f
chore(ai-groq): remove dead unused message-param types
tombeckenham May 12, 2026
f66f82f
fix(ai-openrouter): pass UNKNOWN-fallback events through verbatim
tombeckenham May 12, 2026
d5e492d
refactor(adapters): remove asChunk casts, enforce satisfies StreamChunk
tombeckenham May 12, 2026
50214f7
fix(ai-openrouter): preserve assistant/tool message content fidelity
AlemTuzlak May 12, 2026
e8cce25
fix(ai-groq): correct ChatCompletionNamedToolChoice shape
AlemTuzlak May 12, 2026
ba9936e
test(ai-groq): reset pendingMockCreate between tests
AlemTuzlak May 12, 2026
74cbd77
test(e2e): route OpenRouter summarize through createOpenRouterSummarize
AlemTuzlak May 12, 2026
0ecfd3b
chore(ai-openrouter): declare zod as peer dependency
AlemTuzlak May 12, 2026
5eb7aa7
fix(ai-groq): drop spurious timestamp field from processStreamChunks …
AlemTuzlak May 12, 2026
a773bd5
fix(ai-openrouter): stringify error.code on response.failed events
AlemTuzlak May 12, 2026
993df3e
fix(ai-openrouter): default image data URI mime type to octet-stream
AlemTuzlak May 12, 2026
6a9ce76
fix(openai-base): stop processing chunks after top-level error event
AlemTuzlak May 12, 2026
06dd544
fix(openai-base, ai-openrouter): route Responses structuredOutput thr…
AlemTuzlak May 12, 2026
39c927b
fix(ai-openrouter): extract text from array-shaped tool message content
AlemTuzlak May 12, 2026
335adaf
chore(ai-groq): declare @tanstack/ai as workspace devDependency
AlemTuzlak May 12, 2026
7bb2b82
fix(ai-openrouter): route audio URLs to text fallback on chat-complet…
AlemTuzlak May 12, 2026
272fe5f
docs(ai-groq): correct message-types header — Groq SDK was dropped
AlemTuzlak May 12, 2026
2bc993c
fix(ai-openrouter): reject inline document data on chat-completions
AlemTuzlak May 12, 2026
9d6a1e8
refactor: rename @tanstack/openai-base → @tanstack/openai-compatible
tombeckenham May 13, 2026
091daa6
ci: apply automated fixes
autofix-ci[bot] May 13, 2026
e90c8d9
refactor: rename @tanstack/openai-compatible → @tanstack/ai-openai-co…
tombeckenham May 13, 2026
813e296
docs(ai-openai-compatible, ai-openrouter): explain the protocol-vs-pr…
tombeckenham May 13, 2026
c4070ae
ci: apply automated fixes
autofix-ci[bot] May 13, 2026
72c8aea
docs(adapters/openrouter): add Chat Completions vs Responses (beta) s…
tombeckenham May 13, 2026
dad9e55
refactor(ai-openai-compatible): narrow to chat/responses; decouple fr…
tombeckenham May 13, 2026
62aad90
ci: apply automated fixes
autofix-ci[bot] May 13, 2026
ebd6244
refactor(ai): rename chat-stream-wrapper to chat-stream-summarize
tombeckenham May 13, 2026
e0dcb77
refactor(summarize): unify provider summarize adapters on chat-stream…
tombeckenham May 13, 2026
0db4c12
ci: apply automated fixes
autofix-ci[bot] May 13, 2026
a39e2bc
refactor(ai-openai-compatible): vendor wire types; drop openai dep
tombeckenham May 13, 2026
71cf0f4
ci: apply automated fixes
autofix-ci[bot] May 13, 2026
7aff8b1
refactor(openai-base): rename, adopt openai SDK, decouple ai-openrouter
tombeckenham May 13, 2026
b566897
ci: apply automated fixes
autofix-ci[bot] May 13, 2026
20e8397
Corrected package versions
tombeckenham May 13, 2026
44db925
refactor(adapters): address PR review — jsdoc, casts, zod, finishReason
AlemTuzlak May 13, 2026
f473e44
refactor(ai-openrouter): drop residual chunk casts in responses-text
tombeckenham May 13, 2026
ad91033
ci: apply automated fixes
autofix-ci[bot] May 13, 2026
6d99fad
refactor(ai): tighten summarize TProviderOptions to Record<string, un…
tombeckenham May 13, 2026
c92b351
ci: apply automated fixes
autofix-ci[bot] May 13, 2026
6b00d53
docs(openai-base): rewrite README; consolidate summarize changeset
tombeckenham May 13, 2026
ff282fe
fix(ai): revert summarize TProviderOptions constraint to extends object
tombeckenham May 13, 2026
9f9b746
ci: apply automated fixes
autofix-ci[bot] May 13, 2026
35618ca
refactor(ai): drop the SummarizationOptions<object> annotation noise
tombeckenham May 14, 2026
60a1302
fix(ai, ai-client): replace removed StreamChunk casts with typed even…
tombeckenham May 14, 2026
d205b23
ci: apply automated fixes
autofix-ci[bot] May 14, 2026
35 changes: 35 additions & 0 deletions .changeset/decouple-openrouter-collapse-openai-base.md
@@ -0,0 +1,35 @@
---
'@tanstack/openai-base': minor
'@tanstack/ai-openai': patch
'@tanstack/ai-grok': patch
'@tanstack/ai-groq': patch
'@tanstack/ai-openrouter': patch
---

Decouple `@tanstack/ai-openrouter` from the shared OpenAI base, and collapse the base into a thinner shim over the `openai` SDK.

Three changes that ship together:

**1. Rename `@tanstack/ai-openai-compatible` → `@tanstack/openai-base`.** The previous name implied a multi-vendor protocol surface. After ai-openrouter is decoupled (see below), the only remaining consumers (`ai-openai`, `ai-grok`, `ai-groq`) all back onto the `openai` SDK with a different `baseURL` — "base" describes that role accurately. Imports change:

```diff
- import { OpenAICompatibleChatCompletionsTextAdapter } from '@tanstack/ai-openai-compatible'
+ import { OpenAIBaseChatCompletionsTextAdapter } from '@tanstack/openai-base'
- import { OpenAICompatibleResponsesTextAdapter } from '@tanstack/ai-openai-compatible'
+ import { OpenAIBaseResponsesTextAdapter } from '@tanstack/openai-base'
```

`@tanstack/ai-openai-compatible@0.2.x` remains published for anyone with a pinned lockfile reference but will receive no further updates.

**2. `@tanstack/openai-base` adopts the `openai` SDK directly.** The previous package vendored ~720 LOC of hand-written wire-format types (`ChatCompletion`, `ResponseStreamEvent`, etc.) and exposed abstract `callChatCompletion*` / `callResponse*` hooks subclasses had to implement. Both are gone:

- The base now depends on `openai` again and imports types directly from `openai/resources/...`. The vendored `src/types/` directory is removed; consumers that imported wire types from the package (e.g. `import type { ResponseInput } from '@tanstack/ai-openai-compatible'`) should now import from the openai SDK.
- The abstract SDK-call methods are removed. The base constructor takes a pre-built `OpenAI` client (`new OpenAIBaseChatCompletionsTextAdapter(model, name, openaiClient)`) and calls `client.chat.completions.create` / `client.responses.create` itself. Subclasses (`ai-openai`, `ai-grok`, `ai-groq`) now just construct the SDK with their provider-specific `baseURL` and pass it to `super` — `callChatCompletion*` / `callResponse*` overrides go away.

The other extension hooks (`extractReasoning`, `extractTextFromResponse`, `processStreamChunks`, `makeStructuredOutputCompatible`, `transformStructuredOutput`, `mapOptionsToRequest`, `convertMessage`) remain. Groq's `processStreamChunks` and `makeStructuredOutputCompatible` overrides (for `x_groq.usage` promotion and Groq's structured-output schema quirks) are unchanged.
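
The constructor-injection pattern described above can be sketched with local stand-in classes. Note these are illustrative shapes only, not the real `@tanstack/openai-base` API: the actual base class and client come from that package and the `openai` SDK.

```typescript
// Stand-in for the openai SDK client surface a subclass would construct.
interface OpenAIClientLike {
  baseURL: string;
  apiKey: string;
}

// Stand-in for the base adapter: it receives a pre-built client rather
// than requiring subclasses to implement abstract SDK-call hooks.
class ChatCompletionsTextAdapterSketch {
  constructor(
    readonly model: string,
    readonly name: string,
    protected readonly client: OpenAIClientLike,
  ) {}
}

// A provider subclass now just builds the client with its own baseURL
// and hands it to super; no callChatCompletion* override to write.
class GroqTextAdapterSketch extends ChatCompletionsTextAdapterSketch {
  constructor(model: string, apiKey: string) {
    super(model, "groq", {
      baseURL: "https://api.groq.com/openai/v1",
      apiKey,
    });
  }
}

const adapter = new GroqTextAdapterSketch(
  "llama-3.3-70b-versatile", // hypothetical model id for illustration
  "sk-placeholder",
);
console.log(adapter.model, adapter.name);
```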

**3. Decouple `@tanstack/ai-openrouter` from the OpenAI base entirely.** OpenRouter ships its own SDK (`@openrouter/sdk`) with a camelCase shape, so inheriting from the OpenAI-shaped base forced a snake_case ↔ camelCase round-trip on every request and stream event. ai-openrouter now extends `BaseTextAdapter` directly and inlines its own stream processors (`OpenRouterTextAdapter` for chat-completions, `OpenRouterResponsesTextAdapter` for the Responses beta), reading OpenRouter's camelCase types natively. The `@tanstack/openai-base` and `openai` dependencies are removed from ai-openrouter; only `@openrouter/sdk`, `@tanstack/ai`, and `@tanstack/ai-utils` remain.

Public API is unchanged: `openRouterText`, `openRouterResponsesText`, `createOpenRouterText`, `createOpenRouterResponsesText`, the OpenRouter tool factories, provider routing surface (`provider`, `models`, `plugins`, `variant`, `transforms`), app attribution headers (`httpReferer`, `appTitle`), `:variant` model suffixing, `RequestAbortedError` propagation, and the OpenRouter-specific structured-output null-preservation all behave the same. The ~300 LOC of inbound/outbound shape converters (`toOpenRouterRequest`, `toChatCompletion`, `adaptOpenRouterStreamChunks`, `toSnakeResponseResult`, …) are gone.

`ai-ollama` remains on `BaseTextAdapter` directly — its native API uses a different wire format from Chat Completions and was never on the shared base.
5 changes: 5 additions & 0 deletions .changeset/openrouter-narrow-stream-chunk-types.md
@@ -0,0 +1,5 @@
---
'@tanstack/ai-openrouter': patch
---

Internal: drop the remaining duck-typed `as { ... }` casts on stream chunks in `OpenRouterResponsesTextAdapter`. Five sites (`response.created/in_progress/incomplete/failed` model + error capture, `response.content_part.added/done` payload, and the `response.completed` function-call detection) now narrow via the SDK's discriminated unions directly. Behaviourally identical; reduces the chance of an SDK type rename silently slipping past us.
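
The cast-free style reads like this. This is a generic sketch of discriminated-union narrowing: the event names echo the Responses stream, but the types are local stand-ins, not `@openrouter/sdk`'s.

```typescript
// Local stand-in for an SDK discriminated union. Narrowing happens on
// the `type` tag, so no `as { ... }` cast is needed to reach payloads.
type ResponsesStreamEventSketch =
  | { type: "response.created"; response: { model: string } }
  | { type: "response.failed"; response: { error: { code: string; message: string } } }
  | { type: "response.output_text.delta"; delta: string };

function describe(event: ResponsesStreamEventSketch): string {
  switch (event.type) {
    case "response.created":
      // `event` is narrowed here; `.response.model` typechecks directly,
      // and a field rename in the union becomes a compile error.
      return `created by ${event.response.model}`;
    case "response.failed":
      return `failed: ${event.response.error.code}`;
    case "response.output_text.delta":
      return event.delta;
  }
}

console.log(describe({ type: "response.output_text.delta", delta: "hi" }));
```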
23 changes: 23 additions & 0 deletions .changeset/summarize-unify-on-chat-stream-wrapper.md
@@ -0,0 +1,23 @@
---
'@tanstack/ai': patch
'@tanstack/ai-anthropic': patch
'@tanstack/ai-gemini': patch
'@tanstack/ai-grok': patch
'@tanstack/ai-ollama': patch
'@tanstack/ai-openai': patch
'@tanstack/ai-openrouter': patch
---

Unify the summarize subsystem on a shared chat-stream wrapper, plumb `modelOptions` through end-to-end, and tighten the `TProviderOptions` generic.

**Provider summarize adapters now share one implementation.** Anthropic, Gemini, Ollama, and OpenRouter previously each shipped a bespoke 200–300 LOC summarize adapter that re-implemented streaming, error handling, usage accounting, and chunk assembly on top of their text adapter. They now construct a `ChatStreamSummarizeAdapter` (formerly `ChatStreamWrapperAdapter`, renamed and exported from `@tanstack/ai/activities`) wrapping their own text adapter, matching the existing OpenAI/Grok pattern. Removes ~600 LOC of duplicated logic across the six providers and ensures behavioural parity.
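
The wrapping pattern, in spirit: the summarize adapter owns no streaming logic of its own; it drives the wrapped text adapter and assembles the chunks in one shared place. All types below are stand-ins sketching the idea, not the real `@tanstack/ai/activities` exports.

```typescript
// Stand-in for any provider's text adapter: something that streams text.
interface TextAdapterSketch {
  streamText(prompt: string): Iterable<string>;
}

// Stand-in for the shared wrapper: one implementation of prompt
// construction and chunk assembly, reused by every provider.
class ChatStreamSummarizeAdapterSketch {
  constructor(private readonly text: TextAdapterSketch) {}

  summarize(input: string): string {
    const prompt = `Summarize the following:\n${input}`;
    let out = "";
    for (const chunk of this.text.streamText(prompt)) {
      out += chunk; // assembly logic lives here, not in each provider
    }
    return out;
  }
}

// Any provider's text adapter can be wrapped the same way.
const fakeText: TextAdapterSketch = {
  streamText: () => ["A ", "short ", "summary."],
};
const summarizer = new ChatStreamSummarizeAdapterSketch(fakeText);
console.log(summarizer.summarize("a long transcript"));
```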

**`SummarizationOptions.modelOptions` now reaches the wire.** Previously the activity layer (`runSummarize` / `runStreamingSummarize`) silently dropped `modelOptions` when building the internal `SummarizationOptions` it forwarded to the adapter. Provider-specific knobs (Anthropic cache control, OpenRouter plugins, Gemini safety settings, Groq tuning params, …) now flow through correctly.

**Provider summarize types resolve from the wrapped text adapter.** Each provider previously shipped a bespoke `XSummarizeProviderOptions` interface (a partial copy of its text provider options). Those interfaces are removed; summarize provider options are now inferred from the text adapter's `~types` via the new `InferTextProviderOptions<TAdapter>` helper exported from `@tanstack/ai/activities`. IntelliSense for `modelOptions` on `summarize({ adapter: openai('gpt-4o'), … })` now matches what `chat({ adapter: openai('gpt-4o'), … })` would show.
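
The shape of that inference helper can be approximated as follows. This is a reconstruction of the idea (options hang off a phantom `~types` property and a conditional type pulls them back out), not the shipped definition.

```typescript
// Stand-in for a text adapter's phantom type carrier.
interface TextAdapterTypesSketch<TProviderOptions> {
  "~types": { providerOptions: TProviderOptions };
}

// Conditional type recovering the provider options from an adapter type.
type InferTextProviderOptionsSketch<TAdapter> =
  TAdapter extends TextAdapterTypesSketch<infer TProviderOptions>
    ? TProviderOptions
    : never;

// A fake adapter with a typed options shape (hypothetical fields):
interface FakeOpenAIOptions {
  reasoningEffort?: "low" | "medium" | "high";
}
declare const fakeAdapter: TextAdapterTypesSketch<FakeOpenAIOptions>;

// Summarize-side options now resolve from the text adapter's types:
type SummarizeOptionsSketch = InferTextProviderOptionsSketch<typeof fakeAdapter>;
const opts: SummarizeOptionsSketch = { reasoningEffort: "low" };
console.log(opts.reasoningEffort);
```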

**`SummarizeAdapter` interface methods are now generic in `TProviderOptions`.** `summarize` and `summarizeStream` previously took `SummarizationOptions` (defaulted, so `modelOptions` was effectively `Record<string, any>` regardless of the adapter's typed shape). They now take `SummarizationOptions<TProviderOptions>`, threading the class's `TProviderOptions` generic through. Source-compatible for callers that didn't specify the generic; type-tighter for implementers and downstream consumers.

**Default aligned across the summarize surface.** `SummarizationOptions`, `SummarizeAdapter`, `BaseSummarizeAdapter`, and `ChatStreamSummarizeAdapter` previously had a mixed `Record<string, any>` / `Record<string, unknown>` / `object` set of defaults for `TProviderOptions`. They now uniformly default to `Record<string, unknown>` so unparameterised consumers narrow before indexed access on `modelOptions`. The `extends object` constraint is unchanged — per-model typed interfaces (e.g. `OpenAIBaseOptions & OpenAIReasoningOptions & ...`) inferred via `InferTextProviderOptions<TAdapter>` continue to satisfy it without needing a string index signature. No public-surface signature change for callers that supply a concrete provider-options shape (every shipping adapter does).
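
The practical effect of the `Record<string, unknown>` default for an unparameterised consumer:

```typescript
// With Record<string, any>, this indexed access silently typechecked as
// `any`; with Record<string, unknown> the value must be narrowed first.
const modelOptions: Record<string, unknown> = { temperature: 0.2 };

const raw = modelOptions["temperature"]; // raw: unknown
if (typeof raw === "number") {
  // Only inside the typeof guard is `raw` usable as a number.
  console.log(raw.toFixed(1));
}
```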

Bespoke `*SummarizeProviderOptions` interfaces (e.g. `OpenAISummarizeProviderOptions`, `AnthropicSummarizeProviderOptions`, `GeminiSummarizeProviderOptions`, `OllamaSummarizeProviderOptions`, `OpenRouterSummarizeProviderOptions`) are removed from the provider packages' public exports. Consumers who imported them should switch to inferring the type from the adapter (`InferTextProviderOptions<typeof adapter>`) or remove the explicit annotation (it'll be inferred from the adapter argument).
63 changes: 49 additions & 14 deletions docs/adapters/openrouter.md
@@ -35,16 +35,17 @@ const stream = chat({
## Configuration

```diff
-import { createOpenRouter, type OpenRouterConfig } from "@tanstack/ai-openrouter";
-
-const config: OpenRouterConfig = {
-  apiKey: process.env.OPENROUTER_API_KEY!,
-  baseURL: "https://openrouter.ai/api/v1", // Optional
-  httpReferer: "https://your-app.com", // Optional, for rankings
-  xTitle: "Your App Name", // Optional, for rankings
-};
-
-const adapter = createOpenRouter(config.apiKey, config);
+import { createOpenRouterText } from "@tanstack/ai-openrouter";
+
+const adapter = createOpenRouterText(
+  "openai/gpt-5",
+  process.env.OPENROUTER_API_KEY!,
+  {
+    serverURL: "https://openrouter.ai/api/v1", // Optional
+    httpReferer: "https://your-app.com", // Optional, for rankings
+    appTitle: "Your App Name", // Optional, for rankings
+  },
+);
```

## Available Models
@@ -122,18 +123,52 @@

OpenRouter can automatically route requests to the best available provider:
```diff
 const stream = chat({
   adapter: openRouterText("openrouter/auto"),
-  messages,
-  providerOptions: {
+  messages,
+  modelOptions: {
     models: [
       "openai/gpt-4o",
       "anthropic/claude-3.5-sonnet",
       "google/gemini-pro",
     ],
     route: "fallback", // Use fallback if primary fails
   },
 });
```


## Chat Completions vs Responses (beta)

OpenRouter exposes two OpenAI-compatible wire formats, and the adapter
package ships one of each:

| Adapter | Endpoint | Status | When to use |
| -------------------------- | ------------------------- | -------- | ---------------------------------------------------------------------------- |
| `openRouterText` | `/v1/chat/completions` | Stable | Default for almost everything. Broadest model + tool support. |
| `openRouterResponsesText` | `/v1/responses` | Beta | OpenAI Responses-shaped request/response; richer multi-turn state on OpenAI-style models. |

Both adapters route to any underlying model OpenRouter supports
(`anthropic/...`, `google/...`, `meta-llama/...`, etc.) — the wire format
describes how your client talks to OpenRouter, not which provider answers.
`/v1/responses` is OpenAI's newer API surface; OpenRouter implements it so
clients that prefer that wire format can use it across the same 300+
model catalogue.

```typescript
import { chat } from "@tanstack/ai";
import { openRouterResponsesText } from "@tanstack/ai-openrouter";

const stream = chat({
adapter: openRouterResponsesText("anthropic/claude-sonnet-4.5"),
messages: [{ role: "user", content: "Hello!" }],
});
```

Caveats while the Responses adapter is in beta:

- Function tools are supported; OpenRouter's branded server-tools (web
  search, file search, …) are not yet wired through this path — use
  `openRouterText` if you need those.
- If in doubt, prefer `openRouterText`. The Chat Completions endpoint has
broader provider coverage and feature parity today.

## Next Steps

- [Getting Started](../getting-started/quick-start) - Learn the basics