
feat: dual ESM+CJS builds + toJSONResponse/fetchJSON for non-streaming runtimes#478

Open
AlemTuzlak wants to merge 7 commits into main from fix/cjs-output-and-json-response

Conversation


@AlemTuzlak AlemTuzlak commented Apr 20, 2026

Summary

Two related fixes for Expo / Metro / non-streaming runtimes.

#308 — dual ESM + CJS output

`@tanstack/ai`, `@tanstack/ai-client`, and `@tanstack/ai-event-client` were ESM-only (`import` condition only, no `require` / `default`). Metro can't resolve that configuration, even with `unstable_enablePackageExports: true`, so consumers saw `Cannot resolve @tanstack/ai/adapters` etc.

Changes:

  • `vite.config.ts` for all three packages flipped to `cjs: true` — emits `dist/esm/*.js` + `.d.ts` and `dist/cjs/*.cjs` + `.d.cts`.
  • `package.json` `exports` for `.`, `./adapters`, `./middlewares` now use the nested `import`/`require` shape with type-aware conditions.
  • Added `main` pointing at the CJS entry so legacy resolvers still work.
  • `publint --strict` passes.
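
For reference, the nested condition shape described above looks roughly like this in a package manifest (an illustrative fragment; the exact paths are assumed from the dist layout described, not copied from the repo):

```json
{
  "type": "module",
  "main": "./dist/cjs/index.cjs",
  "exports": {
    ".": {
      "import": {
        "types": "./dist/esm/index.d.ts",
        "default": "./dist/esm/index.js"
      },
      "require": {
        "types": "./dist/cjs/index.d.cts",
        "default": "./dist/cjs/index.cjs"
      }
    }
  }
}
```

The `types` condition must come before `default` within each branch so TypeScript picks up the matching declaration file, and the top-level `main` gives CJS-only resolvers (like Metro without package-exports support) a fallback entry.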

#309 — `toJSONResponse` + `fetchJSON`

Expo's `@expo/server` can't emit `ReadableStream` responses, so `toServerSentEventsResponse` / `toHttpResponse` crash with `Cannot read properties of undefined (reading 'statusText')`.

Changes:

  • `@tanstack/ai`: `toJSONResponse(stream, init?)` — drains the stream, returns `new Response(JSON.stringify(chunks), { headers: { 'Content-Type': 'application/json' } })`. Honours caller-provided headers / status; aborts a supplied `abortController` if the upstream throws.
  • `@tanstack/ai-client`: `fetchJSON(url, options?)` — matching connection adapter. POSTs `{ messages, data }`, parses the response as a `StreamChunk[]`, yields each chunk into the normal `ChatClient` pipeline.
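
A minimal sketch of what the server helper could look like, inferred from the description above (`toJSONResponseSketch` is a hypothetical name, not the package export):

```typescript
// Hypothetical sketch: drain an async iterable of chunks,
// then return one buffered JSON Response.
async function toJSONResponseSketch(
  stream: AsyncIterable<unknown>,
  init?: ResponseInit & { abortController?: AbortController },
): Promise<Response> {
  const chunks: Array<unknown> = []
  try {
    for await (const chunk of stream) {
      chunks.push(chunk)
    }
  } catch (error) {
    // If the upstream throws, abort the supplied controller before rethrowing.
    init?.abortController?.abort()
    throw error
  }
  // Honour caller-provided headers; only default Content-Type when absent.
  const headers = new Headers(init?.headers)
  if (!headers.has('Content-Type')) {
    headers.set('Content-Type', 'application/json')
  }
  return new Response(JSON.stringify(chunks), { ...init, headers })
}
```

Nothing is written to the response until the stream fully drains, which is exactly why this works on runtimes that cannot emit `ReadableStream` bodies.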

Trade-off: you lose incremental rendering — the UI sees everything at once when the request resolves. Docs in both JSDoc blocks call this out and tell users to prefer SSE / HTTP-stream when the runtime supports them.
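
The client side's buffered replay can be pictured as a standalone generator (a hypothetical sketch, `fetchJSONSketch`, with an injectable `fetchClient` so it runs without a network; the real adapter plugs into the `ChatClient` pipeline instead of being consumed directly):

```typescript
type JSONAdapterOptions = {
  messages: Array<unknown>
  data?: Record<string, unknown>
  signal?: AbortSignal
  fetchClient?: typeof fetch // injectable for testing; assumed parameter name
}

// Hypothetical sketch: one POST, parse the body as an array,
// then replay each element as a chunk.
async function* fetchJSONSketch(
  url: string,
  { messages, data, signal, fetchClient = fetch }: JSONAdapterOptions,
): AsyncGenerator<unknown> {
  const response = await fetchClient(url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messages, data }),
    signal,
  })
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`)
  }
  const payload = await response.json()
  if (!Array.isArray(payload)) {
    throw new Error('Expected a JSON array of stream chunks')
  }
  // The whole payload is already in memory; replay is synchronous from here.
  for (const chunk of payload) {
    yield chunk
  }
}
```

Because the generator only starts yielding after the response resolves, consumers see all chunks in one burst, which is the trade-off described above.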

Test plan

  • Unit tests for `toJSONResponse`: defaults, header passthrough, custom `Content-Type`, abort-on-upstream-error
  • `pnpm test` across all 41 projects
  • `test:build` (publint strict) green on all three packages — dual exports map validated

Summary by CodeRabbit

  • New Features

    • Dual ESM + CommonJS distributions for packages to improve CJS compatibility
    • Server helper to return a chat stream as a single JSON-array Response
    • Client adapter to fetch JSON-array chat responses and replay them into the chat pipeline
  • Documentation

    • Guides for non-streaming runtimes (React Native / Expo) and JSON-adapter usage
  • Tests

    • Added tests covering JSON adapter behavior, headers, aborts, and error cases

Fixes #308 and #309.

- @tanstack/ai, @tanstack/ai-client, @tanstack/ai-event-client now emit
  both dist/esm/*.js and dist/cjs/*.cjs with matching .d.cts files.
  package.json exports gained nested import/require conditions plus a
  `main` field so Metro / Expo / other CJS-only resolvers can find
  the subpath exports (`./adapters`, `./middlewares`, etc.).

- New toJSONResponse(stream, init?) on @tanstack/ai: drains the stream
  and returns a JSON-array Response. For runtimes that can't stream
  ReadableStream bodies (Expo's @expo/server, edge proxies).

- New fetchJSON(url, options?) connection adapter on @tanstack/ai-client:
  the client-side counterpart — fetches the JSON array and replays each
  chunk into the normal ChatClient pipeline.

- Trade-off documented in both: you lose incremental rendering; use
  SSE / HTTP-stream responses when the runtime supports them.

github-actions Bot commented Apr 20, 2026

🚀 Changeset Version Preview

12 package(s) bumped directly, 21 bumped as dependents.

🟥 Major bumps

Package Version Reason
@tanstack/ai-event-client 0.2.7 → 1.0.0 Changeset
@tanstack/ai-fal 0.6.17 → 1.0.0 Changeset
@tanstack/ai-gemini 0.9.1 → 1.0.0 Changeset
@tanstack/ai-grok 0.6.8 → 1.0.0 Changeset
@tanstack/ai-openai 0.8.1 → 1.0.0 Changeset
@tanstack/ai-openrouter 0.8.1 → 1.0.0 Changeset
@tanstack/ai-react 0.7.15 → 1.0.0 Changeset
@tanstack/ai-solid 0.6.19 → 1.0.0 Changeset
@tanstack/ai-svelte 0.6.19 → 1.0.0 Changeset
@tanstack/ai-vue 0.6.19 → 1.0.0 Changeset
@tanstack/ai-anthropic 0.8.1 → 1.0.0 Dependent
@tanstack/ai-code-mode 0.1.7 → 1.0.0 Dependent
@tanstack/ai-code-mode-skills 0.1.7 → 1.0.0 Dependent
@tanstack/ai-elevenlabs 0.1.7 → 1.0.0 Dependent
@tanstack/ai-groq 0.1.7 → 1.0.0 Dependent
@tanstack/ai-isolate-node 0.1.7 → 1.0.0 Dependent
@tanstack/ai-isolate-quickjs 0.1.7 → 1.0.0 Dependent
@tanstack/ai-ollama 0.6.9 → 1.0.0 Dependent
@tanstack/ai-preact 0.6.19 → 1.0.0 Dependent
@tanstack/ai-react-ui 0.6.1 → 1.0.0 Dependent
@tanstack/ai-solid-ui 0.6.1 → 1.0.0 Dependent

🟨 Minor bumps

Package Version Reason
@tanstack/ai 0.13.0 → 0.14.0 Changeset
@tanstack/ai-client 0.7.14 → 0.8.0 Changeset

🟩 Patch bumps

Package Version Reason
@tanstack/ai-code-mode-models-eval 0.0.10 → 0.0.11 Dependent
@tanstack/ai-devtools-core 0.3.24 → 0.3.25 Dependent
@tanstack/ai-isolate-cloudflare 0.1.7 → 0.1.8 Dependent
@tanstack/ai-vue-ui 0.1.30 → 0.1.31 Dependent
@tanstack/preact-ai-devtools 0.1.28 → 0.1.29 Dependent
@tanstack/react-ai-devtools 0.2.28 → 0.2.29 Dependent
@tanstack/solid-ai-devtools 0.2.28 → 0.2.29 Dependent
ts-svelte-chat 0.1.36 → 0.1.37 Dependent
ts-vue-chat 0.1.36 → 0.1.37 Dependent
vanilla-chat 0.0.34 → 0.0.35 Dependent


nx-cloud Bot commented Apr 20, 2026

View your CI Pipeline Execution ↗ for commit 43e59bb

Command Status Duration Result
nx run-many --targets=build --exclude=examples/** ✅ Succeeded 1m 45s View ↗

☁️ Nx Cloud last updated this comment at 2026-04-24 09:58:21 UTC


coderabbitai Bot commented Apr 20, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: e75a6aae-bf60-42ff-9f01-6b9fb43b2829

📥 Commits

Reviewing files that changed from the base of the PR and between 3fa270d and 43e59bb.

📒 Files selected for processing (6)
  • docs/config.json
  • packages/typescript/ai-client/src/index.ts
  • packages/typescript/ai-client/tests/connection-adapters.test.ts
  • packages/typescript/ai/src/index.ts
  • packages/typescript/ai/src/stream-to-response.ts
  • packages/typescript/ai/tests/stream-to-response.test.ts
✅ Files skipped from review due to trivial changes (1)
  • docs/config.json
🚧 Files skipped from review as they are similar to previous changes (2)
  • packages/typescript/ai/src/index.ts
  • packages/typescript/ai/src/stream-to-response.ts

📝 Walkthrough

Walkthrough

Adds dual ESM/CJS package entrypoints for TanStack AI packages and introduces two APIs: server-side toJSONResponse(stream, init?) (drain stream → single JSON Response) and client-side fetchJSON(url, options?) (POST → JSON array → replayed StreamChunks).

Changes

Cohort / File(s) Summary
Changeset Doc
.changeset/cjs-output-and-json-response.md
Release notes describing dual ESM/CJS packaging and the new toJSONResponse() / fetchJSON() capabilities.
Package Config (ESM/CJS exports)
packages/typescript/ai/package.json, packages/typescript/ai-client/package.json, packages/typescript/ai-event-client/package.json
Added top-level main pointing to CJS builds and updated exports to conditional "import" (ESM) and "require" (CJS) targets with corresponding type files.
Build Config
packages/typescript/ai/vite.config.ts, packages/typescript/ai-client/vite.config.ts, packages/typescript/ai-event-client/vite.config.ts
Enabled CJS output by setting cjs: true in Vite/TanStack build config.
Stream-to-Response
packages/typescript/ai/src/stream-to-response.ts, packages/typescript/ai/src/index.ts
Added toJSONResponse(stream, init?): fully consumes an async stream into an array and returns a single Response with JSON body; preserves init fields, handles aborts/errors; re-exported at package top-level.
Connection Adapter
packages/typescript/ai-client/src/connection-adapters.ts, packages/typescript/ai-client/src/index.ts
Added fetchJSON(url, options?) adapter: performs one POST, parses JSON array, validates array shape, yields elements as StreamChunk; re-exported at package top-level.
Tests
packages/typescript/ai/tests/stream-to-response.test.ts, packages/typescript/ai-client/tests/connection-adapters.test.ts
New tests: toJSONResponse behavior and abort/error handling; fetchJSON behavior, error cases, option resolution, body merging, custom fetch client and signal forwarding.
Docs & Nav
docs/api/ai-client.md, docs/api/ai.md, docs/chat/connection-adapters.md, docs/chat/non-streaming-runtimes.md, docs/chat/streaming.md, docs/config.json
Documentation and navigation additions for fetchJSON/toJSONResponse, non-streaming runtimes guide, examples, and links.
Misc Packaging
packages/typescript/ai-client/vite.config.ts, packages/typescript/ai-event-client/vite.config.ts
Minor build config toggles to produce CJS artifacts consistent with package.json changes.

Sequence Diagram(s)

sequenceDiagram
  participant Client as Client
  participant ChatClient as ChatClient
  participant Server as Server
  participant StreamProducer as StreamProducer

  Client->>ChatClient: call fetchJSON(url, options)
  ChatClient->>Server: POST { messages, data }
  Server->>StreamProducer: start async chat stream
  StreamProducer-->>Server: yield StreamChunk...
  Server->>Server: toJSONResponse(stream) drains all chunks -> JSON array
  Server-->>ChatClient: HTTP 200 body: [StreamChunk, ...]
  ChatClient->>ChatClient: parse array, replay chunks into pipeline
  ChatClient->>Client: deliver reconstructed events (non-incremental)

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Poem

🐰 I nibbled bytes and gathered the stream,
Piled every chunk into one bright dream.
CJS and ESM now hop side by side,
JSON returns where streams can't ride.
Fetch, replay — a tidy, cozy scheme.

🚥 Pre-merge checks | ✅ 5
✅ Passed checks (5 passed)
Check name Status Explanation
Title check ✅ Passed The PR title clearly and specifically describes the main changes: dual ESM+CJS builds and new toJSONResponse/fetchJSON functionality for non-streaming runtimes.
Description check ✅ Passed The PR description is comprehensive and follows the template structure with clear sections on changes, testing, and release impact with a changeset generated.
Docstring Coverage ✅ Passed Docstring coverage is 100.00% which is sufficient. The required threshold is 80.00%.
Linked Issues check ✅ Passed Check skipped because no linked issues were found for this pull request.
Out of Scope Changes check ✅ Passed Check skipped because no linked issues were found for this pull request.



pkg-pr-new Bot commented Apr 20, 2026

Open in StackBlitz

@tanstack/ai

npm i https://pkg.pr.new/@tanstack/ai@478

@tanstack/ai-anthropic

npm i https://pkg.pr.new/@tanstack/ai-anthropic@478

@tanstack/ai-client

npm i https://pkg.pr.new/@tanstack/ai-client@478

@tanstack/ai-code-mode

npm i https://pkg.pr.new/@tanstack/ai-code-mode@478

@tanstack/ai-code-mode-skills

npm i https://pkg.pr.new/@tanstack/ai-code-mode-skills@478

@tanstack/ai-devtools-core

npm i https://pkg.pr.new/@tanstack/ai-devtools-core@478

@tanstack/ai-elevenlabs

npm i https://pkg.pr.new/@tanstack/ai-elevenlabs@478

@tanstack/ai-event-client

npm i https://pkg.pr.new/@tanstack/ai-event-client@478

@tanstack/ai-fal

npm i https://pkg.pr.new/@tanstack/ai-fal@478

@tanstack/ai-gemini

npm i https://pkg.pr.new/@tanstack/ai-gemini@478

@tanstack/ai-grok

npm i https://pkg.pr.new/@tanstack/ai-grok@478

@tanstack/ai-groq

npm i https://pkg.pr.new/@tanstack/ai-groq@478

@tanstack/ai-isolate-cloudflare

npm i https://pkg.pr.new/@tanstack/ai-isolate-cloudflare@478

@tanstack/ai-isolate-node

npm i https://pkg.pr.new/@tanstack/ai-isolate-node@478

@tanstack/ai-isolate-quickjs

npm i https://pkg.pr.new/@tanstack/ai-isolate-quickjs@478

@tanstack/ai-ollama

npm i https://pkg.pr.new/@tanstack/ai-ollama@478

@tanstack/ai-openai

npm i https://pkg.pr.new/@tanstack/ai-openai@478

@tanstack/ai-openrouter

npm i https://pkg.pr.new/@tanstack/ai-openrouter@478

@tanstack/ai-preact

npm i https://pkg.pr.new/@tanstack/ai-preact@478

@tanstack/ai-react

npm i https://pkg.pr.new/@tanstack/ai-react@478

@tanstack/ai-react-ui

npm i https://pkg.pr.new/@tanstack/ai-react-ui@478

@tanstack/ai-solid

npm i https://pkg.pr.new/@tanstack/ai-solid@478

@tanstack/ai-solid-ui

npm i https://pkg.pr.new/@tanstack/ai-solid-ui@478

@tanstack/ai-svelte

npm i https://pkg.pr.new/@tanstack/ai-svelte@478

@tanstack/ai-vue

npm i https://pkg.pr.new/@tanstack/ai-vue@478

@tanstack/ai-vue-ui

npm i https://pkg.pr.new/@tanstack/ai-vue-ui@478

@tanstack/preact-ai-devtools

npm i https://pkg.pr.new/@tanstack/preact-ai-devtools@478

@tanstack/react-ai-devtools

npm i https://pkg.pr.new/@tanstack/react-ai-devtools@478

@tanstack/solid-ai-devtools

npm i https://pkg.pr.new/@tanstack/solid-ai-devtools@478

commit: 43e59bb


@coderabbitai coderabbitai Bot left a comment


🧹 Nitpick comments (3)
packages/typescript/ai-client/src/connection-adapters.ts (1)

495-497: Optional: honor abortSignal while replaying chunks.

Since the whole payload is already in memory, the loop ignores abortSignal during replay. If the consumer aborts late (e.g., user navigates away before chunks are drained by the pipeline), chunks will continue to flow. Consider a cheap check to bail out early:

♻️ Suggested tweak
       for (const chunk of payload) {
+        if (abortSignal?.aborted) break
         yield chunk as StreamChunk
       }

Not a blocker given the buffered/non-streaming nature of this adapter.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/typescript/ai-client/src/connection-adapters.ts` around lines 495 -
497, The replay loop that yields chunks from the in-memory payload currently
ignores abortSignal and continues pushing chunks even after cancellation; inside
the loop that iterates over payload and yields each item (the block yielding
chunk as StreamChunk), check the provided abortSignal (e.g.,
abortSignal?.aborted) on each iteration and bail out immediately (return/stop
iteration) when aborted so the generator stops producing further StreamChunk
values.
packages/typescript/ai-client/package.json (1)

21-35: Dual exports look correct; consider exposing ./package.json.

The nested import/require conditions with type-aware types keys are the recommended Node resolution shape, and main.cjs pairs correctly with "type": "module" (Node uses the extension to disambiguate). publint strict passing is a good signal.

Optional: add "./package.json": "./package.json" to exports so tools that probe the manifest (some bundlers, version resolvers) don't get blocked by the closed export map. Same applies to packages/typescript/ai/package.json and packages/typescript/ai-event-client/package.json.

Proposed addition
   "exports": {
     ".": {
       "import": {
         "types": "./dist/esm/index.d.ts",
         "default": "./dist/esm/index.js"
       },
       "require": {
         "types": "./dist/cjs/index.d.cts",
         "default": "./dist/cjs/index.cjs"
       }
-    }
+    },
+    "./package.json": "./package.json"
   },
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/typescript/ai-client/package.json` around lines 21 - 35, Add an
explicit export entry for the package manifest so consumers and tooling can read
it: update the package.json "exports" object to include the key "./package.json"
mapping to "./package.json" (mirror this change in
packages/typescript/ai/package.json and
packages/typescript/ai-event-client/package.json as well); locate the "exports"
block that currently defines "." with "import"/"require" and add the
"./package.json": "./package.json" mapping alongside those entries.
packages/typescript/ai/tests/stream-to-response.test.ts (1)

875-945: LGTM — good coverage for toJSONResponse.

Tests cover the four meaningful branches (defaults, custom init/headers, explicit Content-Type passthrough, and abort-on-upstream-error with rethrow). Nice use of toHaveBeenCalledOnce() to assert abort happens exactly once.

One optional addition worth considering: a test that asserts the controller is not aborted when the stream drains successfully, to lock in that behavior against regressions.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/typescript/ai/tests/stream-to-response.test.ts` around lines 875 -
945, Add a test to ensure the provided AbortController is NOT aborted when the
stream drains successfully: create an AbortController, spy on its abort method
(vi.spyOn(abortController, 'abort')), call toJSONResponse with
createMockStream([...successful chunks...]) and the abortController in options,
await the response.json() (or response completion), then assert abortSpy was not
called (toHaveBeenCalledTimes(0) / not.toHaveBeenCalled()). Reference
toJSONResponse, createMockStream, and AbortController/abort in the test so
behavior is covered alongside the existing abort-on-error test.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: d831b837-151e-456f-8d68-13b77d844f5a

📥 Commits

Reviewing files that changed from the base of the PR and between 1d6f3be and 1bbc932.

📒 Files selected for processing (12)
  • .changeset/cjs-output-and-json-response.md
  • packages/typescript/ai-client/package.json
  • packages/typescript/ai-client/src/connection-adapters.ts
  • packages/typescript/ai-client/src/index.ts
  • packages/typescript/ai-client/vite.config.ts
  • packages/typescript/ai-event-client/package.json
  • packages/typescript/ai-event-client/vite.config.ts
  • packages/typescript/ai/package.json
  • packages/typescript/ai/src/index.ts
  • packages/typescript/ai/src/stream-to-response.ts
  • packages/typescript/ai/tests/stream-to-response.test.ts
  • packages/typescript/ai/vite.config.ts

…rences

Serves three personas: Expo/RN builders hitting streaming-response crashes,
builders on other non-streaming runtimes (edge proxies, legacy serverless),
and evaluators checking whether TanStack AI supports RN/Expo.

- New journey page at docs/chat/non-streaming-runtimes.md titled 'React
  Native & Expo'. A → B: Expo API route crashing on streaming response →
  working chat via toJSONResponse + fetchJSON.
- Cross-linked from chat/streaming.md (callout near
  toServerSentEventsResponse) and chat/connection-adapters.md (new
  'JSON Array (non-streaming runtimes)' subsection).
- Added the new entries to the API references: toJSONResponse in
  docs/api/ai.md and fetchJSON in docs/api/ai-client.md, each pointing
  back to the walkthrough.
- Registered the new page in docs/config.json under 'Chat & Streaming',
  sequenced right after Connection Adapters.

@coderabbitai coderabbitai Bot left a comment


🧹 Nitpick comments (1)
docs/api/ai-client.md (1)

145-184: Consider varying sentence structure to improve readability.

Three connection adapter sections in succession begin with "Creates," making the documentation slightly repetitive. Consider varying the opening phrase for better flow.

✍️ Suggested rewording
 ### `fetchServerSentEvents(url, options?)`
 
-Creates an SSE connection adapter.
+Establishes an SSE connection adapter for server-sent events streaming.

or

 ### `fetchJSON(url, options?)`
 
-Creates a connection adapter for non-streaming runtimes — pair with [`toJSONResponse`](./ai#tojsonresponsestream-init) on the server.
+Provides a connection adapter for non-streaming runtimes — pair with [`toJSONResponse`](./ai#tojsonresponsestream-init) on the server.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docs/api/ai-client.md` around lines 145 - 184, The three adapter descriptions
(fetchServerSentEvents, fetchHttpStream, fetchJSON) all start with the same verb
"Creates," making the copy repetitive; update the lead sentence for one or two
of these functions to vary phrasing (e.g., "Opens an SSE connection adapter
for...", "Provides an HTTP stream adapter that...", or "Returns a JSON-based
adapter for non-streaming runtimes...") while keeping the technical details
intact (include options example and the note about POSTing { messages, data }
for fetchJSON and the trade-off about no incremental rendering), and ensure the
function names fetchServerSentEvents, fetchHttpStream, and fetchJSON remain
present so readers can locate the API.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: c65a6675-dcd6-4072-9b42-d6a27f6233ca

📥 Commits

Reviewing files that changed from the base of the PR and between 1bbc932 and ee3f393.

📒 Files selected for processing (6)
  • docs/api/ai-client.md
  • docs/api/ai.md
  • docs/chat/connection-adapters.md
  • docs/chat/non-streaming-runtimes.md
  • docs/chat/streaming.md
  • docs/config.json
✅ Files skipped from review due to trivial changes (4)
  • docs/config.json
  • docs/chat/non-streaming-runtimes.md
  • docs/chat/streaming.md
  • docs/chat/connection-adapters.md

AlemTuzlak and others added 4 commits April 23, 2026 14:02
…on-response

# Conflicts:
#	packages/typescript/ai/package.json
…JSON

Address CR findings:

- toJSONResponse now checks `abortController.signal.aborted` on entry
  (throws the signal's reason without draining the upstream) and inside
  the drain loop (breaks early if aborted mid-stream), matching the
  semantics of toServerSentEventsStream and toHttpStream. Previously the
  signal was only consulted from the error-path catch handler, so a
  pre-aborted controller drained the full stream anyway and a mid-drain
  abort was silently ignored.
- Add two new tests covering pre-abort (infinite stream never pulled)
  and mid-drain abort (bounded pulls after abort fires).
- Add 8 fetchJSON tests covering happy path, non-2xx, non-array body
  with descriptive error, url-as-function, options-as-async-function,
  options.body merging, custom fetchClient override, and AbortSignal
  propagation — the adapter previously had zero direct test coverage.
