fix(provider): don't inject stream_options for user-configured providers #19000
Open
gypelayo wants to merge 3 commits into anomalyco:dev from
Conversation
…ompatible providers
Local Ollama and similar local LLMs don't support stream_options: { include_usage: true }.
Previously this was unconditionally injected for all @ai-sdk/openai-compatible providers,
including user-configured ones (source: 'config'), causing requests to local Ollama to fail.
Only inject includeUsage for providers loaded from models.dev (source: env/api/custom),
where we know the cloud endpoint supports it. User-configured providers can still opt in
by explicitly setting options.includeUsage: true in their opencode.json.
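For reference, a minimal sketch of what that explicit opt-in could look like in opencode.json, assuming opencode's provider config schema; the ollama provider name and baseURL here are placeholders:

```json
{
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://localhost:11434/v1",
        "includeUsage": true
      }
    }
  }
}
```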
Contributor
Thanks for updating your PR! It now meets our contributing guidelines. 👍
Some local models (e.g. qwen2.5-coder) return tool calls as plain JSON text content instead of structured tool_calls, causing opencode to emit a text-delta instead of a tool-call. Add a wrapStream middleware for source=config providers that detects this pattern and converts it into a proper tool-call stream part.
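A minimal sketch of what such a middleware could look like, assuming the AI SDK v4 middleware API (LanguageModelV1Middleware / wrapStream). rawJsonToolCallMiddleware and the knownTools parameter are made-up names, and buffering all text until finish is a simplification that a real implementation would avoid:

```ts
import type { LanguageModelV1Middleware } from "ai"
import type { LanguageModelV1StreamPart } from "@ai-sdk/provider"

// Try to interpret accumulated text as a raw JSON tool call,
// e.g. {"name": "read_file", "arguments": {"path": "..."}}.
function tryParseToolCall(
  text: string,
  knownTools: Set<string>,
): LanguageModelV1StreamPart | undefined {
  try {
    const parsed = JSON.parse(text.trim())
    if (typeof parsed?.name !== "string" || !knownTools.has(parsed.name)) return undefined
    return {
      type: "tool-call",
      toolCallType: "function",
      toolCallId: crypto.randomUUID(), // the model gave us no id, so synthesize one
      toolName: parsed.name,
      args: JSON.stringify(parsed.arguments ?? {}),
    }
  } catch {
    return undefined // not JSON at all; treat as ordinary text
  }
}

// knownTools is an assumed parameter; the real PR would derive the set of
// valid tool names from the tools attached to the request.
export function rawJsonToolCallMiddleware(knownTools: Set<string>): LanguageModelV1Middleware {
  return {
    wrapStream: async ({ doStream }) => {
      const { stream, ...rest } = await doStream()
      let buffer = ""
      let sawToolCall = false
      const transform = new TransformStream<LanguageModelV1StreamPart, LanguageModelV1StreamPart>({
        transform(part, controller) {
          if (part.type === "text-delta") {
            buffer += part.textDelta
            return // hold text back until we know it isn't a tool call
          }
          if (part.type === "tool-call") sawToolCall = true
          if (part.type === "finish") {
            // Only rewrite a plain stop with no structured tool calls.
            const toolCall =
              part.finishReason === "stop" && !sawToolCall
                ? tryParseToolCall(buffer, knownTools)
                : undefined
            if (toolCall) {
              controller.enqueue(toolCall)
              controller.enqueue({ ...part, finishReason: "tool-calls" })
            } else {
              if (buffer) controller.enqueue({ type: "text-delta", textDelta: buffer })
              controller.enqueue(part)
            }
            buffer = ""
            return
          }
          controller.enqueue(part)
        },
      })
      return { stream: stream.pipeThrough(transform), ...rest }
    },
  }
}
```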
Issue for this PR
Closes #10402
Type of change
Bug fix
What does this PR do?
Two related bugs affect user-configured @ai-sdk/openai-compatible providers (local Ollama, LM Studio, etc.):

1. stream_options injection causes connection failure

opencode unconditionally injects stream_options: { include_usage: true } into every streaming request for openai-compatible providers. Ollama doesn't support this field and rejects the request. The fix skips the injection when the provider source is "config" (user-defined in opencode.json).

2. Raw JSON tool calls are not executed

Some models (e.g. qwen2.5-coder) return tool calls as plain JSON text content instead of structured tool_calls, so opencode emits a text response instead of executing the tool. The fix adds a wrapStream middleware for source: "config" providers that detects this pattern — a finish_reason: stop with no tool-call parts and content that parses as {"name": "...", "arguments": {...}} matching a known tool — and converts it into a proper tool-call. An illustrative example of the pattern is shown below.

Both fixes are scoped to source: "config" providers only, so cloud providers are unaffected.
How did you verify your code works?
- Verified stream_options is absent from the request body for a local provider. Confirmed failing before fix, passing after.
- Verified that a raw JSON tool call from a config provider is emitted as a tool-call part. Confirmed failing before fix, passing after.

Screenshots / recordings
No UI changes.
Checklist