
fix: normalize leading question marks in exposed port queries#3424

Merged
seratch merged 1 commit into openai:main from matthewflint:mflint/sandbox-normalize-port-query on May 16, 2026
Conversation

@matthewflint (Contributor)

Summary

  • Normalize ExposedPortEndpoint.query when it already starts with ?.
  • Preserve existing behavior for plain query strings and empty query strings.
  • Add a regression test for providers that pass a URL query component with the leading delimiter included.

Why

url_for() currently always formats the query as ?{self.query}. That works when query="token=...", but produces a malformed ??token=... when a provider passes the query in URL-component form (query="?token=..."). Normalizing the value in one place makes provider implementations less fragile.
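The normalization described above can be sketched as follows. This is a minimal illustration, not the PR's actual diff: the field names, constructor signature, and URL layout of `ExposedPortEndpoint` are assumptions; only the class name, `query` attribute, and `url_for()` method are taken from the PR description.

```python
from dataclasses import dataclass


@dataclass
class ExposedPortEndpoint:
    """Hypothetical sketch of an endpoint for an exposed sandbox port."""

    host: str
    port: int
    query: str = ""

    def url_for(self, scheme: str = "https") -> str:
        # Normalize: strip a single leading "?" so providers may pass the
        # query either as "token=..." or as "?token=..." without producing
        # a double delimiter in the generated URL.
        query = self.query[1:] if self.query.startswith("?") else self.query
        base = f"{scheme}://{self.host}:{self.port}"
        return f"{base}?{query}" if query else base


# Both forms yield the same URL; an empty query adds no "?" at all.
print(ExposedPortEndpoint("sandbox.example", 8080, "token=abc").url_for())
print(ExposedPortEndpoint("sandbox.example", 8080, "?token=abc").url_for())
print(ExposedPortEndpoint("sandbox.example", 8080, "").url_for())
```

Doing the strip inside `url_for()` rather than in each provider keeps the fix in one place, which is the point of the change.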

Testing

  • uv run pytest tests/sandbox/test_exposed_ports.py
  • uv run ruff check src/agents/sandbox tests/sandbox
  • uv run pyright

Commit message: Normalize exposed-port query values that already include a leading question mark, so generated HTTP and WebSocket URLs do not contain a double question mark.
@matthewflint force-pushed the mflint/sandbox-normalize-port-query branch from e2ab22e to 2d77938 on May 15, 2026 at 11:48
@seratch added this to the 0.17.x milestone on May 16, 2026
@seratch merged commit e37b3d2 into openai:main on May 16, 2026
9 checks passed
