From e58f9ae1af7a7dcb6056f4aefd3d339ae7a3c10f Mon Sep 17 00:00:00 2001
From: Lucky Verma
Date: Fri, 8 May 2026 09:20:28 +0000
Subject: [PATCH] chore(examples): bump module_client.py to gpt-5.5

The module-level client example still uses bare gpt-4. gpt-5.5 is the
current default text and reasoning model per the openai-cookbook ("use
gpt-5.5 for the strongest code review accuracy") and the
openai-agents-python docs (voice/quickstart.md, reasoning_content
example), and is documented as the latest default in openai/codex's
latest-model.md.

The model: parameter is typed Union[str, ChatModel], so this works even
though Stainless has not yet synced gpt-5.5 into the ChatModel literal
here. The example demonstrates module-level client configuration; the
model choice is incidental.
---
 examples/module_client.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/examples/module_client.py b/examples/module_client.py
index 5f2fb79dcf..ab33aa3b62 100755
--- a/examples/module_client.py
+++ b/examples/module_client.py
@@ -9,7 +9,7 @@
 # all API calls work in the exact same fashion as well

 stream = openai.chat.completions.create(
-    model="gpt-4",
+    model="gpt-5.5",
     messages=[
         {
             "role": "user",
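
The commit message's typing argument can be sketched in isolation. The snippet below is an illustrative, self-contained model of the claim, not the library's actual generated code: `ChatModel` here is a hypothetical two-member `Literal` standing in for the Stainless-generated one, and `create` is a stub standing in for `openai.chat.completions.create`. Because `str` is part of the union, a model name absent from the literal still type-checks and passes through at runtime:

```python
from typing import Literal, Union

# Hypothetical stand-in for the Stainless-generated ChatModel literal,
# which has not yet been regenerated to include the new model name.
ChatModel = Literal["gpt-4", "gpt-4o"]


def create(model: Union[str, ChatModel]) -> str:
    # The `str` arm of the union means any model string is accepted,
    # so callers are not blocked on a regenerated literal.
    return model


# A name outside the literal still passes both type checking and runtime.
print(create("gpt-5.5"))
```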