This folder contains examples for direct chat client usage patterns.
| File | Description |
|---|---|
| `built_in_chat_clients.py` | Consolidated sample for built-in chat clients. Uses `get_client()` to create the selected client and passes it to `main()`. |
| `chat_response_cancellation.py` | Demonstrates how to cancel chat responses during streaming, showing proper cancellation handling and cleanup. |
| `custom_chat_client.py` | Demonstrates how to create custom chat clients by extending the `BaseChatClient` class. Shows an `EchoingChatClient` implementation and how to integrate it with `Agent` using the `as_agent()` method. |
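The cancellation pattern in `chat_response_cancellation.py` comes down to cancelling the task that consumes the stream and cleaning up before the cancellation propagates. Here is a minimal, library-free sketch of that idea; `fake_stream`, `run_and_cancel`, and the `<cancelled>` marker are illustrative stand-ins, not names from the sample:

```python
import asyncio


async def fake_stream():
    """Stand-in for a streaming chat response: yields tokens with a delay."""
    for token in ["Hello", ", ", "world", "!"]:
        await asyncio.sleep(0.05)
        yield token


async def run_and_cancel() -> list[str]:
    collected: list[str] = []

    async def consume() -> None:
        try:
            async for token in fake_stream():
                collected.append(token)
        except asyncio.CancelledError:
            # Cleanup runs here, then the cancellation is re-raised.
            collected.append("<cancelled>")
            raise

    task = asyncio.create_task(consume())
    await asyncio.sleep(0.12)  # let a couple of tokens arrive
    task.cancel()              # cancel mid-stream
    try:
        await task
    except asyncio.CancelledError:
        pass
    return collected


result = asyncio.run(run_and_cancel())
print(result)
```

The `finally`-style cleanup lives in the `except asyncio.CancelledError` branch; re-raising keeps the task's cancelled state visible to the caller.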
`built_in_chat_clients.py` starts with:

```python
asyncio.run(main("openai_chat"))
```

Change the argument to pick a client:

- `openai_chat`
- `openai_responses`
- `openai_assistants`
- `anthropic`
- `ollama`
- `bedrock`
- `azure_openai_chat`
- `azure_openai_responses`
- `azure_openai_responses_foundry`
- `azure_openai_assistants`
- `azure_ai_agent`
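Mapping a client name to a constructed client, as `get_client()` does, is a straightforward name-to-factory dispatch. A simplified sketch of the pattern follows; the factory functions and their return values are placeholders, not the real SDK client constructors:

```python
from typing import Callable


# Placeholder factories; the real sample constructs SDK clients here.
def make_openai_chat() -> str:
    return "OpenAIChatClient"


def make_anthropic() -> str:
    return "AnthropicChatClient"


CLIENTS: dict[str, Callable[[], str]] = {
    "openai_chat": make_openai_chat,
    "anthropic": make_anthropic,
    # ...one entry per supported client name
}


def get_client(name: str) -> str:
    try:
        return CLIENTS[name]()
    except KeyError:
        raise ValueError(
            f"Unknown client {name!r}; choose one of {sorted(CLIENTS)}"
        ) from None


print(get_client("openai_chat"))
```

A dict of factories keeps construction lazy, so only the selected client (and its credentials) is ever touched.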
Example:

```shell
uv run samples/02-agents/chat_client/built_in_chat_clients.py
```

Depending on the selected client, set the appropriate environment variables:
For Azure clients:

- `AZURE_OPENAI_ENDPOINT`: Your Azure OpenAI endpoint
- `AZURE_OPENAI_CHAT_DEPLOYMENT_NAME`: The name of your Azure OpenAI chat deployment
- `AZURE_OPENAI_RESPONSES_DEPLOYMENT_NAME`: The name of your Azure OpenAI responses deployment

For Azure OpenAI Foundry responses client (`azure_openai_responses_foundry`):

- `AZURE_AI_PROJECT_ENDPOINT`: Your Azure AI project endpoint
- `AZURE_OPENAI_RESPONSES_DEPLOYMENT_NAME`: The name of your Azure OpenAI responses deployment

For Azure AI agent client (`azure_ai_agent`):

- `AZURE_AI_PROJECT_ENDPOINT`: Your Azure AI project endpoint
- `AZURE_AI_MODEL_DEPLOYMENT_NAME`: The name of your model deployment (used by `azure_ai_agent`)

For OpenAI clients:

- `OPENAI_API_KEY`: Your OpenAI API key
- `OPENAI_CHAT_MODEL_ID`: The OpenAI model for `openai_chat` and `openai_assistants`
- `OPENAI_RESPONSES_MODEL_ID`: The OpenAI model for `openai_responses`

For Anthropic client (`anthropic`):

- `ANTHROPIC_API_KEY`: Your Anthropic API key
- `ANTHROPIC_CHAT_MODEL_ID`: The Anthropic model ID (for example, `claude-sonnet-4-5`)

For Ollama client (`ollama`):

- `OLLAMA_HOST`: Ollama server URL (defaults to `http://localhost:11434` if unset)
- `OLLAMA_MODEL_ID`: Ollama model name (for example, `mistral`, `qwen2.5:8b`)

For Bedrock client (`bedrock`):

- `BEDROCK_CHAT_MODEL_ID`: Bedrock model ID (for example, `anthropic.claude-3-5-sonnet-20240620-v1:0`)
- `BEDROCK_REGION`: AWS region (defaults to `us-east-1` if unset)
- AWS credentials via standard environment variables (for example, `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`)
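Optional settings with documented defaults (such as `OLLAMA_HOST` and `BEDROCK_REGION`) can be read with `os.environ.get`, while required settings should fail fast with a clear message. A small sketch, assuming nothing about how the samples actually read their configuration (`require_env` is a hypothetical helper):

```python
import os

# Optional settings fall back to the documented defaults.
ollama_host = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
bedrock_region = os.environ.get("BEDROCK_REGION", "us-east-1")


def require_env(name: str) -> str:
    """Return a required environment variable or raise a clear error."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Set the {name} environment variable")
    return value


print(ollama_host, bedrock_region)
```

Failing fast on a missing `OPENAI_API_KEY` or `AZURE_OPENAI_ENDPOINT` gives a clearer error than a failed request deep inside the client.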