The foundation package containing all core abstractions, types, and built-in OpenAI/Azure OpenAI support.
```
agent_framework/
├── __init__.py     # Public API exports
├── _agents.py      # Agent implementations
├── _clients.py     # Chat client base classes and protocols
├── _types.py       # Core types (Message, ChatResponse, Content, etc.)
├── _tools.py       # Tool definitions and function invocation
├── _middleware.py  # Middleware system for request/response interception
├── _sessions.py    # AgentSession and context provider abstractions
├── _mcp.py         # Model Context Protocol support
├── _workflows/     # Workflow orchestration (sequential, concurrent, handoff, etc.)
├── openai/         # Built-in OpenAI client
├── azure/          # Lazy-loading entry point for Azure integrations
└── <provider>/     # Other lazy-loading provider folders
```
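The provider folders (`azure/` and friends) are described as lazy-loading entry points. A common way to implement that pattern in Python - shown here as a general-mechanism sketch, not necessarily this package's actual code - is a module-level `__getattr__` (PEP 562) that defers the real import until an exported name is first touched:

```python
import importlib
import types

def make_lazy_module(name, exports):
    """Build a module whose exported names import their targets lazily.

    `exports` maps an attribute name to (module path, attribute in that module).
    """
    mod = types.ModuleType(name)

    def __getattr__(attr):  # PEP 562: called only when attr is not yet set
        if attr in exports:
            module_path, target = exports[attr]
            value = getattr(importlib.import_module(module_path), target)
            setattr(mod, attr, value)  # cache: later lookups skip __getattr__
            return value
        raise AttributeError(attr)

    mod.__getattr__ = __getattr__
    return mod

# The underlying import happens only when the attribute is first accessed.
providers = make_lazy_module("demo_providers", {"sqrt": ("math", "sqrt")})
```

In a real package the same effect comes from defining `__getattr__` at the top of the provider package's `__init__.py`, so importing `agent_framework.azure` stays cheap until a client class is actually used.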
Agents (`_agents.py`):

- `SupportsAgentRun` - Protocol defining the agent interface
- `BaseAgent` - Abstract base class for agents
- `Agent` - Main agent class wrapping a chat client with tools, instructions, and middleware
Chat clients (`_clients.py`):

- `SupportsChatGetResponse` - Protocol for chat client implementations
- `BaseChatClient` - Abstract base class with middleware support; subclasses implement `_inner_get_response()` and `_inner_get_streaming_response()`
Core types (`_types.py`):

- `Message` - Represents a chat message with role, content, and metadata
- `ChatResponse` - Response from a chat client containing messages and usage
- `ChatResponseUpdate` - Streaming response update
- `AgentResponse` / `AgentResponseUpdate` - Agent-level response wrappers
- `Content` - Base class for message content (text, function calls, images, etc.)
- `ChatOptions` - TypedDict for chat request options
Tools (`_tools.py`):

- `ToolProtocol` - Protocol for tool definitions
- `FunctionTool` - Wraps Python functions as tools with JSON schema generation
- `@tool` decorator - Converts functions to tools
- `use_function_invocation()` - Decorator to add automatic function calling to chat clients
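The schema-generation step that `FunctionTool` performs can be illustrated with a small standalone sketch: derive a JSON-schema-style description from a function's signature and type hints. The helper name `function_schema` and the type mapping are illustrative, not the framework's actual API:

```python
import inspect

# Minimal mapping from Python annotations to JSON schema type names.
_JSON_TYPES = {int: "integer", float: "number", str: "string", bool: "boolean"}

def function_schema(fn):
    """Build a minimal JSON-schema dict from a function's signature."""
    sig = inspect.signature(fn)
    props, required = {}, []
    for name, param in sig.parameters.items():
        props[name] = {"type": _JSON_TYPES.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # no default value => caller must supply it
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": {"type": "object", "properties": props, "required": required},
    }

def get_weather(city: str, units: str = "metric") -> str:
    """Return the current weather for a city."""
    return f"Sunny in {city}"

schema = function_schema(get_weather)
```

A schema like this is what gets sent to the model so it knows the tool's name, purpose, and argument shape; the real `@tool` decorator presumably also handles nested types and validation.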
Middleware (`_middleware.py`):

- `AgentMiddleware` - Intercepts agent `run()` calls
- `ChatMiddleware` - Intercepts chat client `get_response()` calls
- `FunctionMiddleware` - Intercepts function/tool invocations
- `AgentContext` / `ChatContext` / `FunctionInvocationContext` - Context objects passed through middleware
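The `call_next` pattern these middleware classes use can be demystified with a self-contained "onion" sketch: each middleware may run code before and after delegating to the rest of the chain. The wiring below is a standalone stdlib demo, not the framework's internal dispatcher:

```python
import asyncio

class Context:
    """Stand-in for AgentContext: carries input messages and the final result."""
    def __init__(self, messages):
        self.messages = messages
        self.result = None

async def run_pipeline(middlewares, context, handler):
    """Invoke middlewares in order; each may act before/after call_next()."""
    async def dispatch(index):
        if index < len(middlewares):
            # call_next resumes the chain at the next middleware (or handler)
            await middlewares[index](context, lambda: dispatch(index + 1))
        else:
            context.result = await handler(context)
    await dispatch(0)

async def logging_middleware(context, call_next):
    context.messages.append("before")   # runs on the way in
    await call_next()
    context.messages.append("after")    # runs on the way out

async def handler(context):
    return "Hi!"  # stand-in for the actual model call

ctx = Context([])
asyncio.run(run_pipeline([logging_middleware], ctx, handler))
```

The same shape covers all three middleware kinds: only the context type and the innermost handler differ.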
Sessions (`_sessions.py`):

- `AgentSession` - Manages conversation state and session metadata
- `SessionContext` - Context object for session-scoped data during agent runs
- `BaseContextProvider` - Base class for context providers (RAG, memory systems)
- `BaseHistoryProvider` - Base class for conversation history storage
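To make the history-provider idea concrete, here is a standalone sketch of an in-memory store keyed by session ID. The method names (`append`, `get`) are assumptions for illustration; check `BaseHistoryProvider` for the real interface to override:

```python
from collections import defaultdict

class InMemoryHistoryProvider:
    """Keeps per-session message lists in a dict; swap for a DB in production."""

    def __init__(self):
        self._store = defaultdict(list)

    def append(self, session_id, message):
        """Record one message under the given session."""
        self._store[session_id].append(message)

    def get(self, session_id):
        """Return a copy of the session's messages (empty list if unknown)."""
        return list(self._store[session_id])

history = InMemoryHistoryProvider()
history.append("s1", {"role": "user", "text": "Hello"})
history.append("s1", {"role": "assistant", "text": "Hi!"})
```

Whatever the exact base-class contract, the key property is the same: history is isolated per session, so concurrent conversations never see each other's messages.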
Workflows (`_workflows/`):

- `Workflow` - Graph-based workflow definition
- `WorkflowBuilder` - Fluent API for building workflows
- Orchestrators: `SequentialOrchestrator`, `ConcurrentOrchestrator`, `GroupChatOrchestrator`, `MagenticOrchestrator`, `HandoffOrchestrator`
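The simplest of these patterns, sequential orchestration, can be sketched in a few lines: each agent's output becomes the next agent's input. The callables below stand in for agents; the real `SequentialOrchestrator` API may differ:

```python
import asyncio

async def run_sequential(agents, task):
    """Pipe the task through each agent in order, returning the final output."""
    result = task
    for agent in agents:
        result = await agent(result)  # output of one step feeds the next
    return result

# Toy "agents": any async callable taking and returning text.
async def upper_agent(text):
    return text.upper()

async def exclaim_agent(text):
    return text + "!"

final = asyncio.run(run_sequential([upper_agent, exclaim_agent], "hello"))
```

Concurrent orchestration would instead fan the same task out with `asyncio.gather` and aggregate the results; the group-chat, Magentic, and handoff orchestrators add a manager or routing policy on top of these two primitives.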
OpenAI (`openai/`):

- `OpenAIChatClient` - Chat client for the OpenAI API
- `OpenAIResponsesClient` - Client for the OpenAI Responses API
Azure (`azure/`):

- `AzureOpenAIChatClient` - Chat client for Azure OpenAI
- `AzureOpenAIResponsesClient` - Client for the Azure OpenAI Responses API
Creating an agent:

```python
from agent_framework import Agent
from agent_framework.openai import OpenAIChatClient

agent = Agent(
    client=OpenAIChatClient(),
    instructions="You are helpful.",
    tools=[my_function],
)
response = await agent.run("Hello")
```

An agent can also be created directly from a chat client:

```python
agent = OpenAIChatClient().as_agent(
    name="Assistant",
    instructions="You are helpful.",
)
```

Adding middleware:

```python
from agent_framework import Agent, AgentMiddleware, AgentContext

class LoggingMiddleware(AgentMiddleware):
    async def process(self, context: AgentContext, call_next) -> None:
        print(f"Input: {context.messages}")
        await call_next()
        print(f"Output: {context.result}")

agent = Agent(..., middleware=[LoggingMiddleware()])
```

Implementing a custom chat client:

```python
from agent_framework import BaseChatClient, ChatResponse, ChatResponseUpdate, Message

class MyClient(BaseChatClient):
    async def _inner_get_response(self, *, messages, options, **kwargs) -> ChatResponse:
        # Call your LLM here
        return ChatResponse(messages=[Message(role="assistant", text="Hi!")])

    async def _inner_get_streaming_response(self, *, messages, options, **kwargs):
        yield ChatResponseUpdate(...)
```