Integration with Ollama for local LLM inference.
Classes:
    OllamaChatClient - Chat client for Ollama models
    OllamaChatOptions - Options TypedDict for Ollama-specific parameters
    OllamaSettings - Pydantic settings for Ollama configuration
Example:
    from agent_framework.ollama import OllamaChatClient

    client = OllamaChatClient(model_id="llama3.2")
    response = await client.get_response("Hello")

The client can be imported from either namespace:
    from agent_framework.ollama import OllamaChatClient
    # or directly:
    from agent_framework_ollama import OllamaChatClient
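Because OllamaChatOptions is a TypedDict, Ollama-specific parameters can be passed as a plain dict while still getting static type checking. A minimal sketch of that pattern, using an illustrative stand-in class; the field names below assume common Ollama generation parameters (temperature, top_p, num_ctx) and are not necessarily the library's actual schema:

```python
from typing import TypedDict

# Illustrative stand-in for OllamaChatOptions. Field names assume
# common Ollama generation parameters, not the library's real definition.
class OllamaChatOptionsSketch(TypedDict, total=False):
    temperature: float  # sampling temperature
    top_p: float        # nucleus sampling cutoff
    num_ctx: int        # context window size in tokens

# total=False makes every key optional, so a partial options
# dict type-checks without supplying all fields.
opts: OllamaChatOptionsSketch = {"temperature": 0.2, "num_ctx": 4096}
```

The TypedDict itself adds no runtime behavior; it only gives type checkers a schema for what would otherwise be an untyped options dict.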