anthropic_client¶
Anthropic ModelClient integration with OpenAI SDK compatibility.
Functions

- Convert ChatCompletion.usage into our AdalFlowResponseUsage format.

Classes

- AnthropicAPIClient – A component wrapper for Anthropic API using OpenAI SDK compatibility.
- class AnthropicAPIClient(api_key: str | None = None, base_url: str = 'https://api.anthropic.com/v1/', non_streaming_chat_completion_parser: Callable[[ChatCompletion], Any] | None = None, streaming_chat_completion_parser: Callable[[AsyncStream], Any] | None = None)[source]¶
Bases:
ModelClient
A component wrapper for Anthropic API using OpenAI SDK compatibility.
This client leverages Anthropic’s OpenAI SDK compatibility layer to provide ChatCompletion-based responses while maintaining AdalFlow’s GeneratorOutput structure.
Features:
- Uses OpenAI SDK with Anthropic’s compatibility endpoint
- Supports both streaming and non-streaming calls
- Handles ModelType.LLM and ModelType.LLM_REASONING
- Converts ChatCompletion responses to Response API format for compatibility
- Maintains backward compatibility with existing AdalFlow parsers
- Parameters:
api_key (Optional[str]) – Anthropic API key. Defaults to ANTHROPIC_API_KEY env var.
base_url (str) – Anthropic’s OpenAI compatibility endpoint.
non_streaming_chat_completion_parser (Callable) – Legacy parser for non-streaming ChatCompletion objects. Used for backward compatibility with existing code that depends on the original AnthropicAPI client’s parsing behavior.
streaming_chat_completion_parser (Callable) – Parser for streaming ChatCompletion responses. Handles conversion from ChatCompletion streams to Response API format.
Note
Requires ANTHROPIC_API_KEY environment variable or api_key parameter. Uses OpenAI SDK internally but calls Anthropic’s API via compatibility layer. The non_streaming_chat_completion_parser is provided for legacy compatibility only.
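The documented key-resolution behavior can be sketched in plain Python. This is illustrative only, not the client’s actual code; `resolve_api_key` and `DEFAULT_BASE_URL` are hypothetical names introduced here:

```python
import os

# The compatibility endpoint documented above (hypothetical constant name).
DEFAULT_BASE_URL = "https://api.anthropic.com/v1/"


def resolve_api_key(api_key=None):
    # Mirrors the documented behavior: an explicit api_key argument wins,
    # otherwise fall back to the ANTHROPIC_API_KEY environment variable.
    key = api_key or os.environ.get("ANTHROPIC_API_KEY")
    if not key:
        raise ValueError(
            "Provide api_key or set the ANTHROPIC_API_KEY environment variable."
        )
    return key
```

The real client performs an equivalent lookup at construction time and passes the key and base URL to the underlying OpenAI SDK client.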
- parse_chat_completion(completion: ChatCompletion | Stream[ChatCompletionChunk] | AsyncStream[ChatCompletionChunk]) GeneratorOutput [source]¶
Parse ChatCompletion and convert to GeneratorOutput with Response API compatibility.
This method uses ChatCompletionToResponseConverter to transform ChatCompletion objects into text format compatible with existing Response API parsers.
- Parameters:
completion – ChatCompletion object or stream from OpenAI SDK
- Returns:
GeneratorOutput with converted raw_response text
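A simplified sketch of the non-streaming parsing step, operating on a plain dict shaped like a ChatCompletion payload (the real method works on OpenAI SDK objects and wraps the extracted text in a GeneratorOutput):

```python
def extract_text(completion: dict) -> str:
    # ChatCompletion payloads carry the generated text under
    # choices[0].message.content; the real parser converts this into
    # a GeneratorOutput whose raw_response holds the text.
    return completion["choices"][0]["message"]["content"]


completion = {
    "choices": [{"message": {"role": "assistant", "content": "Hello from Claude"}}],
}
```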
- track_completion_usage(completion: ChatCompletion) CompletionUsage [source]¶
Track completion usage from ChatCompletion object.
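The usage extraction can be sketched against a dict-shaped payload; `CompletionUsageSketch` is an illustrative stand-in for the actual CompletionUsage type:

```python
from dataclasses import dataclass


@dataclass
class CompletionUsageSketch:
    # Stand-in for the CompletionUsage structure returned by the method.
    completion_tokens: int
    prompt_tokens: int
    total_tokens: int


def track_usage(completion: dict) -> CompletionUsageSketch:
    # ChatCompletion payloads report token counts under a "usage" key.
    u = completion["usage"]
    return CompletionUsageSketch(
        completion_tokens=u["completion_tokens"],
        prompt_tokens=u["prompt_tokens"],
        total_tokens=u["total_tokens"],
    )
```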
- convert_inputs_to_api_kwargs(input: Any | None = None, model_kwargs: Dict = {}, model_type: ModelType = ModelType.UNDEFINED) Dict [source]¶
Convert AdalFlow inputs to OpenAI ChatCompletion API format.
Converts single input text to OpenAI messages format expected by chat.completions.create endpoint.
- Parameters:
input – Text input or messages array
model_kwargs – Additional model parameters
model_type – Type of model (LLM or LLM_REASONING)
- Returns:
API kwargs formatted for OpenAI chat.completions.create
- Return type:
Dict
The returned kwargs remain convertible to Anthropic’s original Messages API format.
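The conversion described above can be sketched as follows, assuming a plain-string prompt becomes a single user message while a list input is treated as an already-formed messages array (`to_api_kwargs` is an illustrative name, not the actual method):

```python
def to_api_kwargs(input=None, model_kwargs=None):
    # Copy model parameters (model, max_tokens, ...) so the caller's
    # dict is not mutated, then wrap a plain string prompt as a single
    # user message, the shape expected by chat.completions.create.
    api_kwargs = dict(model_kwargs or {})
    if isinstance(input, str):
        api_kwargs["messages"] = [{"role": "user", "content": input}]
    elif input is not None:
        # Assumed to already be a messages array; passed through as-is.
        api_kwargs["messages"] = input
    return api_kwargs
```

For example, `to_api_kwargs("Hi", {"model": "claude-3-5-sonnet-latest", "max_tokens": 64})` yields kwargs ready for the chat.completions.create endpoint.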
- call(api_kwargs: Dict = {}, model_type: ModelType = ModelType.UNDEFINED)[source]¶
Synchronous call to Anthropic via OpenAI SDK compatibility.
Supports both LLM and LLM_REASONING model types with streaming and non-streaming.
- Parameters:
api_kwargs – API parameters for chat.completions.create
model_type – ModelType.LLM or ModelType.LLM_REASONING
- Returns:
ChatCompletion or Stream[ChatCompletionChunk] from Anthropic
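The streaming/non-streaming split is driven by the `stream` flag inside api_kwargs rather than by separate code paths. A minimal sketch with a fake stand-in for `client.chat.completions.create`:

```python
def call(api_kwargs, create):
    # `create` stands in for client.chat.completions.create; the flag
    # inside api_kwargs determines whether a full ChatCompletion or a
    # stream of chunks comes back.
    return create(**api_kwargs)


def fake_create(**kwargs):
    # Fake endpoint for illustration: a chunk iterator when streaming,
    # a ChatCompletion-shaped dict otherwise.
    if kwargs.get("stream"):
        return iter([{"choices": [{"delta": {"content": "Hi"}}]}])
    return {"choices": [{"message": {"content": "Hi"}}]}
```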
- async acall(api_kwargs: Dict = {}, model_type: ModelType = ModelType.UNDEFINED)[source]¶
Asynchronous call to Anthropic via OpenAI SDK compatibility.
- Parameters:
api_kwargs – API parameters for chat.completions.create
model_type – ModelType.LLM or ModelType.LLM_REASONING
- Returns:
ChatCompletion or AsyncStream[ChatCompletionChunk] from Anthropic
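The async variant returns an AsyncStream when `stream=True`, which the caller consumes with `async for`. A stdlib-only sketch with a fake async endpoint (`fake_acreate` and `collect_stream` are illustrative names):

```python
import asyncio


async def acall(api_kwargs, acreate):
    # `acreate` stands in for the async client's chat.completions.create.
    return await acreate(**api_kwargs)


async def fake_acreate(**kwargs):
    # With stream=True, return an async iterator of chunk-shaped dicts,
    # mimicking AsyncStream[ChatCompletionChunk].
    if kwargs.get("stream"):
        async def chunks():
            for piece in ("Hel", "lo"):
                yield {"choices": [{"delta": {"content": piece}}]}
        return chunks()
    return {"choices": [{"message": {"content": "Hello"}}]}


async def collect_stream(api_kwargs):
    # Typical consumption pattern: accumulate delta content chunk by chunk.
    stream = await acall(api_kwargs, fake_acreate)
    text = ""
    async for chunk in stream:
        text += chunk["choices"][0]["delta"]["content"]
    return text
```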