anthropic_client¶
Anthropic ModelClient integration.
Functions
- When we only need the content of the first message.
- Handle the streaming response.
Classes
- A component wrapper for the Anthropic API client.
- LLM Response
- class AnthropicAPIClient(api_key: str | None = None, non_streaming_chat_completion_parser: Callable[[Completion], Any] = None, streaming_chat_completion_parser: Callable[[Completion], Any] = None)[source]¶
Bases:
ModelClient
A component wrapper for the Anthropic API client.
Visit https://docs.anthropic.com/en/docs/intro-to-claude for more API details.
Note:
Because the Anthropic API requires users to set max_tokens, we use a default value of 512 for max_tokens. You can override this value by passing max_tokens in the model_kwargs.
References:
- Models: https://docs.anthropic.com/en/docs/about-claude/models
- Interleaved thinking: https://docs.anthropic.com/en/docs/build-with-claude/extended-thinking#interleaved-thinking
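The max_tokens defaulting described in the note can be sketched as below. `DEFAULT_MAX_TOKENS` and `build_api_kwargs` are hypothetical names for illustration, not part of the client's public API:

```python
# Sketch of the max_tokens defaulting behavior described above.
# build_api_kwargs is a hypothetical helper, not an AnthropicAPIClient method.
DEFAULT_MAX_TOKENS = 512

def build_api_kwargs(model_kwargs: dict) -> dict:
    """Apply the default max_tokens unless the caller overrides it."""
    kwargs = {"max_tokens": DEFAULT_MAX_TOKENS}
    kwargs.update(model_kwargs)  # caller-supplied values take precedence
    return kwargs
```

Passing `{"max_tokens": 1024}` in model_kwargs therefore replaces the 512 default.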
- parse_chat_completion(completion: Completion | Generator[MessageStreamManager, None, None]) GeneratorOutput [source]¶
Parse the completion and store it in raw_response.
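For the non-streaming case, parsing roughly amounts to pulling the text out of the completion's content blocks before wrapping it in a GeneratorOutput. The sketch below uses simplified stand-ins for the SDK's message types, not the real `anthropic.types` classes:

```python
from types import SimpleNamespace

def extract_completion_text(completion) -> str:
    """Concatenate the text of all text-type content blocks in a completion.

    `completion` is assumed to have a `content` list of blocks, each with
    `type` and `text` attributes, mirroring the shape of an Anthropic message.
    """
    return "".join(
        block.text
        for block in completion.content
        if getattr(block, "type", "") == "text"
    )

# Simplified stand-in for a non-streaming Anthropic completion.
completion = SimpleNamespace(
    content=[SimpleNamespace(type="text", text="Hello from Claude")]
)
```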
- convert_inputs_to_api_kwargs(input: Any | None = None, model_kwargs: Dict = {}, model_type: ModelType = ModelType.UNDEFINED) dict [source]¶
The Anthropic messages API separates the system message from the user messages.
Since we focus on a single prompt, the input is passed as the user message.
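Given that separation, the conversion can be sketched as follows; `to_api_kwargs` is a hypothetical stand-in for convert_inputs_to_api_kwargs, reduced to the message-building step:

```python
def to_api_kwargs(input: str, model_kwargs: dict) -> dict:
    """Place the single prompt in a user message.

    Anthropic's API takes the system prompt as a top-level `system` field
    (already present in model_kwargs if set), not as a message role, so the
    input itself always becomes the user message.
    """
    api_kwargs = dict(model_kwargs)  # copy so the caller's dict is untouched
    api_kwargs["messages"] = [{"role": "user", "content": input}]
    return api_kwargs

kwargs = to_api_kwargs(
    "Hi Claude",
    {"model": "claude-3-haiku-20240307", "max_tokens": 512},
)
```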