Install AdalFlow and Run your LM

ℹ️ Getting Started: Install AdalFlow and set up your LM
pip install -U adalflow

Set up `OPENAI_API_KEY` in your `.env` file, or pass the `api_key` to the client directly.
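
For example, a minimal `.env` file in the project root looks like this (the value is a placeholder; the same file can also hold the `GROQ_API_KEY` and `ANTHROPIC_API_KEY` used below):

# .env
OPENAI_API_KEY=your-openai-api-key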


import adalflow as adal

# setup env or pass the api_key to client
from adalflow.utils import setup_env

setup_env()

openai_llm = adal.Generator(
   model_client=adal.OpenAIClient(), model_kwargs={"model": "gpt-3.5-turbo"}
)
response = openai_llm(prompt_kwargs={"input_str": "What is LLM?"})
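
The call returns a structured output rather than a plain string. Assuming the standard `GeneratorOutput` container (check the API reference for your version), the generated text is usually on `.data`, with `.error` set if the call failed:

# .data holds the parsed model output; .error is set when the call fails
print(response.data)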

Set up `GROQ_API_KEY` in your `.env` file, or pass the `api_key` to the client directly.


import adalflow as adal

# setup env or pass the api_key to client
from adalflow.utils import setup_env

setup_env()

llama_llm = adal.Generator(
   model_client=adal.GroqAPIClient(), model_kwargs={"model": "llama3-8b-8192"}
)
response = llama_llm(prompt_kwargs={"input_str": "What is LLM?"})


             

Set up `ANTHROPIC_API_KEY` in your `.env` file, or pass the `api_key` to the client directly.


import adalflow as adal

# setup env or pass the api_key to client
from adalflow.utils import setup_env

setup_env()

anthropic_llm = adal.Generator(
   model_client=adal.AnthropicAPIClient(), model_kwargs={"model": "claude-3-opus-20240229"}
)
response = anthropic_llm(prompt_kwargs={"input_str": "What is LLM?"})

             

Ollama is one way to run an open-source model locally; you can also use `vllm` or HuggingFace `transformers` (see the sketch after the Ollama example below).


# Download the Ollama command-line tool
curl -fsSL https://ollama.com/install.sh | sh

# Pull the model to use
ollama pull llama3
             

Then use it the same way as the other providers:


import adalflow as adal

llama_llm = adal.Generator(
   model_client=adal.OllamaClient(), model_kwargs={"model": "llama3"}
)
response = llama_llm(prompt_kwargs={"input_str": "What is LLM?"})
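
If you would rather load a model directly with HuggingFace `transformers` instead of running the Ollama server, the same `Generator` pattern applies. The sketch below is illustrative only: it assumes a `TransformersClient` under `adalflow.components.model_client` and the `HuggingFaceH4/zephyr-7b-beta` checkpoint; check the API reference for the exact client name and supported `model_kwargs`.

import adalflow as adal

# Assumed import path for the local transformers-backed client
from adalflow.components.model_client import TransformersClient

# Loads and runs the model locally (weights are downloaded on first use)
local_llm = adal.Generator(
    model_client=TransformersClient(),
    model_kwargs={"model": "HuggingFaceH4/zephyr-7b-beta"},  # assumed example checkpoint
)
response = local_llm(prompt_kwargs={"input_str": "What is LLM?"})
print(response.data)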

For other providers, check the official documentation.