AdalFlow Logo

Try Quickstart in Colab

⚡ The Library to Build and Auto-optimize Any LLM Task Pipeline ⚡

Embracing a design philosophy similar to PyTorch, AdalFlow is powerful, light, modular, and robust.

Light, Modular, and Model-agnostic Task Pipeline

LLMs are like water; AdalFlow helps developers quickly shape them into any application, from GenAI use cases such as chatbots, translation, summarization, code generation, RAG, and autonomous agents to classical NLP tasks like text classification and named entity recognition.

AdalFlow is built on only two fundamental but powerful base classes: Component for the pipeline and DataClass for data interaction with LLMs. The result is a library with a bare-minimum abstraction, giving developers maximum customizability.
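For example, a DataClass is an ordinary Python dataclass whose field descriptions drive both the output-format instructions in the prompt and the parsing of the LLM response. A minimal sketch, assuming the import paths from the quickstart (field names here are illustrative and may vary by version):

```python
from dataclasses import dataclass, field

from adalflow.core import DataClass


@dataclass
class QAOutput(DataClass):
    # The "desc" metadata is rendered into the prompt's output-format
    # instructions and used when parsing the response back into this type.
    explanation: str = field(
        metadata={"desc": "A brief explanation of how the answer was derived."}
    )
    answer: str = field(
        metadata={"desc": "The short final answer."}
    )
```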

You have full control over the prompt template, the model you use, and the output parsing for your task pipeline.
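As a sketch of what that control looks like in practice, assuming the OpenAIClient model client and an OpenAI API key in the environment (template and model choice are illustrative):

```python
from adalflow.core import Component, Generator
from adalflow.components.model_client import OpenAIClient


class SimpleQA(Component):
    def __init__(self):
        super().__init__()
        # The prompt template, model client, and model choice are all
        # explicit arguments -- nothing is hidden behind the abstraction.
        template = r"""<SYS>You are a helpful assistant.</SYS>
User: {{input_str}}
You:"""
        self.generator = Generator(
            model_client=OpenAIClient(),
            model_kwargs={"model": "gpt-3.5-turbo"},
            template=template,
        )

    def call(self, query: str):
        return self.generator.call(prompt_kwargs={"input_str": query})


qa = SimpleQA()
print(qa("What is the capital of France?"))
```

Swapping the model means changing the model client and model_kwargs; changing the prompt means editing the template string; structured output means attaching a DataClass-based parser.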

AdalFlow Task Pipeline

Unified Framework for Auto-Optimization

AdalFlow provides token-efficient and high-performing prompt optimization within a unified framework. To optimize your pipeline, simply define a Parameter and pass it to our Generator. Whether you need to optimize task instructions or few-shot demonstrations, our unified framework offers an easy way to diagnose, visualize, debug, and train your pipeline.

This trace graph demonstrates how our auto-differentiation works: trace_graph

Trainable Task Pipeline

Just define the part you want to optimize, such as the task instruction or the few-shot demos, as a Parameter and pass it to our Generator.

AdalFlow Trainable Task Pipeline
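For illustration, a trainable instruction might be declared like this. This is a sketch based on published AdalFlow examples; the Parameter arguments and the location of the ParameterType enum may differ between versions:

```python
import adalflow as adal
from adalflow.components.model_client import OpenAIClient
from adalflow.optim.parameter import Parameter
from adalflow.optim.types import ParameterType

# A trainable system prompt: requires_opt=True tells the optimizer it
# may rewrite this text during training.
system_prompt = Parameter(
    data="You will answer a reasoning question. Think step by step.",
    role_desc="Task instruction for the language model",
    requires_opt=True,
    param_type=ParameterType.PROMPT,
)

# Passing the Parameter through prompt_kwargs places it on the traced
# computation graph, so textual gradients can flow back to it.
generator = adal.Generator(
    model_client=OpenAIClient(),
    model_kwargs={"model": "gpt-3.5-turbo"},
    template=r"<SYS>{{system_prompt}}</SYS> User: {{input_str}} You:",
    prompt_kwargs={"system_prompt": system_prompt},
)
```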

AdalComponent & Trainer

AdalComponent acts as the interpreter between the task pipeline and the Trainer: it defines the training and validation steps, the optimizers, the evaluator, the loss function, the backward engine for textual gradients, and the teacher generator for tracing demonstrations.

AdalFlow AdalComponent & Trainer
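A rough skeleton of how these pieces fit together. This is a sketch only: the dataset fields (question, answer) and the train/val splits are placeholders, and the prepare_* hooks follow published AdalFlow trainer examples, so exact signatures may differ by release:

```python
import adalflow as adal
from adalflow.eval.answer_match_acc import AnswerMatchAcc


class MyTaskAdal(adal.AdalComponent):
    """Bridges the task pipeline and the Trainer: wires up the task,
    the evaluator, and the loss function used for textual gradients."""

    def __init__(self, task: adal.Component):
        # `task` is your Component-based pipeline (e.g. SimpleQA above).
        eval_fn = AnswerMatchAcc(type="exact_match").compute_single_item
        loss_fn = adal.EvalFnToTextLoss(
            eval_fn=eval_fn,
            eval_fn_desc="exact_match: 1 if str(y) == str(y_gt) else 0",
        )
        super().__init__(task=task, eval_fn=eval_fn, loss_fn=loss_fn)

    # Hooks that tell the Trainer how to call the task, evaluate a
    # prediction, and compute the loss for one dataset sample.
    def prepare_task(self, sample):
        return self.task.call, {"question": sample.question}

    def prepare_eval(self, sample, y_pred):
        return self.eval_fn, {"y": y_pred.data, "y_gt": sample.answer}

    def prepare_loss(self, sample, y_pred):
        y_gt = adal.Parameter(
            data=sample.answer, eval_input=sample.answer, requires_opt=False
        )
        return self.loss_fn, {"kwargs": {"y": y_pred, "y_gt": y_gt}}


# The Trainer then drives diagnosing, debugging, and training, e.g.:
# trainer = adal.Trainer(adaltask=MyTaskAdal(task), max_steps=12)
# trainer.fit(train_dataset=train_set, val_dataset=val_set)
```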

Unites Research and Production

Our team has experience in both AI research and production. We are building a library that unites the two worlds, forming a healthy LLM application ecosystem.

  • Its PyTorch-like design makes the library easy for LLM researchers to pick up.

  • When researchers build on AdalFlow, production engineers can easily adopt their work and test and iterate on it with production data.

  • Full control over and clarity of the source code make it easy for product teams to build on the library and for researchers to extend it with new methods.