optimizer#
Base Classes for AdalFlow Optimizers, including Optimizer, TextOptimizer, and DemoOptimizer.
Classes
- DemoOptimizer: Base class for all demo optimizers.
- Optimizer: Base class for all optimizers.
- TextOptimizer: Base class for all text optimizers.
- class Optimizer[source]#
Bases:
object
Base class for all optimizers.
- proposing: bool = False#
- params: Iterable[Parameter] | Iterable[Dict[str, Any]]#
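Concretely, the two documented attributes define the base-class shape that every optimizer shares. A minimal sketch (the `Parameter` stub and the `NoOpOptimizer` subclass are stand-ins for illustration, not part of the library):

```python
from typing import Any, Dict, Iterable, List, Union

class Parameter:
    # Stand-in for AdalFlow's Parameter: holds an optimizable text value.
    def __init__(self, data: str):
        self.data = data

class Optimizer:
    # Base class for all optimizers, mirroring the documented attributes.
    proposing: bool = False  # True while an unconfirmed proposal is active
    params: Union[Iterable[Parameter], Iterable[Dict[str, Any]]]

class NoOpOptimizer(Optimizer):
    # Trivial subclass: registers its parameters but never changes them.
    def __init__(self, params: List[Parameter]):
        self.params = list(params)

opt = NoOpOptimizer([Parameter("You are a helpful assistant.")])
```

Subclasses such as `TextOptimizer` and `DemoOptimizer` below inherit this shape and add their own optimization strategy on top.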
- class TextOptimizer(*args, **kwargs)[source]#
Bases:
Optimizer
Base class for all text optimizers.
A text optimizer works via textual gradient descent, a variant of gradient descent that optimizes the text directly. It generates new values for a given text prompt, including:
- System prompt
- Output format
- Prompt template
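A toy sketch of the propose/revert/accept lifecycle behind textual gradient descent follows. The method names and the string-rewriting "gradient" here are assumptions for illustration only; a real text optimizer would call an LLM to rewrite the prompt from textual feedback, and this sketch does not reflect the library's confirmed API:

```python
from typing import List

class Parameter:
    # Stand-in for an optimizable text parameter (e.g. a system prompt).
    def __init__(self, data: str):
        self.data = data
        self._backup = data

class ToyTextOptimizer:
    # Illustrative textual-gradient-descent loop: propose a new text value,
    # then either revert it or accept it after validation.
    def __init__(self, params: List[Parameter]):
        self.params = params
        self.proposing = False

    def propose(self, feedback: str) -> None:
        # "Textual gradient": rewrite the text using feedback. A real
        # implementation would generate the revision with an LLM.
        for p in self.params:
            p._backup = p.data
            p.data = f"{p.data} (revised per feedback: {feedback})"
        self.proposing = True

    def revert(self) -> None:
        # Discard the proposal if validation shows no improvement.
        for p in self.params:
            p.data = p._backup
        self.proposing = False

    def step(self) -> None:
        # Accept the proposal as the new parameter value.
        self.proposing = False

prompt = Parameter("Answer concisely.")
opt = ToyTextOptimizer([prompt])
opt.propose("be more polite")
opt.revert()  # validation failed in this toy run, so roll back
```

The revert path is what distinguishes this loop from numeric gradient descent: a proposed text edit is only kept if it measurably improves validation performance.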
- class DemoOptimizer(weighted: bool = True, dataset: Sequence[DataClass] = None, exclude_input_fields_from_bootstrap_demos: bool = False, *args, **kwargs)[source]#
Bases:
Optimizer
Base class for all demo optimizers.
Demo optimizers perform few-shot optimization: they sample raw examples from the train dataset or bootstrap examples from the model's output. They work with a sampler to generate new values for a given text prompt.
If bootstrapping is used, a teacher generator is required to produce the examples.
- dataset: Sequence[DataClass]#
- exclude_input_fields_from_bootstrap_demos: bool = False#
- params: Iterable[Parameter] | Iterable[Dict[str, Any]]#
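The raw-sampling path described above can be sketched as follows. The `QAExample` record, the `k` parameter, and the `sample_demos` method are all hypothetical names for this illustration; the bootstrap path via a teacher generator is omitted:

```python
import random
from dataclasses import dataclass
from typing import List, Sequence

@dataclass
class QAExample:
    # Stand-in for an AdalFlow DataClass record in the train dataset.
    question: str
    answer: str

class ToyDemoOptimizer:
    # Illustrative few-shot sampling: draw k raw demos from the train set
    # to fill the demo slot of a prompt.
    def __init__(self, dataset: Sequence[QAExample], k: int = 2, seed: int = 0):
        self.dataset = dataset
        self.k = k
        self._rng = random.Random(seed)

    def sample_demos(self) -> List[QAExample]:
        # Sample without replacement, as a weighted sampler might after
        # scoring each candidate demo.
        return self._rng.sample(list(self.dataset), self.k)

train = [QAExample(f"q{i}", f"a{i}") for i in range(5)]
demos = ToyDemoOptimizer(train, k=2).sample_demos()
```

In the bootstrap variant, the candidate pool would instead be built from teacher-generator outputs rather than raw train examples, and `exclude_input_fields_from_bootstrap_demos` would control whether input fields appear in those demos.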