grad_component¶
Base class for Autograd Components that can be called and backpropagated through.
Functions

fun_to_grad_component(desc, doc_string) | Return a decorator that, when applied to a function fun, wraps it in a GradComponent with the given desc.

Classes

FunGradComponent(fun, afun, desc, doc_string) |

GradComponent(desc, name, backward_engine, ...) | A base class to define interfaces for an auto-grad component/operator.
- class GradComponent(desc: str, name: str | None = None, backward_engine: BackwardEngine | None = None, model_client: ModelClient = None, model_kwargs: Dict[str, object] = None)[source]¶
Bases: Component
A base class to define interfaces for an auto-grad component/operator.
Compared with Component, GradComponent defines three important interfaces:

- forward: the forward pass of the function; returns a Parameter object that can be traced and backpropagated.
- backward: the backward pass of the function; updates the gradients/prediction score backpropagated from a "loss" parameter.
- set_backward_engine: sets the backward engine (a form of generator) on the component, which is used to backpropagate the gradients using an LLM.
The __call__ method checks whether the component is in training mode: if so, it calls forward and returns a Parameter object; otherwise, it calls call and returns the plain output, such as GeneratorOutput or RetrieverOutput (see the sketch below).
Note: Avoid using the attributes and methods that are defined here and in the Component class unless you are overriding them.
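A minimal sketch of this training/eval dispatch, assuming the module lives at adalflow.optim.grad_component and that Component exposes torch-style train()/eval() switches (both are assumptions; adjust to your installed version):

    # Minimal sketch, not the library's documented example.
    from adalflow.optim.grad_component import GradComponent

    class Doubler(GradComponent):
        """Toy component that doubles an integer."""

        def __init__(self):
            # desc is the required first argument of the GradComponent constructor.
            super().__init__(desc="Doubles a number.")

        def call(self, x: int) -> int:
            # Inference-path implementation; the default forward() wraps the
            # result in a Parameter when the component is in training mode.
            return 2 * x

    doubler = Doubler()

    doubler.eval()             # eval mode: __call__ dispatches to call()
    print(doubler(3))          # plain output: 6

    doubler.train()            # training mode: __call__ dispatches to forward()
    print(type(doubler(3)))    # a Parameter that can be traced and backpropagated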
- id = None¶
- backward_engine: BackwardEngine¶
- set_backward_engine(backward_engine: BackwardEngine = None, model_client: ModelClient = None, model_kwargs: Dict[str, object] = None)[source]¶
- disable_backward_engine()[source]¶
Disables gradient generation by the backward engine; the backward pass still runs so that module context is propagated.
- call(*args, **kwargs)[source]¶
User must override this for the inference scenario if bicall is not defined.
- forward(*args, **kwargs) → Parameter[source]¶
Default forward method for training:

1. Every positional or keyword argument that is a Parameter object is tracked as a predecessor.
2. input_args and full_response are traced in the returned parameter object.
3. The parameter object is returned.
- backward_with_pass_through_gradients(*, response: Parameter, id: str = None, **kwargs)[source]¶
Passes gradients through to the predecessors.
Backward pass of the function. By default, it passes all the scores to the predecessors.
Note: backward is mainly used internally, and it is best to accept only keyword arguments.
Subclasses should override this method if additional backward logic is needed.
- backward(*, response: OutputParameter, id: str = None, disable_backward_engine=False, **kwargs)[source]¶
Backward pass of the function. By default, it passes all the scores to the predecessors.
Note: backward is mainly used internally, and it is best to accept only keyword arguments.
Subclasses should override this method if additional backward logic is needed (see the sketch below).
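A hedged sketch of such an override, reusing the keyword-only signature documented above; the import path and the logging call are illustrative assumptions:

    # Sketch of a subclass that keeps the default score pass-through and adds
    # its own bookkeeping; only super().backward() is documented behavior.
    import logging

    from adalflow.optim.grad_component import GradComponent  # path is an assumption

    log = logging.getLogger(__name__)

    class AuditedAddOne(GradComponent):
        def __init__(self):
            super().__init__(desc="Adds one and logs every backward call.")

        def call(self, x: int) -> int:
            return x + 1

        def backward(self, *, response, id=None, disable_backward_engine=False, **kwargs):
            # Keep the default behavior: pass the scores to the predecessors.
            super().backward(
                response=response,
                id=id,
                disable_backward_engine=disable_backward_engine,
                **kwargs,
            )
            # Component-specific backward logic goes here.
            log.debug("backward called for response id=%s", id)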
- class FunGradComponent(fun: Callable | None = None, afun: Callable | None = None, desc: str = '', doc_string=None)[source]¶
Bases: GradComponent
- fun_to_grad_component(desc: str = '', doc_string=None) → Callable[source]¶
Return a decorator that, when applied to a function fun, wraps it in a GradComponent with the given desc.
Examples:
As a decorator:
    @fun_to_grad_component(
        desc="This is a test function",
        doc_string=Parameter(
            data="Finish the task with verbatim short factoid responses from retrieved context.",
            param_type=ParameterType.PROMPT,
            requires_opt=True,
            role_desc="Instruct how the agent creates the final answer from the step history.",
        ),
    )
    def my_function(x):
        return x + 1

    print(my_function(1))
As a function:
    def my_function(x):
        return x + 1

    my_function_component = fun_to_grad_component(desc="This is a test function")(my_function)
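A hedged usage sketch of the wrapped component, assuming train()/eval() mode switches are inherited from Component (an assumption, not shown on this page):

    # Usage sketch only; the wrapped function is now a GradComponent, so the
    # usual training/eval dispatch applies.
    my_function_component.eval()
    print(my_function_component(1))        # plain output: 2

    my_function_component.train()
    print(type(my_function_component(1)))  # a Parameter for tracing/backprop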