Properties

llm: LLM Wrapper to use.
outputKey: Key to use for output. Defaults to "text".
prompt: Prompt object to use.
llmKwargs (optional): Kwargs to pass to LLM.
memory (optional)
outputParser (optional): OutputParser to use.
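As an illustration of how these fields fit together, here is a minimal self-contained sketch (not LangChain's actual implementation) showing how an `outputKey` typically shapes the chain's result: the LLM's completion is stored in the result object under that key, with `"text"` as the default. The field names `llm`, `prompt`, and `outputKey` mirror the properties above; `callChain` is a hypothetical helper.

```typescript
// Sketch only: illustrates the role of the chain fields listed above.
interface ChainFields {
  llm: (prompt: string) => string; // LLM wrapper to use
  prompt: string;                  // prompt text to prepend
  outputKey?: string;              // key for the output, default "text"
}

// Run the "chain" once: format the prompt, call the LLM, and store the
// completion under the configured output key.
function callChain(
  fields: ChainFields,
  input: string,
): Record<string, string> {
  const key = fields.outputKey ?? "text";
  return { [key]: fields.llm(fields.prompt + input) };
}
```

With the default key, the result is `{ text: ... }`; setting `outputKey` renames that slot.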
Methods

apply(inputs, config?: any[]): Call the chain on all inputs in the list.
Deprecated: Use .batch() instead. Will be removed in 0.2.0.
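The call-on-all-inputs behavior that `.batch()` replaces can be sketched as mapping a single chain invocation over a list of input objects. This is a simplified stand-in, not LangChain's `Runnable` interface: `runOne` is a hypothetical single invocation, and `batch` collects the per-input results.

```typescript
// Sketch only: what "call the chain on all inputs in the list" amounts to.
type ChainValues = Record<string, string>;

// Stand-in for one chain invocation (prompt -> LLM -> parsed output).
async function runOne(input: ChainValues): Promise<ChainValues> {
  return { text: `prioritized:${input.objective}` };
}

// Batch-style replacement: run every input and collect results in order.
async function batch(inputs: ChainValues[]): Promise<ChainValues[]> {
  return Promise.all(inputs.map(runOne));
}
```

Running the inputs through `Promise.all` preserves input order in the results while allowing the individual calls to proceed concurrently.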
Static deserialize: Load a chain from a json-like object describing it.
Static fromLLM: Creates a new TaskPrioritizationChain from a BaseLanguageModel. It generates a prompt using the PromptTemplate class and the task prioritization template, and returns a new instance of TaskPrioritizationChain.
Parameters: fields, an object with fields used to initialize the chain, excluding the prompt.
Returns: A new instance of TaskPrioritizationChain.
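The fromLLM factory pattern can be sketched as follows. All names here are illustrative stand-ins rather than LangChain internals, and the template wording is a paraphrase of a task-prioritization prompt, not the library's exact text: the static method builds the prompt from a fixed template itself, so callers supply only the model (and, in the real API, the other chain fields).

```typescript
// Sketch only: a fromLLM-style factory that fixes the prompt template.
type LLM = (text: string) => string;

class PromptTemplate {
  constructor(
    public template: string,
    public inputVariables: string[],
  ) {}

  // Replace each {variable} placeholder with its value.
  format(values: Record<string, string>): string {
    return this.inputVariables.reduce(
      (acc, v) => acc.split(`{${v}}`).join(values[v]),
      this.template,
    );
  }
}

class TaskPrioritizationChainSketch {
  constructor(
    public llm: LLM,
    public prompt: PromptTemplate,
  ) {}

  call(values: Record<string, string>): string {
    return this.llm(this.prompt.format(values));
  }

  // The factory generates the prompt itself; callers pass only the LLM.
  static fromLLM(llm: LLM): TaskPrioritizationChainSketch {
    const template =
      "You are a task prioritization AI. Reprioritize these tasks: " +
      "{task_names} for the objective: {objective}. " +
      "Start the task list with number {next_task_id}.";
    const prompt = new PromptTemplate(template, [
      "task_names",
      "objective",
      "next_task_id",
    ]);
    return new TaskPrioritizationChainSketch(llm, prompt);
  }
}
```

Because the prompt is generated inside the factory, the initialization fields exclude it, matching the parameter description above.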
TaskPrioritizationChain: Chain to prioritize tasks.