How would I swap in the "ollama/llama2" model into the RestrictToTopic validator #1219

Answered by CalebCourier
widgetface asked this question in Q&A

Hi @widgetface, the answer depends on how you intend to use ollama/llama2 within the validator. By default, this validator uses both a zero-shot classification model and an LLM in an "ensemble" approach to extract topics from the provided text.

Using Only ollama/llama2

If you want to use only ollama/llama2 for this, disable the zero-shot model by passing disable_classifier=True, and set llm_callable to a Callable that takes the text to be validated and a list of topics and returns a string containing any topics found. Below is the default method that does this with an OpenAI model. Yours will likely look similar but with litellm and your chosen …
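As a rough illustration, a callable like the one described above might be sketched as follows with litellm. This is an assumption-laden sketch, not the validator's actual default: the function names (build_prompt, ollama_topic_callable), the prompt wording, and the api_base endpoint are all mine, and it presumes a local Ollama server is running with the llama2 model pulled.

```python
from typing import List


def build_prompt(text: str, topics: List[str]) -> str:
    """Assemble a simple topic-classification prompt (illustrative wording)."""
    return (
        "Given the topics below, return a comma-separated list of any "
        "topics present in the text. Return only topic names.\n"
        f"Topics: {', '.join(topics)}\n"
        f"Text: {text}"
    )


def ollama_topic_callable(text: str, topics: List[str]) -> str:
    """A candidate llm_callable: takes the text and topic list, returns the
    model's raw string response with any topics it found."""
    import litellm  # assumes litellm is installed

    response = litellm.completion(
        model="ollama/llama2",
        messages=[{"role": "user", "content": build_prompt(text, topics)}],
        api_base="http://localhost:11434",  # default local Ollama endpoint
    )
    return response.choices[0].message.content
```

You would then pass this alongside disable_classifier=True when constructing the validator, e.g. something like RestrictToTopic(valid_topics=[...], disable_classifier=True, llm_callable=ollama_topic_callable) — check the validator's signature in your installed version, as argument names may differ.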
