Is your feature request related to a problem? Please describe.
Hi, the docs have no detail on how to use NeMo Guardrails with a Databricks serving endpoint. What is the correct way to use these endpoints with a Databricks access token?
Describe the solution you'd like
I have an existing RAG chain and want to use NeMo Guardrails to filter input. Models in my notebook are loaded with `Databricks(endpoint_name="endpoint_name", max_tokens=..., temperature=...)` and `ChatDatabricks(endpoint_name="endpoint_name", max_tokens=..., temperature=...)`. The models I use include Llama 3 and Databricks DBRX serving endpoints.
from langchain_community.chat_models import ChatDatabricks
from langchain_community.llms import Databricks
I know it's possible to use `RunnableRails` or `guardrails | some_chain` (the LangChain integration), but I want the self check input step to run before the retrieval step inside the chain. That is, if self check input decides the request should be blocked ("Yes"), the chain should reply with a default answer without retrieving any context. So how can I load the LLM from `endpoint_name` to check the input inside the chain?
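To illustrate the control flow being asked for, here is a minimal sketch of a chain that runs the input check before any retrieval happens. The functions `rails_check_input` and `retrieve_and_answer` are hypothetical stand-ins: in a real setup, the first would call a NeMo Guardrails `LLMRails` instance configured with the Databricks-backed LLM, and the second would be the existing RAG chain.

```python
# Sketch: run a self-check on the input BEFORE retrieval, and short-circuit
# with a default answer when the request is blocked.
# `rails_check_input` and `retrieve_and_answer` are hypothetical stand-ins
# for the guardrails call and the existing RAG chain, respectively.

DEFAULT_ANSWER = "I'm sorry, I can't respond to that."

def rails_check_input(user_message: str) -> bool:
    """Stand-in for the 'self check input' rail: True means block the request.

    A real implementation would call the guardrails LLM and parse its
    Yes/No verdict instead of this toy keyword check.
    """
    return "ignore your instructions" in user_message.lower()

def retrieve_and_answer(user_message: str) -> str:
    """Stand-in for the retrieval + generation part of the chain."""
    return f"Answer based on retrieved context for: {user_message}"

def guarded_chain(user_message: str) -> str:
    # Blocked requests never reach the retriever, so no context is fetched.
    if rails_check_input(user_message):
        return DEFAULT_ANSWER
    return retrieve_and_answer(user_message)
```

In LangChain terms, the same branching can be expressed with `RunnableBranch` placed ahead of the retriever, with the guardrails check as the condition.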
Describe alternatives you've considered
Also, is something like the method below possible?
In config.yml:
@azuretime, thanks for opening this issue. Are you trying to use Databricks as the LLM provider?
I assume you have already checked this guide, so to start, let's change the engine to `databricks`.
Also, you cannot pass headers; instead you need to set an environment variable. For the list of supported parameters, you can consult this link or, better, their GitHub repo.
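For reference, a `config.yml` along those lines might look like the sketch below. The parameter names under `parameters` are assumptions based on the LangChain Databricks wrapper, so double-check them against the linked docs; authentication would come from environment variables (e.g. `DATABRICKS_HOST` and `DATABRICKS_TOKEN`, which the LangChain Databricks integration reads) rather than request headers.

```yaml
models:
  - type: main
    engine: databricks
    parameters:
      endpoint_name: endpoint_name  # your Databricks serving endpoint

rails:
  input:
    flows:
      - self check input
```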