diff --git a/docs/how-to/LLM-Connections.md b/docs/how-to/LLM-Connections.md
index 4acdbb3e32..1f0eafd5eb 100644
--- a/docs/how-to/LLM-Connections.md
+++ b/docs/how-to/LLM-Connections.md
@@ -88,7 +88,7 @@ There are a couple of different ways you can use HuggingFace to host your LLM.
 
 ### Your own HuggingFace endpoint
 ```python
-from langchain_huggingface import HuggingFaceEndpoint,
+from langchain_huggingface import HuggingFaceEndpoint
 
 llm = HuggingFaceEndpoint(
     repo_id="microsoft/Phi-3-mini-4k-instruct",
@@ -194,4 +194,4 @@ azure_agent = Agent(
 ```
 
 ## Conclusion
-Integrating CrewAI with different LLMs expands the framework's versatility, allowing for customized, efficient AI solutions across various domains and platforms.
\ No newline at end of file
+Integrating CrewAI with different LLMs expands the framework's versatility, allowing for customized, efficient AI solutions across various domains and platforms.
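
As a sanity check on the first hunk, here is a minimal sketch of the corrected import in use end to end. Only the import line and the `repo_id` argument come from the patch; the remaining `HuggingFaceEndpoint` arguments, the environment-variable name, and the `Agent` fields are illustrative assumptions, not content of the patched doc.

```python
# Minimal sketch of the corrected import in use. Only the import line and
# repo_id come from the patch; every other argument and the Agent fields
# below are illustrative assumptions, not content of the patched doc.
import os

from crewai import Agent
from langchain_huggingface import HuggingFaceEndpoint  # trailing comma removed by this patch

# Point the LLM at a HuggingFace Inference Endpoint for the Phi-3 mini model.
llm = HuggingFaceEndpoint(
    repo_id="microsoft/Phi-3-mini-4k-instruct",
    task="text-generation",            # assumed task for a text-completion model
    max_new_tokens=512,                # assumed generation limit
    huggingfacehub_api_token=os.environ["HUGGINGFACEHUB_API_TOKEN"],  # assumed env var
)

# Hand the endpoint-backed LLM to a CrewAI agent, mirroring the azure_agent
# pattern visible in the second hunk's context line.
hf_agent = Agent(
    role="Researcher",
    goal="Summarize recent findings on a given topic",
    backstory="An assistant backed by a self-hosted HuggingFace endpoint",
    llm=llm,
)
```

Importing the module is enough to confirm the fix: with the stray trailing comma, Python rejects the original line with a SyntaxError about a trailing comma without surrounding parentheses, so the documented snippet could not run as written.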