From 09cba0135e875fc0a38e20dd2073149b6b9e335a Mon Sep 17 00:00:00 2001
From: Shu Huang <44169687+ShuHuang@users.noreply.github.com>
Date: Thu, 22 Aug 2024 14:39:15 +0100
Subject: [PATCH] Bugfix: Update LLM-Connections.md

The original code doesn't work due to a comma
---
 docs/how-to/LLM-Connections.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/how-to/LLM-Connections.md b/docs/how-to/LLM-Connections.md
index 4acdbb3e32..1f0eafd5eb 100644
--- a/docs/how-to/LLM-Connections.md
+++ b/docs/how-to/LLM-Connections.md
@@ -88,7 +88,7 @@ There are a couple of different ways you can use HuggingFace to host your LLM.
 ### Your own HuggingFace endpoint
 
 ```python
-from langchain_huggingface import HuggingFaceEndpoint,
+from langchain_huggingface import HuggingFaceEndpoint
 
 llm = HuggingFaceEndpoint(
     repo_id="microsoft/Phi-3-mini-4k-instruct",
@@ -194,4 +194,4 @@ azure_agent = Agent(
 ```
 
 ## Conclusion
-Integrating CrewAI with different LLMs expands the framework's versatility, allowing for customized, efficient AI solutions across various domains and platforms.
\ No newline at end of file
+Integrating CrewAI with different LLMs expands the framework's versatility, allowing for customized, efficient AI solutions across various domains and platforms.
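
Note: the fix is needed because an unparenthesized trailing comma in a `from ... import` statement is invalid Python (`SyntaxError: trailing comma not allowed without surrounding parentheses`). For reference, here is a minimal sketch of the corrected snippet in a fuller context. Only the `HuggingFaceEndpoint` import and the `repo_id` value come from the patched docs; the token handling, the `task` and `max_new_tokens` arguments, and the `Agent` wiring are illustrative assumptions modeled on the doc's Azure example, not part of this patch.

```python
# Sketch of the corrected docs snippet; parameter choices below are assumptions.
import os

from crewai import Agent
from langchain_huggingface import HuggingFaceEndpoint  # no trailing comma

# repo_id comes from the patched docs; the remaining arguments are illustrative.
llm = HuggingFaceEndpoint(
    repo_id="microsoft/Phi-3-mini-4k-instruct",
    huggingfacehub_api_token=os.environ.get("HUGGINGFACEHUB_API_TOKEN"),
    task="text-generation",   # assumed task for an instruct model
    max_new_tokens=512,       # assumed generation limit
)

# Assumed agent definition, mirroring the doc's azure_agent example.
hf_agent = Agent(
    role="Example Agent",
    goal="Demonstrate a HuggingFace-hosted LLM with CrewAI",
    backstory="An illustrative agent for this docs example.",
    llm=llm,
)
```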