
Merge branch 'main' of github.com:datastax/astra-assistants-api
phact committed Jul 18, 2024
2 parents 917b9fe + 9ab94a3 commit b01fb9c
Showing 2 changed files with 4 additions and 7 deletions.
4 changes: 1 addition & 3 deletions README.md
@@ -178,9 +178,7 @@ you need to pull the model you want to ollama before using it
 
 curl http://localhost:11434/api/pull -d '{ "name": "deepseek-coder-v2" }'
 
-your assistants client should route to the ollama container by passing the llm-param-base-url header:
-
-client = patch(OpenAI(default_headers={"LLM-PARAM-base-url": "http://ollama:11434"}))
+your assistants client should route to the ollama container by setting OLLAMA_API_BASE_URL. Set it to http://ollama:11434 if you are using docker-compose, or to http://localhost:11434 if you are running ollama on your localhost.
 
 
 ## Feedback / Help
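The README change above replaces a per-request header with an environment variable. A minimal sketch of choosing between the two documented endpoint values (the ASSISTANTS_IN_COMPOSE flag is hypothetical, used only to illustrate the choice; only the two URL values come from the diff):

```python
import os

# Pick the ollama endpoint described in the README change: the compose
# service name inside docker-compose, localhost otherwise.
in_compose = os.environ.get("ASSISTANTS_IN_COMPOSE") == "1"  # hypothetical flag
os.environ["OLLAMA_API_BASE_URL"] = (
    "http://ollama:11434" if in_compose else "http://localhost:11434"
)
print(os.environ["OLLAMA_API_BASE_URL"])
```

Setting the variable before the client library is imported or patched is the safe ordering, since configuration read at import time would otherwise miss it.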
7 changes: 3 additions & 4 deletions examples/python/agency-swarm/local_open_source_models.py
@@ -6,9 +6,8 @@
 load_dotenv("./.env")
 load_dotenv("../../../.env")
 
-client = patch(OpenAI(default_headers={"LLM-PARAM-base-url": "http://localhost:11434"}))
-# if using docker-compose, pass custom header to point to the ollama container instead of localhost
-# client = patch(OpenAI(default_headers={"LLM-PARAM-base-url": "http://ollama:11434"}))
+# remember to set OLLAMA_API_BASE_URL="http://ollama:11434" and base_url="http://localhost:8000/v1" in your env
+client = patch(OpenAI())
 
 set_openai_client(client)
 
@@ -25,4 +24,4 @@
 print(assistant)
 
 completion = agency.get_completion("What's something interesting about language models?")
-print(completion)
\ No newline at end of file
+print(completion)
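The comment added in the example above expects OLLAMA_API_BASE_URL and base_url to be present in the environment (e.g. supplied by the load_dotenv calls). A small stdlib-only sanity check along those lines; the fallback values are the endpoints mentioned in the diff, shown here purely as illustrative defaults:

```python
import os

# Values the example's comment asks for; fallbacks are illustrative only.
ollama_base = os.environ.get("OLLAMA_API_BASE_URL", "http://ollama:11434")
assistants_base = os.environ.get("base_url", "http://localhost:8000/v1")

# Fail fast with a clear message if either variable is malformed.
for name, value in [("OLLAMA_API_BASE_URL", ollama_base),
                    ("base_url", assistants_base)]:
    assert value.startswith("http"), f"{name} should be an http(s) URL"
print(ollama_base, assistants_base)
```

Running a check like this before constructing the client surfaces a missing .env file immediately, rather than as a connection error later.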
