
Commit: Fix image in docker compose yaml to use the built docker image tag from the README (#498)

* Update README.md

The `ray_serve:habana` image doesn't exist (yet) on Docker Hub.

* Fixed the image in docker compose yaml

Signed-off-by: Harsha Ramayanam <[email protected]>

---------

Signed-off-by: Harsha Ramayanam <[email protected]>
Co-authored-by: chen, suyue <[email protected]>
HarshaRamayanam and chensuyue authored Aug 16, 2024
1 parent 2207503 · commit 72a2553
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion in `comps/llms/README.md`

````diff
@@ -40,7 +40,7 @@ docker run -it --name vllm_service -p 8008:80 -e HF_TOKEN=${HUGGINGFACEHUB_API_T
 ```bash
 export HUGGINGFACEHUB_API_TOKEN=${your_hf_api_token}
 export TRUST_REMOTE_CODE=True
-docker run -it --runtime=habana --name ray_serve_service -e OMPI_MCA_btl_vader_single_copy_mechanism=none --cap-add=sys_nice --ipc=host -p 8008:80 -e HUGGINGFACEHUB_API_TOKEN=$HUGGINGFACEHUB_API_TOKEN -e TRUST_REMOTE_CODE=$TRUST_REMOTE_CODE ray_serve:habana /bin/bash -c "ray start --head && python api_server_openai.py --port_number 80 --model_id_or_path ${your_hf_llm_model} --chat_processor ${your_hf_chatprocessor}"
+docker run -it --runtime=habana --name ray_serve_service -e OMPI_MCA_btl_vader_single_copy_mechanism=none --cap-add=sys_nice --ipc=host -p 8008:80 -e HUGGINGFACEHUB_API_TOKEN=$HUGGINGFACEHUB_API_TOKEN -e TRUST_REMOTE_CODE=$TRUST_REMOTE_CODE opea/llm-ray:latest /bin/bash -c "ray start --head && python api_server_openai.py --port_number 80 --model_id_or_path ${your_hf_llm_model} --chat_processor ${your_hf_chatprocessor}"
 ```

 ## 1.3 Verify the LLM Service
````
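The corrected `docker run` invocation from the README is easier to read with line continuations. This is a sketch copied from the diff above; `${your_hf_llm_model}` and `${your_hf_chatprocessor}` remain user-supplied placeholders, and the command assumes a Gaudi host with the Habana container runtime installed:

```shell
# Corrected README command: uses the published opea/llm-ray:latest image
# instead of the unpublished ray_serve:habana tag.
docker run -it --runtime=habana \
  --name ray_serve_service \
  -e OMPI_MCA_btl_vader_single_copy_mechanism=none \
  --cap-add=sys_nice --ipc=host \
  -p 8008:80 \
  -e HUGGINGFACEHUB_API_TOKEN=$HUGGINGFACEHUB_API_TOKEN \
  -e TRUST_REMOTE_CODE=$TRUST_REMOTE_CODE \
  opea/llm-ray:latest \
  /bin/bash -c "ray start --head && python api_server_openai.py \
    --port_number 80 \
    --model_id_or_path ${your_hf_llm_model} \
    --chat_processor ${your_hf_chatprocessor}"
```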
2 changes: 1 addition & 1 deletion in `comps/llms/text-generation/tgi/docker_compose_llm.yaml`

```diff
@@ -14,7 +14,7 @@ services:
     shm_size: 1g
     command: --model-id ${LLM_MODEL_ID}
   llm:
-    image: opea/gen-ai-comps:llm-tgi-server
+    image: opea/llm-tgi:latest
     container_name: llm-tgi-server
     ports:
       - "9000:9000"
```
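In context, the corrected `llm` service reads roughly like this. Only the `image:` line is changed by this commit; the surrounding keys are reconstructed from the diff context and the full file may contain additional settings:

```yaml
services:
  llm:
    image: opea/llm-tgi:latest   # was opea/gen-ai-comps:llm-tgi-server
    container_name: llm-tgi-server
    ports:
      - "9000:9000"
```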
