From f7708d3508547252f17ecfef5f29f58b121c23d5 Mon Sep 17 00:00:00 2001
From: Houssem Dellai
Date: Fri, 20 Sep 2024 09:07:11 +0200
Subject: [PATCH] 510

---
 510_ai_ollama_k8s/Readme.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/510_ai_ollama_k8s/Readme.md b/510_ai_ollama_k8s/Readme.md
index b236e39..a4057e0 100644
--- a/510_ai_ollama_k8s/Readme.md
+++ b/510_ai_ollama_k8s/Readme.md
@@ -68,6 +68,8 @@ kubectl exec ollama-0 -n ollama -it -- ollama run llama3.1
 kubectl get svc -n ollama
 ```
 
+Now you can navigate to the public IP of the client service to chat with the model.
+
 Here are some example models that can be used in `ollama` [available here](https://github.com/ollama/ollama/blob/main/README.md#model-library):
 
 | Model | Parameters | Size | Download |