Hi there,
first of all: awesome project, super like for this :)
It may be my lack of knowledge, but I am not able to figure out how to get my functions running using local LLMs. I am running both Ollama and LM Studio and trying to get it working via either path.
For LM Studio with Mistral models, I am getting an error about the 'system' role not being supported.
For Ollama with mistral:7b-instruct I am not getting an error per se, but the function is also not executing.
So I'd like to ask where I can learn more about this, and whether function execution even works with local LLMs.
Thanks,
saurabh
Download a model with ollama, e.g. mistral:7b-instruct, which is the DEFAULT_MODEL (see config file above):

```shell
ollama pull mistral:7b-instruct
```
List the downloaded ollama models:

```shell
ollama list
```

Example output:

```
NAME                        ID              SIZE    MODIFIED
mistral:7b-instruct         f974a74358d6    4.1 GB  About an hour ago
llama3.1:8b-instruct-q8_0   b158ded76fa0    8.5 GB  10 days ago
deepseek-r1:14b             ea35dfe18182    9.0 GB  4 weeks ago
llama3.2-vision:latest      085a1fdae525    7.9 GB  4 weeks ago
llama3.2:latest             a80c4f17acd5    2.0 GB  4 weeks ago
phi4:latest                 ac896e5b8b34    9.1 GB  4 weeks ago
```
Start the ollama server:

```shell
ollama serve
```
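With the server running, you can check that it is reachable before pointing sgpt at it. This is a quick sanity check that assumes Ollama's default port 11434; the `/api/tags` endpoint returns the locally available models as JSON:

```shell
# Probe the Ollama server on its default port; /api/tags lists local models.
# --fail makes curl exit non-zero on HTTP errors, --silent hides progress output.
if curl --silent --fail http://localhost:11434/api/tags > /dev/null; then
    echo "ollama server is up"
else
    echo "ollama server is not reachable"
fi
```

If the second message appears, `ollama serve` is not running (or is bound to a non-default host/port), which would also explain sgpt requests failing.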
Change the model name to one of the names above, e.g. phi4:latest, if you don't want to use DEFAULT_MODEL=ollama/mistral:7b-instruct:

```shell
sgpt --model ollama/phi4:latest "What is the fibonacci sequence"
```
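For reference, sgpt reads its settings from a config file (typically `~/.config/shell_gpt/.sgptrc`). A sketch of what an Ollama setup might look like follows; the key names beyond DEFAULT_MODEL (API_BASE_URL, USE_LITELLM, OPENAI_USE_FUNCTIONS) are assumptions based on a typical shell_gpt install, so verify them against your own config file:

```
# ~/.config/shell_gpt/.sgptrc -- sketch; verify key names against your install
DEFAULT_MODEL=ollama/mistral:7b-instruct
# Point sgpt at the local Ollama server (assumed key name)
API_BASE_URL=http://localhost:11434
# Route requests through litellm so the ollama/ prefix is understood (assumed key name)
USE_LITELLM=true
# Function calling must be enabled for sgpt functions to run (assumed key name)
OPENAI_USE_FUNCTIONS=true
```

Note that even with functions enabled, function execution depends on the model actually emitting tool calls, which smaller local models often do unreliably.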