
Add support for Ollama and oh-my-zsh plugin loading #5

Open · wants to merge 59 commits into master
Conversation

@p1r473 commented May 3, 2024

Adds support for Ollama and for oh-my-zsh plugin loading.
Ollama runs models locally (offline), so no internet connection is needed.
Select the Ollama model with export ZSH_LLM_SUGGESTION_MODEL="tinydolphin"
Configure the server with export ZSH_LLM_SUGGESTION_SERVER="localhost:11434"
Wipe the prompt history with alias wipe='rm ~/.ollama_history'
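Taken together, a minimal ~/.zshrc fragment for the Ollama backend might look like this (the model name and server address are just examples, not defaults mandated by the plugin):

```shell
# Example ~/.zshrc fragment for the Ollama backend of zsh-llm-suggestions.
# Model name and host:port are examples; adjust to your setup.
export ZSH_LLM_SUGGESTION_MODEL="tinydolphin"       # any model pulled via `ollama pull`
export ZSH_LLM_SUGGESTION_SERVER="localhost:11434"  # host:port of the Ollama server
alias wipe='rm ~/.ollama_history'                   # clear the saved prompt history
```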

Can bind keys:

bindkey '^o' zsh_llm_suggestions_ollama
bindkey '^[^o' zsh_llm_suggestions_ollama_explain
bindkey '^p' zsh_llm_suggestions_ollama_freestyle #new freestyle mode queries the LLM directly, without asking for zsh help
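For oh-my-zsh users, a sketch of loading the plugin through the usual plugins array; the plugin directory name zsh-llm-suggestions and the clone path below are assumptions, not something this PR pins down:

```shell
# Sketch of oh-my-zsh plugin loading (directory name is an assumption):
#   git clone <repo-url> ~/.oh-my-zsh/custom/plugins/zsh-llm-suggestions
plugins=(... zsh-llm-suggestions)   # append to the existing plugins array in ~/.zshrc
source $ZSH/oh-my-zsh.sh

# keybindings can then go after the plugin is sourced, e.g.:
bindkey '^o' zsh_llm_suggestions_ollama
```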

Quick getting-started guide for Ollama:
curl -fsSL https://ollama.com/install.sh | sh #install
ollama pull MODELNAME #download a model - tinyllama or tinydolphin are small enough for a Pi
ollama serve & #usually started automatically via systemctl
Other tips:
export OLLAMA_MODELS="/home/pi/ollama-models/" #configure where models are stored
ollama list | tail -n +2 | awk '{print $1}' | xargs -n1 ollama pull #re-pull every installed model to update it (there is no built-in update-all flag)
curl -XPOST localhost:11434/api/generate -d '{"model": "tinyllama", "prompt": "Why is the sky blue?", "stream": false }' #test your Ollama server
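Prompts containing quotes or newlines will break the hand-written JSON in the curl test above; a safer sketch is to build the request body with a real JSON encoder first (the model name and server address are assumptions):

```shell
# Build the /api/generate request body with a real JSON encoder, so quotes
# and newlines in the prompt are escaped correctly before POSTing it.
payload=$(printf '%s' 'Why is the sky "blue"?' | python3 -c '
import json, sys
print(json.dumps({"model": "tinyllama",
                  "prompt": sys.stdin.read(),
                  "stream": False}))')
echo "$payload"
# then POST it (requires a running Ollama server):
# curl -s -XPOST localhost:11434/api/generate -d "$payload"
```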

@p1r473 mentioned this pull request on May 29, 2024