
[LLAMA 502] llama command are not getting recognized #187

Open
Seemachauhan13 opened this issue Oct 23, 2024 · 3 comments

Comments

@Seemachauhan13

I ran pip install llama-stack following the README, and it installed successfully. But the llama command is not recognized:
"llama model list --show-all
llama: command not found"
I need this resolved to proceed further.
//*****************************
Linux terminal#:/llama-stack$ pip install llama-stack
Requirement already satisfied: llama-stack in /home/seema1/.local/lib/python3.8/site-packages (0.0.1a5)
Requirement already satisfied: httpx<1,>=0.23.0 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (0.27.2)
Requirement already satisfied: pydantic<3,>=1.9.0 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (2.9.2)
Requirement already satisfied: distro<2,>=1.7.0 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (1.9.0)
Requirement already satisfied: sniffio in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (1.3.1)
Requirement already satisfied: anyio<5,>=3.5.0 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (4.5.2)
Requirement already satisfied: typing-extensions<5,>=4.7 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (4.12.2)
Requirement already satisfied: idna in /usr/lib/python3/dist-packages (from httpx<1,>=0.23.0->llama-stack) (2.8)
Requirement already satisfied: httpcore==1.* in /home/seema1/.local/lib/python3.8/site-packages (from httpx<1,>=0.23.0->llama-stack) (1.0.6)
Requirement already satisfied: certifi in /usr/lib/python3/dist-packages (from httpx<1,>=0.23.0->llama-stack) (2019.11.28)
Requirement already satisfied: pydantic-core==2.23.4 in /home/seema1/.local/lib/python3.8/site-packages (from pydantic<3,>=1.9.0->llama-stack) (2.23.4)
Requirement already satisfied: annotated-types>=0.6.0 in /home/seema1/.local/lib/python3.8/site-packages (from pydantic<3,>=1.9.0->llama-stack) (0.7.0)
Requirement already satisfied: exceptiongroup>=1.0.2; python_version < "3.11" in /home/seema1/.local/lib/python3.8/site-packages (from anyio<5,>=3.5.0->llama-stack) (1.2.2)
Requirement already satisfied: h11<0.15,>=0.13 in /home/seema1/.local/lib/python3.8/site-packages (from httpcore==1.*->httpx<1,>=0.23.0->llama-stack) (0.14.0)
Linux terminal#:
/llama-stack$ llama model list --show-all
llama: command not found
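One thing worth checking first (an assumption on my part, based on the log above showing a user-site install under /home/seema1/.local): pip may have placed the llama console script in ~/.local/bin, and that directory may not be on your PATH.

```shell
# Does the console script exist where pip user-site installs put it?
ls ~/.local/bin/llama

# Is that directory on PATH? (prints a match if so)
echo "$PATH" | tr ':' '\n' | grep -x "$HOME/.local/bin"

# If not, add it for the current shell session and retry
export PATH="$HOME/.local/bin:$PATH"
llama model list --show-all
```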


skmda37 commented Oct 23, 2024

I have the same problem

@cglagovichTT

I've found that it fails silently if anything is wrong with your environment. Make sure Python 3.10 is installed, then create a Python 3.10 venv and build from source with pip install -e .
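A minimal sketch of that fix, assuming python3.10 is on your PATH and you are running it from a local checkout of llama-stack:

```shell
# Create and activate a clean Python 3.10 virtual environment
python3.10 -m venv .venv
source .venv/bin/activate

# Editable install from the llama-stack source checkout
pip install -e .

# The entry point should now resolve inside the venv
llama model list --show-all
```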

@monchewharry

Create an environment with python3.10, activate it, and install:

conda create -n stack python=3.10
conda activate stack
pip install llama-stack

Refer to: https://github.com/meta-llama/llama-stack/blob/main/docs/getting_started.md
Note that the current installer does not check the Python version.
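Since the installer doesn't enforce this, you can guard against it yourself before installing. A sketch, assuming 3.10 is the minimum supported version (the log above shows Python 3.8, which would explain the failure):

```shell
# Abort early if the interpreter is older than 3.10, otherwise install
python3 -c 'import sys; assert sys.version_info >= (3, 10), f"need >= 3.10, got {sys.version}"' \
  && pip install llama-stack
```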
