I ran pip install llama-stack following the README, and the install completed successfully. However, the llama command is not recognized:

"llama model list --show-all
llama: command not found"

I need this resolved to proceed further.
//*****************************
Linux terminal#:/llama-stack$ pip install llama-stack
Requirement already satisfied: llama-stack in /home/seema1/.local/lib/python3.8/site-packages (0.0.1a5)
Requirement already satisfied: httpx<1,>=0.23.0 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (0.27.2)
Requirement already satisfied: pydantic<3,>=1.9.0 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (2.9.2)
Requirement already satisfied: distro<2,>=1.7.0 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (1.9.0)
Requirement already satisfied: sniffio in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (1.3.1)
Requirement already satisfied: anyio<5,>=3.5.0 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (4.5.2)
Requirement already satisfied: typing-extensions<5,>=4.7 in /home/seema1/.local/lib/python3.8/site-packages (from llama-stack) (4.12.2)
Requirement already satisfied: idna in /usr/lib/python3/dist-packages (from httpx<1,>=0.23.0->llama-stack) (2.8)
Requirement already satisfied: httpcore==1.* in /home/seema1/.local/lib/python3.8/site-packages (from httpx<1,>=0.23.0->llama-stack) (1.0.6)
Requirement already satisfied: certifi in /usr/lib/python3/dist-packages (from httpx<1,>=0.23.0->llama-stack) (2019.11.28)
Requirement already satisfied: pydantic-core==2.23.4 in /home/seema1/.local/lib/python3.8/site-packages (from pydantic<3,>=1.9.0->llama-stack) (2.23.4)
Requirement already satisfied: annotated-types>=0.6.0 in /home/seema1/.local/lib/python3.8/site-packages (from pydantic<3,>=1.9.0->llama-stack) (0.7.0)
Requirement already satisfied: exceptiongroup>=1.0.2; python_version < "3.11" in /home/seema1/.local/lib/python3.8/site-packages (from anyio<5,>=3.5.0->llama-stack) (1.2.2)
Requirement already satisfied: h11<0.15,>=0.13 in /home/seema1/.local/lib/python3.8/site-packages (from httpcore==1.*->httpx<1,>=0.23.0->llama-stack) (0.14.0)
Linux terminal#:/llama-stack$ llama model list --show-all
llama: command not found
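One likely cause of this symptom, given the /home/seema1/.local paths in the pip output above: the package was installed into the per-user site, whose scripts directory (typically ~/.local/bin) may not be on PATH, so the llama entry point never becomes visible. A small diagnostic sketch, assuming Linux (the "posix_user" scheme is POSIX-specific):

```python
import os
import shutil
import sysconfig

# pip --user installs put console scripts in the user scripts dir
# (typically ~/.local/bin); if that dir is not on PATH, the `llama`
# entry point won't be found even though the package imports fine.
scripts_dir = sysconfig.get_path("scripts", scheme="posix_user")
path_dirs = os.environ.get("PATH", "").split(os.pathsep)

print("user scripts dir:", scripts_dir)
print("on PATH:", scripts_dir in path_dirs)
print("llama resolves to:", shutil.which("llama"))
```

If the scripts directory is missing from PATH, adding it (e.g. export PATH="$HOME/.local/bin:$PATH" in ~/.bashrc) is a common fix for user-site installs.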
I've found that it silently fails if anything is wrong with your env. Try ensuring that you have Python 3.10 installed, then create a Python 3.10 venv and build from source with pip install -e .
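The suggested recovery can be sketched as shell steps. This is illustrative, not from the thread: the venv path ~/llama-venv is an arbitrary choice, the script falls back to python3 where python3.10 is unavailable, and the editable install is shown as a comment because it requires a checkout of the llama-stack repository:

```shell
# Recovery sketch based on the suggestion above (assumes Linux).
# Prefer python3.10; fall back to the default python3 if it is absent.
PY=$(command -v python3.10 || command -v python3)

# A venv installs the `llama` console script next to the interpreter,
# sidestepping the ~/.local/bin PATH issues of per-user installs.
"$PY" -m venv "$HOME/llama-venv"
. "$HOME/llama-venv/bin/activate"

# From inside a checkout of the llama-stack repository:
#   pip install -e .
# Then confirm the entry point is visible:
#   which llama
#   llama model list --show-all
python --version
```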