Issues: meta-llama/llama-stack
- #328 Guardrail Loading Failed with Unexpectedly Large GPU Memory Requirement on Multi-GPU Server (2 tasks), opened Oct 25, 2024 by dawenxi-007
- #322 Server webmethod endpoint and llama-stack-spec.yaml file mismatch (2 tasks), opened Oct 25, 2024 by cheesecake100201
- #321 What configs to input when building from distributions/meta-reference-gpu/build.yaml (2 tasks), opened Oct 25, 2024 by AlexHe99
- #268 Create a remote memory provider for Pinecone [good first issue], opened Oct 18, 2024 by raghotham
- #257 PyTorch CUDA not found on host that has CUDA with working PyTorch [question], opened Oct 16, 2024 by nikolaydubina
- #253 Missing target image architecture [good first issue], opened Oct 16, 2024 by nikolaydubina
- #248 [W socket.cpp:697] [c10d] The IPv6 network addresses of (1.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.ip6.arpa, 50714) cannot be retrieved (gai error: 8 - nodename nor servname provided, or not known) [question], opened Oct 15, 2024 by Rohan-Jalil
- #246 AttributeError: 'ChatCompletionResponse' object has no attribute 'event' [question], opened Oct 14, 2024 by AI-Aether
- #234 Tool Registry for Agents [enhancement], opened Oct 10, 2024 by onkarbhardwaj
- #214 Add top_k output tokens with corresponding logprobs [enhancement], opened Oct 8, 2024 by yanxi0830
- #208 vllm: expand configuration support [enhancement], opened Oct 7, 2024 by russellb
- #200 vllm: improve container support [enhancement], opened Oct 6, 2024 by russellb
- #199 vllm: test and fix tool support [enhancement], opened Oct 6, 2024 by russellb