Issues: containers/ai-lab-recipes

Open issues

Adding a "ReAct agent" recipe
#807 opened Oct 29, 2024 by suppathak
Computer Vision - file format errors
#762 opened Aug 15, 2024 by ScrewTSW
WIP: Add GraphRAG recipe
#756 opened Aug 12, 2024 by srampal
Remove Deepspeed and VLLM
#574 opened Jun 17, 2024 by cooktheryan
MODEL_ENDPOINT hard coded for Makefile run command (enhancement)
#561 opened Jun 13, 2024 by HunterGerlach
llama-cpp-server broken (bug)
#547 opened Jun 11, 2024 by jeffmaury
build a RHEL based Milvus (enhancement)
#538 opened Jun 7, 2024 by cooktheryan
https://quay.io/repository/ai-lab/llamacpp-python-cuda is provided only for x86_64 (bug, enhancement)
#525 opened Jun 3, 2024 by jeffmaury
ilab serve with vllm - error on 3x GPU system (bug)
#514 opened May 23, 2024 by markmc
Recipe idea: LLM agent framework (feature)
#500 opened May 17, 2024 by hemajv