Any plans to add self-hosted LLM options to AIGateway K8s custom resource? #1087

davidxia opened this issue Jan 23, 2025 · 0 comments
Problem Statement

I want to configure self-hosted LLM backends with Kong. I deployed Kong with the Gateway Operator.

I see the AIGateway K8s custom resource has a .spec.largeLanguageModels.cloudHosted field. The field's description states:

This is currently a required field, requiring at least one cloud-hosted LLM be specified, however in future iterations we may add other hosting options such as self-hosted LLMs as separate fields.

Are there any plans to add self-hosted LLM options to AIGateway? Alternatively, is there another way to configure self-hosted LLM backends with Kong when it's deployed via the Gateway Operator?
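For illustration, here is a rough sketch of the kind of configuration I have in mind. The selfHosted block and all of its fields (endpoint, model, identifier names) are hypothetical and do not exist in the current CRD; the cloudHosted portion and the apiVersion are likewise approximate rather than copied from the API reference. Only the .spec.largeLanguageModels.cloudHosted path itself is taken from the existing resource.

```yaml
apiVersion: gateway-operator.konghq.com/v1alpha1  # approximate; check your installed CRD version
kind: AIGateway
metadata:
  name: example-aigateway
spec:
  largeLanguageModels:
    # Existing cloud-hosted option, shown only for context.
    # Field names inside each entry are approximate.
    cloudHosted:
      - identifier: openai-model
        model:
          provider: openai
          name: gpt-4o
    # Hypothetical self-hosted option -- this field does not exist today.
    # The idea is to point the gateway at an in-cluster, OpenAI-compatible
    # server (e.g. vLLM or Ollama exposed via a Service).
    selfHosted:
      - identifier: local-llama
        endpoint: http://vllm.llm-serving.svc.cluster.local:8000/v1
        model:
          name: llama-3.1-8b-instruct
```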

Acceptance Criteria

  • I can configure self-hosted LLM backends with the AIGateway CR