
Enforcing cost limits #1152

Open
jjallaire opened this issue Jan 19, 2025 · 2 comments
@jjallaire (Collaborator)

Set a cost limit for a sample or a task and have it automatically terminate when the limit is reached.

Note that there is no general way to determine model pricing dynamically (and if we hard-code prices into the package they are likely to go stale). Therefore we'll likely want users to tell us their per-token cost when using this feature.
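A minimal sketch of what enforcing this could look like, assuming user-supplied per-token prices (the `CostLimit` class and its field names are hypothetical, not part of the package):

```python
# Hypothetical sketch: accumulate cost from token usage against a
# user-supplied budget, terminating once the limit is reached.

from dataclasses import dataclass, field


@dataclass
class CostLimit:
    """Tracks spend from token counts and raises once the budget is hit."""
    input_cost_per_token: float   # e.g. 2.5e-6 for a $2.50/M-input-token model
    output_cost_per_token: float
    limit_usd: float
    spent_usd: float = field(default=0.0)

    def record(self, input_tokens: int, output_tokens: int) -> None:
        # Add the cost of this model call to the running total.
        self.spent_usd += (
            input_tokens * self.input_cost_per_token
            + output_tokens * self.output_cost_per_token
        )
        if self.spent_usd >= self.limit_usd:
            # In a real implementation this would terminate the sample/task
            # gracefully rather than raise a bare RuntimeError.
            raise RuntimeError(
                f"cost limit reached: ${self.spent_usd:.4f} >= ${self.limit_usd:.2f}"
            )
```

The key design point is that the prices are inputs from the user, so the package never has to maintain (and inevitably mis-maintain) a pricing table.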

@Taytay commented Feb 5, 2025

LiteLLM handles pricing of various models with a json file they keep and serve here: https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json

You can add to it or change it at runtime.

I'm fine keeping this on the user for now if that makes this easier to build. But BOY, I would love to see a token AND price breakdown after the fact!
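A sketch of the kind of breakdown this could produce, using the per-token keys from LiteLLM's pricing JSON. The model names and prices below are illustrative placeholders; in practice you would load (and optionally override) the actual `model_prices_and_context_window.json` file:

```python
# Sketch: compute a per-model cost breakdown from token usage, using
# a dict shaped like LiteLLM's model_prices_and_context_window.json.
# The entries here are made-up examples, not real model rates.

import json

PRICES_JSON = """
{
  "example/model-a": {"input_cost_per_token": 2.5e-06, "output_cost_per_token": 1e-05},
  "example/model-b": {"input_cost_per_token": 1.5e-07, "output_cost_per_token": 6e-07}
}
"""


def cost_breakdown(usage: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Map model name -> (input_tokens, output_tokens) to cost in USD per model."""
    prices = json.loads(PRICES_JSON)
    return {
        model: tokens_in * prices[model]["input_cost_per_token"]
        + tokens_out * prices[model]["output_cost_per_token"]
        for model, (tokens_in, tokens_out) in usage.items()
    }
```

Because the table is plain JSON, a user could patch in their own rates at runtime, which also covers the "tell us your per-token cost" fallback from the original issue.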

@zarSou9 commented Feb 8, 2025

I wonder if it would be worth switching to LiteLLM in general.

4 participants