[Bug]: additional_drop_params not working in proxy mode #8199

Open

uzhao opened this issue Feb 2, 2025 · 0 comments

Labels: bug (Something isn't working)
uzhao commented Feb 2, 2025

What happened?

After I upgraded to 1.60, this param no longer works for me.
In my proxy config, I have:

model_list:
  - model_name: o3-mini
    litellm_params:
      model: github/o3-mini
      api_key: "minimini"
      max_completion_tokens: 100000
      reasoning_effort: "high"
      additional_drop_params: ["max_tokens", "temperature", "top_p"]
    model_info:
      id: github-o3-mini

litellm_settings:
  drop_params: true
  request_timeout: 300
  set_verbose: true

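For reference, this is what I expect additional_drop_params to do, sketched at the SDK level (a rough sketch on my side; the key is a placeholder and I have not re-verified this exact call on 1.60):

import litellm

litellm.drop_params = True  # mirrors litellm_settings.drop_params in the config

# Expectation: the listed OpenAI params are stripped before the
# request is forwarded to the GitHub provider.
response = litellm.completion(
    model="github/o3-mini",
    api_key="minimini",  # placeholder, same as in the config above
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=100,  # should be dropped, not sent to the provider
    additional_drop_params=["max_tokens", "temperature", "top_p"],
)
print(response.choices[0].message.content)
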
Then, against the proxy, I run:

curl http://127.0.0.1:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer bigbig" \
  -d '{
    "model": "o3-mini",
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant."
      },
      {
        "role": "user",
        "content": "Hello!"
      }
    ],
    "max_completion_tokens": 100000,
    "reasoning_effort": "high"
  }'

I get a response like this:
{"error":{"message":"litellm.BadRequestError: GithubException - Error code: 400 - {'error': {'message': \"Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.\", 'type': 'invalid_request_error', 'param': 'max_tokens', 'code': 'unsupported_parameter'...

I tried downgrading to 1.59.10, but that did not solve it either. In an older version I had modified proxy_server.py to manually remove these params, but that no longer works.
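As a stopgap I am considering stripping the params myself with a custom pre-call hook (a sketch based on the CustomLogger pre-call hook interface from the proxy docs; the file and class names here are mine, and I have not verified this against 1.60):

# drop_params_hook.py
from litellm.integrations.custom_logger import CustomLogger

class DropParamsHook(CustomLogger):
    async def async_pre_call_hook(self, user_api_key_dict, cache, data, call_type):
        # Remove params the GitHub o3-mini endpoint rejects
        # before the request is forwarded to the provider.
        for param in ("max_tokens", "temperature", "top_p"):
            data.pop(param, None)
        return data

proxy_handler_instance = DropParamsHook()

# Registered in the proxy config via:
#   litellm_settings:
#     callbacks: drop_params_hook.proxy_handler_instance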

Relevant log output

Are you an ML Ops Team?

No

What LiteLLM version are you on?

1.60

Twitter / LinkedIn details

No response
