What happened?

When calling OpenAI models, I get this error:

Error occurred while generating model response. Please try again. Error: Error: 400 litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': "Invalid 'metadata': too many properties. Expected an object with at most 16 properties, but got an object with 28 properties instead.", 'type': 'invalid_request_error', 'param': 'metadata', 'code': 'object_above_max_properties'}} Received Model Group=gpt-4o-mini Available Model Group Fallbacks=None

This happens with gpt-4o, gpt-4o-mini, and o3-mini.
I have the same models configured via Azure OpenAI, and they work fine.
I can also trigger this error from the LiteLLM playground.
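For context (not part of the original report): the 16-property cap comes from OpenAI's own `metadata` parameter on the Chat Completions API, so the limit can be hit directly with the OpenAI SDK. The sketch below is illustrative only; the 28-key dict simply mimics the property count shown in the error above, and it assumes a recent openai Python SDK that exposes the `metadata` parameter.

```python
# Minimal sketch (assumptions: openai Python SDK installed, OPENAI_API_KEY set).
# OpenAI's Chat Completions API caps the `metadata` parameter at 16 properties;
# sending more reproduces the 400 'object_above_max_properties' error above.
from openai import OpenAI

client = OpenAI()

# 28 keys, matching the count reported in the error message (keys are illustrative).
too_many = {f"key_{i}": "value" for i in range(28)}

client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "hello"}],
    metadata=too_many,  # raises openai.BadRequestError: object_above_max_properties
)
```

Presumably Azure OpenAI does not apply the same validation to forwarded metadata, which would be consistent with the Azure-configured deployments working fine.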
Config:
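The original config was not captured here. For reference only, a minimal LiteLLM proxy config.yaml routing a model group to OpenAI typically looks like the sketch below (model names and the environment-variable reference are illustrative, not the reporter's actual settings):

```yaml
model_list:
  - model_name: gpt-4o-mini            # the Model Group name shown in the error
    litellm_params:
      model: openai/gpt-4o-mini        # provider/model string passed to LiteLLM
      api_key: os.environ/OPENAI_API_KEY
```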
Relevant log output
I have these environment variables configured:
Are you an ML Ops Team?
No
What LiteLLM version are you on?
v1.60.0
Twitter / LinkedIn details
No response