reasoning_effort parameter not available for omni model via Unify [requires update] #5677
illgitthat asked this question in Troubleshooting · Unanswered · 1 comment
Comment:
This was already resolved; it just requires an update.
---
What happened?
It looks like the reasoning_effort parameter should be supported, but I don't see it available when I use the o3-mini model via the Unify provider.

Model name: o3-mini@openai
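For context, reasoning_effort is the parameter in the OpenAI chat-completions API that controls how much internal reasoning the o-series models do ("low", "medium", or "high"). A request that includes it would look roughly like the sketch below; the field names follow the public OpenAI API, and the values are only illustrative, not taken from LibreChat's code.

```js
// Rough sketch of an OpenAI-style chat-completions request body that uses
// reasoning_effort; values are illustrative, not LibreChat internals.
const body = {
  model: 'o3-mini@openai', // model name as configured for the Unify endpoint
  reasoning_effort: 'high', // 'low' | 'medium' | 'high'
  messages: [{ role: 'user', content: 'Summarize this issue.' }],
};
```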
Related code: https://github.com/danny-avila/LibreChat/blob/0312d4f4f479f6ffde65c6d617d380ed55d8fec5/api/app/clients/OpenAIClient.js#L108,L109
Maybe this regex just needs to be tweaked?
Relevant discussion: #5623
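I haven't verified the exact pattern at those lines, so the following is only a guess at how the model check could miss a Unify-style name: if the check anchors on the bare model name, the "@openai" suffix would make it fail, while a word-boundary test would still match.

```js
// Hypothetical illustration, not the actual code at OpenAIClient.js#L108-L109.
const model = 'o3-mini@openai';

// An anchored/exact match misses provider-suffixed names used by Unify:
/^o3-mini$/.test(model); // false -> reasoning_effort never gets enabled

// A looser, word-boundary match still recognizes the model family:
/\bo3-mini\b/.test(model); // true
```

Given the "[requires update]" tag and the comment above, the pattern has presumably already been adjusted upstream; this only illustrates where the mismatch could come from.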
Steps to Reproduce
Add the following to your librechat.yml:
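The config block from the original report isn't included above; as a stand-in, a Unify custom endpoint in librechat.yml might look roughly like this (the endpoint name, baseURL, and model list are my assumptions, not copied from the report):

```yaml
# Hypothetical example only; the reporter's actual librechat.yml was not captured.
endpoints:
  custom:
    - name: "Unify"
      apiKey: "${UNIFY_API_KEY}"
      baseURL: "https://api.unify.ai/v0"
      models:
        default: ["o3-mini@openai"]
        fetch: false
      titleConvo: true
      titleModel: "current_model"
```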
What browsers are you seeing the problem on?

No response
Relevant log output
Screenshots
Code of Conduct