[Bug]: reasoning_effort parameter not available for omni model via Unify #5647
Labels: 🐛 bug
What happened?
It looks like the `reasoning_effort` parameter should be supported, but I don't see it available when I use the o3-mini model via the Unify provider.

Model name: `o3-mini@openai`

Related code: https://github.com/danny-avila/LibreChat/blob/0312d4f4f479f6ffde65c6d617d380ed55d8fec5/api/app/clients/OpenAIClient.js#L108,L109

Maybe this regex just needs to be tweaked?
Relevant discussion: #5623
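For illustration, here is a standalone sketch (not the actual pattern from OpenAIClient.js) of how an anchored versus an unanchored word-boundary check treats Unify's `model@provider` naming; both patterns below are assumptions:

```js
// Standalone sketch (not the actual LibreChat pattern): compare how two
// hypothetical "omni model" checks treat Unify-style model names.
const looseOmniPattern = /\b(o1|o3)\b/i;        // word-boundary match anywhere
const strictOmniPattern = /^(o1|o3)(-mini)?$/i; // anchored to the whole name

for (const model of ['o3-mini', 'o3-mini@openai', 'gpt-4o@openai']) {
  console.log(
    model,
    '| loose:', looseOmniPattern.test(model),
    '| strict:', strictOmniPattern.test(model),
  );
}
// o3-mini        | loose: true  | strict: true
// o3-mini@openai | loose: true  | strict: false  (the "@openai" suffix breaks the anchored check)
// gpt-4o@openai  | loose: false | strict: false
```

If the real check is anchored to the full model name, either stripping the `@provider` suffix before testing or allowing it in the pattern would be one way to let `o3-mini@openai` through.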
Steps to Reproduce
Add the following to your `librechat.yml`:
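A minimal sketch of a Unify custom-endpoint entry, assuming LibreChat's standard custom-endpoint schema; the baseURL and the surrounding field values are assumptions, not taken from the original report:

```yaml
# Sketch only: a Unify custom endpoint exposing o3-mini@openai.
# The baseURL and other values are assumptions; check the Unify and
# LibreChat documentation for the authoritative settings.
endpoints:
  custom:
    - name: "Unify"
      apiKey: "${UNIFY_API_KEY}"
      baseURL: "https://api.unify.ai/v0/"
      models:
        default: ["o3-mini@openai"]
        fetch: true
      titleConvo: true
      titleModel: "current_model"
```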
What browsers are you seeing the problem on?
No response
Relevant log output
Screenshots
Code of Conduct