[Roadmap] LocalAI access o1-mini, o1 #711
Comments
Hi @wac81, please add some details. As it stands this doesn't seem to be a Big-AGI issue, as we support LocalAI as well as o1 models.
I use a third-party provider; the URL is https://api.myhispreadnlp.com/. It can use gpt-4o, but not o1-mini or o1. When I try to access o1, I get this error:
Alternatively, could I set up a proxy for this third-party URL and configure it to replace the OpenAI URL itself? Is this feasible?
@wac81 instead of a LocalAI service, can you set up an OpenAI service with a custom server URL? It may fix the problem. The o1 and o1-preview models have breaking API changes that are not discoverable and require client tweaks, and we apply them in "openai" services, not LocalAI services. This is the case in the error above, as OpenAI rejects the 'system' message for o1 models. Let me know if this fixes it already.
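The client tweak mentioned above can be sketched as a small pre-processing step before sending the request. This is a minimal illustration, not Big-AGI's actual implementation: the `adapt_messages_for_o1` helper and the choice to remap `system` to `user` are assumptions, and o1's exact API restrictions may vary by provider.

```python
def adapt_messages_for_o1(model: str, messages: list[dict]) -> list[dict]:
    """Remap 'system' messages for o1-family models, which reject the
    'system' role. Hypothetical helper; the 'o1' prefix check and the
    system-to-user remapping are assumptions for illustration."""
    if not model.startswith("o1"):
        return messages  # gpt-4o and others accept 'system' as-is
    adapted = []
    for msg in messages:
        if msg["role"] == "system":
            # Fold the system prompt into a user message instead
            adapted.append({"role": "user", "content": msg["content"]})
        else:
            adapted.append(dict(msg))
    return adapted
```

Applying a tweak like this only in the "openai" service path is why routing the third-party URL through an OpenAI service (rather than a LocalAI service) can make o1 work.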
Okay, I'll give it a try.
@wac81 please let me know if you need further support on this.
Description
I have a third-party OpenAI provider, and I use LocalAI to access the third-party URL and key.
However, I found that LocalAI cannot access o1-mini or o1.
Requirements
LocalAI can access o1-mini and o1.