Improving Ollama nested model behavior #621
Conversation
Please can you explain what this is doing? Ideally with an example of what Ollama returns.
I'm no fan of the Ollama Python library: it's not properly tested and it's indented with 2 spaces. I'd like to keep using the OpenAI SDK if we can.
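For reference, keeping the OpenAI SDK is possible because Ollama exposes an OpenAI-compatible endpoint. A minimal sketch, assuming a local Ollama server (the model name and prompt below are only placeholders):

```python
# Sketch only: talk to a local Ollama server through its OpenAI-compatible
# endpoint, so no dependency on the ollama Python package is needed.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
    api_key="ollama",  # required by the SDK, ignored by Ollama
)

response = client.chat.completions.create(
    model="llama3",  # placeholder: any locally pulled model
    messages=[
        {"role": "user", "content": "Return a JSON object describing a user."}
    ],
)
print(response.choices[0].message.content)
```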
Thank you for taking the time to review this. Great work. However, even with this modification, I am unable to get the provided example to work. It seems, as mentioned in issue #242, that Ollama struggles to return something reliable. I've used the Ollama API extensively for structured requests without encountering issues, but I understand that code quality issues are critical in this context. Here is the full trace for the example I provided (it's clear that on the second attempt, Ollama loses its way and produces random output).
Full Trace
Closing this PR. After testing it more thoroughly, I'm afraid it's just a band-aid fix at best and doesn't actually address the larger issue of Ollama ignoring tools (see #242).
This is just a first attempt at improving how nested object models are handled via Ollama. I'm unsure whether this is the best approach, but I wanted to at least take a stab at it before asking for feedback.
Addresses #607, but nested model performance with Ollama could still be improved a lot beyond what this PR adds. This fixes issues where the Ollama response follows the specified data model but provides oddly-formatted JSON responses. It does not, however, address the fact that Ollama just doesn't seem to like adhering to the model to begin with.
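As a rough illustration of the kind of cleanup this is aimed at (not the actual diff from this PR; the models and helper below are made up), the idea is to normalize an oddly-formatted Ollama response before validating it against the nested model:

```python
# Hypothetical sketch: best-effort cleanup of an Ollama completion that
# follows the data model but wraps or double-encodes the JSON.
import json
import re

from pydantic import BaseModel


class Address(BaseModel):
    city: str
    zip_code: str


class User(BaseModel):
    name: str
    address: Address


def normalize_ollama_json(raw: str) -> dict:
    """Strip formatting artifacts and decode stringified nested objects."""
    # Remove markdown code fences the model sometimes wraps around the JSON.
    raw = re.sub(r"^```(?:json)?\s*|\s*```$", "", raw.strip())
    data = json.loads(raw)
    # Nested objects occasionally come back as JSON-encoded strings; decode them.
    for key, value in data.items():
        if isinstance(value, str) and value.lstrip().startswith("{"):
            try:
                data[key] = json.loads(value)
            except json.JSONDecodeError:
                pass  # leave the value alone if it isn't actually JSON
    return data


# Example of the kind of response this handles: fenced JSON with a
# stringified nested object.
raw_response = '```json\n{"name": "Ada", "address": "{\\"city\\": \\"London\\", \\"zip_code\\": \\"N1\\"}"}\n```'
user = User.model_validate(normalize_ollama_json(raw_response))
print(user)
```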