Refactor Model Providers to Use Shared Proxy Package + validation for OpenAI like providers #332
Conversation
I like the logic consolidation and the refactoring - just have some confusion around the validation - it seems like it can never return an error (but I may be wrong here).
```go
	Object string `json:"object"`
	Data   []any  `json:"data"`
}
if err := json.Unmarshal(body, &modelsResp); err != nil {
```
You can use json.NewDecoder to decode right from the body (reader) into the struct without reading it first.
Some nits and a suggestion for validation handling - overall I like the new merged approach 👍
The flow is:
```go
body, err := io.ReadAll(resp.Body)
if err != nil {
	return handleValidationError(toolPath, "Invalid Response")
}

var modelsResp api.ModelsResponse
if err := json.Unmarshal(body, &modelsResp); err != nil {
	return handleValidationError(toolPath, "Invalid Response Format")
}
```
nit: so we don't have to process the body twice.
```diff
-body, err := io.ReadAll(resp.Body)
-if err != nil {
-	return handleValidationError(toolPath, "Invalid Response")
-}
-var modelsResp api.ModelsResponse
-if err := json.Unmarshal(body, &modelsResp); err != nil {
-	return handleValidationError(toolPath, "Invalid Response Format")
-}
+var modelsResp api.ModelsResponse
+if err := json.NewDecoder(resp.Body).Decode(&modelsResp); err != nil {
+	return handleValidationError(toolPath, "Invalid Response Format")
+}
```
Force-pushed from a4892dd to b5d8fdd
Remove URL cleaning from proxy package and add it directly to vLLM and Ollama providers since they're the only ones that need user-provided URL handling. Other providers use fixed API endpoints and don't need this functionality.
Co-authored-by: Thorsten Klein <[email protected]>
Co-authored-by: Donnie Adams <[email protected]>
Signed-off-by: Donnie Adams <[email protected]>
Force-pushed from 07c4874 to 1d44104
Signed-off-by: Donnie Adams <[email protected]>
Force-pushed from 1d44104 to 92c1400
This PR consolidates and standardizes the model provider implementations by introducing a shared proxy package.
The shared proxy package reduces code duplication, improves maintainability, and ensures consistent behavior across all model providers.