
Refactor Model Providers to Use Shared Proxy Package + validation for OpenAI like providers #332

Merged (31 commits) on Jan 17, 2025

Conversation

@sanjay920 sanjay920 commented Jan 10, 2025

This PR consolidates and standardizes the model provider implementations by introducing a shared proxy package. Key changes:

  • Created a shared proxy package from OpenAI provider's server code
  • Migrated Groq, DeepSeek, xAI, Ollama, and vLLM providers from their individual implementations to use the shared proxy
  • Added consistent validation support across all relevant providers (6 birds, 1 stone WRT "Add validation to openai-like model providers" obot#1158)
  • Removed duplicate server implementations
  • Converted Python-based providers (Groq, Ollama) to Go
  • Added model filtering support for provider-specific model types

The shared proxy package reduces code duplication, improves maintainability, and ensures consistent behavior across all model providers.

@sanjay920 sanjay920 changed the title refactor openai model provider Refactor Model Providers to Use Shared Proxy Package Jan 10, 2025
@sanjay920 sanjay920 changed the title Refactor Model Providers to Use Shared Proxy Package Refactor Model Providers to Use Shared Proxy Package + validation for OpenAI like providers Jan 10, 2025
@sanjay920 sanjay920 marked this pull request as ready for review January 10, 2025 23:41
@iwilltry42 (Contributor) left a comment:

I like the logic consolidation and the refactoring - just have some confusion around the validation - it seems like it can never return an error (but I may be wrong here).

openai-model-provider/proxy/validate.go (two review threads; outdated, resolved):
Object string `json:"object"`
Data []any `json:"data"`
}
if err := json.Unmarshal(body, &modelsResp); err != nil {

Contributor:

You can use json.NewDecoder to decode right from the body (reader) into the struct without reading it first.

xai-model-provider/main.go (outdated, resolved)
@sanjay920 sanjay920 requested a review from iwilltry42 January 15, 2025 06:31
@iwilltry42 (Contributor) left a comment:

Some nits and a suggestion for validation handling - overall I like the new merged approach 👍

Resolved review threads:

  • openai-model-provider/proxy/validate.go (outdated)
  • openai-model-provider/proxy/validate.go (outdated)
  • openai-model-provider/proxy/proxy.go
  • deepseek-model-provider/main.go (outdated)
  • deepseek-model-provider/main.go (outdated)
  • ollama-model-provider/go.mod
  • openai-model-provider/proxy/proxy.go (outdated)
  • openai-model-provider/proxy/proxy.go (outdated)
  • openai-model-provider/proxy/proxy.go (outdated)
  • openai-model-provider/proxy/rewrite.go (outdated)
  • openai-model-provider/proxy/validate.go (outdated)
@sanjay920 (author) replied:

The flow is:

  1. Tool is invoked with validate argument
  2. Config is created with provider-specific settings (host, API key, etc.)
  3. Validate() is called which:
    • Makes HTTP request to provider's models endpoint
    • If any error occurs (bad credentials, network error, etc.):
      • Logs the error (for debugging)
      • Prints JSON-formatted error (for tool consumption)
      • Returns error to trigger exit(1)
  4. Main program exits with status 1 if validation failed

Comment on lines 46 to 48
body, err := io.ReadAll(resp.Body)
if err != nil {
return handleValidationError(toolPath, "Invalid Response")
}

var modelsResp api.ModelsResponse
if err := json.Unmarshal(body, &modelsResp); err != nil {
return handleValidationError(toolPath, "Invalid Response Format")
}
Contributor:
nit: so we don't have to process the body twice.

Suggested change (decode straight from resp.Body instead of reading it into a buffer first):

var modelsResp api.ModelsResponse
if err := json.NewDecoder(resp.Body).Decode(&modelsResp); err != nil {
	return handleValidationError(toolPath, "Invalid Response Format")
}

@thedadams thedadams force-pushed the openai-proxy-refactor branch 5 times, most recently from 07c4874 to 1d44104 Compare January 17, 2025 18:59
Signed-off-by: Donnie Adams <[email protected]>
@thedadams thedadams force-pushed the openai-proxy-refactor branch from 1d44104 to 92c1400 Compare January 17, 2025 18:59
@thedadams thedadams merged commit 7b16a5c into obot-platform:main Jan 17, 2025
1 check passed

3 participants