Chore: Add openai o1 support #366
Conversation
Force-pushed from 4a4025b to f0ab964
openai-model-provider/proxy/proxy.go (Outdated)

@@ -96,6 +101,41 @@ func (s *server) proxyDirector(req *http.Request) {
	if s.cfg.PathPrefix != "" && !strings.HasPrefix(req.URL.Path, s.cfg.PathPrefix) {
		req.URL.Path = s.cfg.PathPrefix + req.URL.Path
	}

+	if req.Body == nil || req.Method != http.MethodPost {
Does it make sense to do this?
Suggested change:
-	if req.Body == nil || req.Method != http.MethodPost {
+	if req.Body == nil || s.cfg.UpstreamHost != "api.openai.com" || req.URL.Path != "/v1/chat/completions" {
I like this, because it means that we can look for errors when we try to unmarshal the body later.
yep, I will change this.
Force-pushed from aa67dc3 to bd15449
@thedadams Pushed a change that makes the response compatible with the stream client in gptscript.

openai-model-provider/proxy/proxy.go (Outdated)

	return nil
}

func isModelO1(model string) bool {
The current PR only supports o1, as o1-mini doesn't seem to support system messages, which could break a lot of things. We could investigate that in a different issue.
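A hedged sketch of what such a model check might look like, matching bare `o1` and dated snapshots while excluding `o1-mini` as discussed; this is illustrative only, not the PR's actual implementation:

```go
package main

import (
	"fmt"
	"strings"
)

// isModelO1 (illustrative sketch, not the PR's code): accept "o1" and
// dated snapshots such as "o1-2024-12-17", but exclude "o1-mini",
// which doesn't support system messages.
func isModelO1(model string) bool {
	if model == "o1" {
		return true
	}
	return strings.HasPrefix(model, "o1-") && !strings.HasPrefix(model, "o1-mini")
}

func main() {
	fmt.Println(isModelO1("o1"), isModelO1("o1-mini"), isModelO1("gpt-4o")) // true false false
}
```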
Force-pushed from bd15449 to 5a2759d
openai-model-provider/proxy/proxy.go (Outdated)

}

func (s *server) modifyResponse(resp *http.Response) error {
	if resp.StatusCode != http.StatusOK || resp.Body == nil || resp.Request.URL.Path != chatCompletionsPath || resp.Request.URL.Host != openaiBaseHostName {
nit: resp.Body is always non-nil.

Suggested change:
-	if resp.StatusCode != http.StatusOK || resp.Body == nil || resp.Request.URL.Path != chatCompletionsPath || resp.Request.URL.Host != openaiBaseHostName {
+	if resp.StatusCode != http.StatusOK || resp.Request.URL.Path != chatCompletionsPath || resp.Request.URL.Host != openaiBaseHostName {
openai-model-provider/proxy/proxy.go (Outdated)

	rawBody, err := io.ReadAll(resp.Body)
	if err != nil {
		return fmt.Errorf("failed to read response body: %w", err)
	}
	if err := json.Unmarshal(rawBody, &respBody); err == nil && isModelO1(respBody.Model) {
So we only have to process the body once.
Suggested change:
-	rawBody, err := io.ReadAll(resp.Body)
-	if err != nil {
-		return fmt.Errorf("failed to read response body: %w", err)
-	}
-	if err := json.Unmarshal(rawBody, &respBody); err == nil && isModelO1(respBody.Model) {
+	if err := json.NewDecoder(resp.Body).Decode(&respBody); err == nil && isModelO1(respBody.Model) {
It needs to save the original response so that, in case it is not an o1 model, resp.Body can be restored to the original data.
openai-model-provider/proxy/proxy.go (Outdated)

	}

	if resp.Header.Get("Content-Type") == "application/json" {
		var respBody openai.ChatCompletionResponse
Have to close the body if we are reading and replacing it. I think you need to save the original body in a variable so you don't close the new body.
Suggested change:
-	var respBody openai.ChatCompletionResponse
+	originalBody := resp.Body
+	defer originalBody.Close()
+	var respBody openai.ChatCompletionResponse
yeah, probably better to just close the original body without using defer
openai-model-provider/proxy/proxy.go (Outdated)

	Choices: func() []openai.ChatCompletionStreamChoice {
		var choices []openai.ChatCompletionStreamChoice
		for _, choice := range respBody.Choices {
			choices = append(choices, openai.ChatCompletionStreamChoice{
				Index: choice.Index,
				Delta: openai.ChatCompletionStreamChoiceDelta{
					Content:      choice.Message.Content,
					Role:         choice.Message.Role,
					FunctionCall: choice.Message.FunctionCall,
					ToolCalls:    choice.Message.ToolCalls,
				},
				FinishReason: choice.FinishReason,
			})
		}
		return choices
	}(),
Whoa! I've never seen this before. I think I like it.
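The pattern admired here is a function literal invoked inline to populate a struct field, keeping the per-choice conversion right next to the field it fills. A standalone sketch with simplified stand-in types (not the real openai package types):

```go
package main

import "fmt"

// Simplified stand-ins for the openai package's types, for illustration.
type message struct {
	Role, Content string
}

type streamChoice struct {
	Index   int
	Role    string
	Content string
}

type streamResponse struct {
	Choices []streamChoice
}

func main() {
	msgs := []message{{Role: "assistant", Content: "hello"}}

	resp := streamResponse{
		// A function literal defined and invoked in place: the trailing ()
		// runs it immediately, so the field is assigned the returned slice.
		Choices: func() []streamChoice {
			var choices []streamChoice
			for i, m := range msgs {
				choices = append(choices, streamChoice{Index: i, Role: m.Role, Content: m.Content})
			}
			return choices
		}(),
	}
	fmt.Println(len(resp.Choices), resp.Choices[0].Content) // 1 hello
}
```

The alternative is a named helper called at the assignment site; the inline form trades reusability for locality.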
openai-model-provider/proxy/proxy.go (Outdated)

@@ -63,7 +74,8 @@ func Run(cfg *Config) error {
		ModifyResponse: cfg.RewriteModelsFn,
	})
	mux.Handle("/v1/", &httputil.ReverseProxy{
-		Director: s.proxyDirector,
+		Director:       s.proxyDirector,
+		ModifyResponse: s.modifyResponse,
I think it would be good if we could find a way to not put this on every model provider that uses this proxy code, and only put it on the OpenAI model provider.
yes, it would need us to put the common code into a separate folder/repo with its own go modules. Right now, if I update a dep for openai-model-provider, it pulls other tools in.
wait, actually, yeah I didn't realize the proxy code is shared. Let me think..
ok, thorsten's custom path comes right on time. This switches to use that.
Force-pushed from 1d3f905 to 162d589
Signed-off-by: Daishan Peng <daishan@acorn.io>
Force-pushed from 162d589 to 46eee43
	// Register default handlers only if they are not already registered
	if _, exists := cfg.CustomPathHandleFuncs["/{$}"]; !exists {
		mux.HandleFunc("/{$}", s.healthz)
	}
-	if _, exists := cfg.CustomPathHandleFuncs["/v1/models"]; !exists {
+	if handler, exists := cfg.CustomPathHandleFuncs["/v1/models"]; !exists {
@iwilltry42 can you review this?
LGTM!
I don't like all those if-else statements, but have no cleaner solution right now (unfortunately, mux doesn't take care of the path sorting, and neither allows overrides nor looking up which paths are already handled).
openai-model-provider/proxy/proxy.go (Outdated)

type Config struct {
-	url *url.URL
+	Url *url.URL
nit:

Suggested change:
-	Url *url.URL
+	URL *url.URL
Signed-off-by: Daishan Peng <daishan@acorn.io>
Hello, since this PR was opened, #378 has been merged. I'm not certain it needs to, but should generic-openai-model-provider also be changed?
@pixeye33 that would only make sense if one were to use the generic provider to connect to OpenAI directly instead of using the OpenAI provider, right? Now that I think about it, it may be needed for Azure OpenAI, which has access to o1 as well.
For this PR, at least Ollama and generic-openai-model-provider should be identical (or even be just one). But for my specific case: I'm using https://www.litellm.ai/ in front of groq/OpenAI for all the AI tools I'm playing with, and I'd like to use generic-openai-model-provider to use it for obot too (the same logic applies when using https://openrouter.ai/). However, I'm not even sure an o1-specific fix is necessary, or already done, on litellm: I have not tried yet.
Add openai o1 support in the proxy by modifying the request body.
obot-platform/obot#1131