fix(vertex_ai/gemini): improve chunk parsing for streaming responses #8401
Conversation
Fix incorrect removal of "data:" strings within response content.
please add a test and/or unit test
```diff
-chunk = chunk.replace("data:", "")
-chunk = chunk.strip()
+if chunk.startswith("data:"):
+    chunk = chunk[len("data:"):].strip()
```
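For context, a minimal sketch of the behavior difference (hypothetical chunk content, not taken from the PR): replacing every occurrence of `"data:"` can mangle response text that legitimately contains the string, while stripping only a leading prefix leaves it intact.

```python
def old_parse(chunk: str) -> str:
    # Old behavior: removes every "data:" occurrence, including ones
    # inside the model's actual response text.
    return chunk.replace("data:", "").strip()

def new_parse(chunk: str) -> str:
    # New behavior: removes only the SSE framing prefix at the start.
    chunk = chunk.strip()
    if chunk.startswith("data:"):
        chunk = chunk[len("data:"):].strip()
    return chunk

sse_chunk = 'data: {"text": "SSE frames start with data: lines"}'
print(old_parse(sse_chunk))  # {"text": "SSE frames start with  lines"}  <- content mangled
print(new_parse(sse_chunk))  # {"text": "SSE frames start with data: lines"}
```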
Is the `.strip()` needed? Wouldn't it remove a leading/trailing space that actually came from the LLM?
In the case where `self.chunk_type == "valid_json"`, the strip wouldn't have any effect on the LLM message content itself (I added a test for it). I updated the handling to be more precise.
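A small illustration of that point (hypothetical chunk, not from the PR): when the chunk body is valid JSON, the stripped whitespace belongs to the SSE framing rather than to the JSON string values, so the decoded message content is unaffected.

```python
import json

raw = '  data: {"text": "  spaces inside the value survive  "}\n'
body = raw.strip()
if body.startswith("data:"):
    body = body[len("data:"):].strip()

# Only framing whitespace was removed; the value itself is intact.
assert json.loads(body)["text"] == "  spaces inside the value survive  "
```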
Enhance test coverage for ModelResponseIterator's chunk parsing logic, focusing on preserving 'data:' prefixes and surrounding spaces in Vertex AI and Google Studio Gemini streaming responses
Added tests that cover these cases. I understand there is another code path, but I haven't encountered a situation where the code goes down it, so your input would help me add a test for that case too.
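A sketch of what such a unit test could look like, using a hypothetical helper that mirrors the PR's parsing logic (the real code lives in the ModelResponseIterator; the helper name and cases below are illustrative assumptions, not from the PR):

```python
import pytest

def strip_sse_data_prefix(chunk: str) -> str:
    # Hypothetical stand-in for the parsing step changed in this PR.
    chunk = chunk.strip()
    if chunk.startswith("data:"):
        chunk = chunk[len("data:"):].strip()
    return chunk

@pytest.mark.parametrize(
    "raw,expected",
    [
        ('data: {"text": "hi"}', '{"text": "hi"}'),              # leading prefix removed
        ('data:{"text": "hi"}', '{"text": "hi"}'),               # prefix without a space
        ('{"text": "data: stays"}', '{"text": "data: stays"}'),  # embedded "data:" preserved
    ],
)
def test_strip_sse_data_prefix(raw, expected):
    assert strip_sse_data_prefix(raw) == expected
```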
This PR fixes bug #8143 by updating the vertex_ai streaming chunk parsing logic to remove only a leading `data:` prefix.

Relevant issues
#8143
Type
🐛 Bug Fix
Changes