
LLM tool call not shown for streamed response #542

Open
jackmpcollins opened this issue Oct 25, 2024 · 2 comments · May be fixed by #545
Labels
Bug Bug related to the Logfire Python SDK

Comments

@jackmpcollins
Contributor

jackmpcollins commented Oct 25, 2024

Description

The Logfire UI nicely shows the tool call by an LLM for non-streamed responses

[Screenshot: Logfire UI showing the tool call for a non-streamed response]

But for streamed responses the Assistant box is empty.

[Screenshot: empty Assistant box for a streamed response]

Code to reproduce

# Test logfire streamed response

from openai import Client

import logfire

logfire.configure()
logfire.instrument_openai()

client = Client()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Create a Superhero named Monkey Boy."}],
    stream=True,
    stream_options={"include_usage": True},
    tool_choice={"type": "function", "function": {"name": "return_superhero"}},
    tools=[
        {
            "type": "function",
            "function": {
                "name": "return_superhero",
                "parameters": {
                    "properties": {
                        "name": {"title": "Name", "type": "string"},
                        "age": {"title": "Age", "type": "integer"},
                        "power": {"title": "Power", "type": "string"},
                        "enemies": {
                            "items": {"type": "string"},
                            "title": "Enemies",
                            "type": "array",
                        },
                    },
                    "required": ["name", "age", "power", "enemies"],
                    "type": "object",
                },
            },
        },
    ],
)
for chunk in response:
    print(chunk)
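(Not part of the original report, but for context on why streamed responses need special handling: in a stream, the tool call arrives as incremental `delta.tool_calls` fragments spread across many chunks, which must be accumulated before there is anything to display. A plain-dict sketch of that accumulation follows; the chunk dicts are simplified stand-ins for OpenAI's `ChatCompletionChunk` objects, not the real types.)

```python
# Sketch: merging streamed tool-call fragments into a complete call.
# The chunk dicts below are simplified stand-ins for OpenAI's
# ChatCompletionChunk delta structure.

def accumulate_tool_calls(chunks):
    """Merge per-chunk tool_call deltas into complete calls, keyed by index."""
    calls = {}
    for chunk in chunks:
        for delta in chunk.get("tool_calls", []):
            call = calls.setdefault(delta["index"], {"name": "", "arguments": ""})
            fn = delta.get("function", {})
            if fn.get("name"):
                call["name"] = fn["name"]
            # Argument JSON is streamed piecewise and must be concatenated.
            call["arguments"] += fn.get("arguments", "")
    return [calls[i] for i in sorted(calls)]

# Simulated stream: the function name arrives first, then argument fragments.
stream = [
    {"tool_calls": [{"index": 0, "function": {"name": "return_superhero", "arguments": ""}}]},
    {"tool_calls": [{"index": 0, "function": {"arguments": '{"name": "Monkey'}}]},
    {"tool_calls": [{"index": 0, "function": {"arguments": ' Boy", "age": 10}'}}]},
]

print(accumulate_tool_calls(stream))
# → [{'name': 'return_superhero', 'arguments': '{"name": "Monkey Boy", "age": 10}'}]
```

An instrumentation layer that only inspects individual chunks (rather than accumulating them like this) would see no complete tool call, which is consistent with the empty Assistant box shown above.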

Related (closed) issue: #54

Python, Logfire & OS Versions, related packages (not required)

logfire="0.50.1"
platform="macOS-15.0.1-arm64-arm-64bit"
python="3.10.12 (main, Jul 15 2023, 09:54:16) [Clang 14.0.3 
(clang-1403.0.22.14.1)]"
[related_packages]
requests="2.32.3"
pydantic="2.8.2"
openai="1.52.0"
protobuf="4.25.3"
rich="13.7.1"
tomli="2.0.1"
executing="2.0.1"
opentelemetry-api="1.25.0"
opentelemetry-exporter-otlp-proto-common="1.25.0"
opentelemetry-exporter-otlp-proto-http="1.25.0"
opentelemetry-instrumentation="0.46b0"
opentelemetry-proto="1.25.0"
opentelemetry-sdk="1.25.0"
opentelemetry-semantic-conventions="0.46b0"
@jackmpcollins jackmpcollins added the Bug Bug related to the Logfire Python SDK label Oct 25, 2024
@alexmojaki
Contributor

I'm AFK, but this sounds familiar and I see an old logfire version. Can you please check whether this still happens with the latest versions of logfire and openai?

@jackmpcollins jackmpcollins linked a pull request Oct 27, 2024 that will close this issue
@jackmpcollins
Contributor Author

Thanks, I should have checked with the latest versions. It is still an issue; I have opened PR #545 to address it.
