
Fixed issue #8246 #8250

Open · wants to merge 3 commits into main

Conversation

@erezh16 commented Feb 4, 2025

Title

Added cleanup of callbacks for ad-hoc router created for requests with user_config

Relevant issues

Fixes #8246

Type

🐛 Bug Fix

Changes

  1. I added a new remove_callback_from_list_by_object() method to the LoggingCallbackManager class in logging_callback_manager.py. This method removes all bound methods of a given object from a given callback list.
  2. I added a new discard() method to the Router class in router.py. This is a pseudo-destructor that currently only removes the router object's methods from the global callback lists, e.g., litellm._async_success_callback. Its implementation uses the remove_callback_from_list_by_object() method mentioned above.
  3. In route_llm_request.py, I changed the user_config handling (starting at line 59) to call the new discard() method on the user_router, so the ad-hoc router is cleaned up properly before the request result is returned. A rough sketch of these helpers follows this list.
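
The sketch below illustrates the intent of the two new helpers. The method names follow the PR description and the global callback lists are litellm's existing module-level lists, but the bodies here are illustrative only, not the exact code in this PR.

import litellm

def remove_callback_from_list_by_object(callback_list, obj):
    # Remove, in place, every callback that is a bound method of `obj`.
    callback_list[:] = [
        cb
        for cb in callback_list
        if not (callable(cb) and getattr(cb, "__self__", None) is obj)
    ]

def discard(router):
    # Pseudo-destructor: strip the router's bound methods from the global
    # callback lists so ad-hoc routers do not accumulate callbacks there.
    for callback_list in (
        litellm.input_callback,
        litellm.success_callback,
        litellm.failure_callback,
        litellm._async_success_callback,
        litellm._async_failure_callback,
    ):
        remove_callback_from_list_by_object(callback_list, router)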

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

It's a bit hard to create an automated test for messages that are logged. I verified with a debugger that the temporary router's callbacks are properly disposed of at the end of a request. See the test procedure below.
If UI changes, send a screenshot/GIF of working UI fixes

Test procedure:
  1. Start the LiteLLM proxy server under a debugger: python -m pdb proxy_cli.py
  2. Set a breakpoint anywhere in the inference path. I used: b proxy_server.py:3546
  3. Run the proxy: c
  4. Send a request that includes user_config. A sample Python script is attached below, with the API keys masked.
import openai

user_config = {
    'model_list': [
        {
            'model_name': 'rits-llama',
            'litellm_params': {
                'api_key': 'XXXXXXXXXXXXXXXXXXXX',
                'extra_headers': {
                    'RITS_API_KEY': 'XXXXXXXXXXXXXXXX'
                }
            }
        }
    ],
}

client = openai.OpenAI(
    api_key="sk-1234",
    base_url="http://0.0.0.0:4000"
)

# send request to `rits-llama`, passing user_config via extra_body
response = client.chat.completions.create(
    model="rits-llama",
    messages=[
        {
            "role": "user",
            "content": "Who are you?"
        }
    ],
    extra_body={
        "user_config": user_config
    }
)

print(response)
  5. When the debugger stops at the breakpoint, print one or more of the global callback lists to verify that they contain at most one callback from a router (i.e., only from the global router), e.g., p litellm._async_success_callback
  6. Repeat sending requests and examining the global lists (steps 4-5) several times to make sure that no callbacks accumulate in the global callback lists.
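
For example, a quick check at the breakpoint (illustrative only, not part of the PR) is to count the callbacks in a global list that are bound methods of some object, which is how router callbacks appear there:

import litellm

# Callbacks registered by a Router are bound methods, so they expose __self__.
router_bound = [
    cb for cb in litellm._async_success_callback
    if getattr(cb, "__self__", None) is not None
]
print(len(router_bound))  # should not keep growing as more user_config requests are sent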

vercel bot commented Feb 4, 2025

The latest updates on your projects.

Name     Status    Updated (UTC)
litellm  ✅ Ready  Feb 10, 2025 10:11am

Review thread on the new discard() method added in router.py:

@@ -573,6 +573,20 @@ def __init__( # noqa: PLR0915
litellm.amoderation, call_type="moderation"
)

def discard(self):

Review thread on the new remove_callback_from_list_by_object() method added in logging_callback_manager.py:

@@ -85,6 +85,21 @@ def add_litellm_async_failure_callback(
callback=callback, parent_list=litellm._async_failure_callback
)

def remove_callback_from_list_by_object(

@erezh16 (Author) replied:
Looking at the existing tests, I believe a more suitable location for the unit test for remove_callback_from_list_by_object is https://github.com/BerriAI/litellm/blob/main/tests/litellm_utils_tests/test_logging_callback_manager.py
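
A minimal sketch of what such a test could look like (the import path and call signature here are assumptions based on the PR description, not the merged code):

from litellm.litellm_core_utils.logging_callback_manager import LoggingCallbackManager


class DummyRouter:
    # Stand-in for a Router: all that matters is having bound methods.
    def success_handler(self, *args, **kwargs):
        pass


def test_remove_callback_from_list_by_object():
    manager = LoggingCallbackManager()
    router = DummyRouter()
    other = DummyRouter()

    # Two callbacks bound to different objects; only `router`'s should be removed.
    callback_list = [router.success_handler, other.success_handler]
    manager.remove_callback_from_list_by_object(callback_list, router)

    assert router.success_handler not in callback_list
    assert other.success_handler in callback_list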

@krrishdholakia (Contributor) commented:

Hi @erezh16, thank you for your work here - if you can share a screenshot of this passing your unit testing, that would be great!

@erezh16 (Author) commented Feb 6, 2025

The unit tests are ready, but I'm having issues running tests in general. Is there an updated document on testing LiteLLM? For example, pytest claims it does not recognize pytest.mark.asyncio. There are so many errors that I don't think the run even reached the modules containing the new unit tests.

@erezh16 (Author) commented Feb 6, 2025

@krrishdholakia I managed to run both unit tests individually, and they both pass. Is this sufficient? As I commented earlier, running all tests following the README.md documentation does not work.

@erezh16 (Author) commented Feb 6, 2025

Here is a screenshot of test_discard() passing.
[screenshot]
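
For reference, a hypothetical sketch of what a test along these lines could assert (the Router constructor arguments and global list names are assumptions; the actual test in the PR may differ):

import litellm
from litellm import Router


def test_discard():
    router = Router(model_list=[])
    router.discard()

    # After discard(), none of the global callback lists should still hold
    # a callback bound to this router instance.
    for callback_list in (
        litellm.input_callback,
        litellm.success_callback,
        litellm.failure_callback,
        litellm._async_success_callback,
        litellm._async_failure_callback,
    ):
        assert not any(
            getattr(cb, "__self__", None) is router for cb in callback_list
        )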

@erezh16 (Author) commented Feb 6, 2025

Here is a screenshot of test_remove_callback_from_list_by_object() passing.
[screenshot]

@erezh16 (Author) commented Feb 6, 2025

@krrishdholakia, if you tell me how to run the full test suite, I will do it. The documentation on that is broken, and the command from the CircleCI setup does not work either.

@krrishdholakia (Contributor) commented:

Specific unit tests are fine - I can take care of the rest - we can do a better job here with contribution guides / testing.

Thanks for the feedback


Successfully merging this pull request may close these issues.

[Bug]: Accumulating LiteLLM callbacks from ad-hoc routers from requests with user_config
2 participants