
[Bug]: ModuleNotFoundError: No module named 'cgi' #8198

Closed
Rishang opened this issue Feb 2, 2025 · 3 comments
Labels
bug Something isn't working

Comments


Rishang commented Feb 2, 2025

What happened?

Python version: 3.13
OS: Fedora 41

LiteLLM fails to import on the latest version because of the `cgi` module, which was removed from the standard library in Python 3.13: https://docs.python.org/3/library/cgi.html

I found a workaround for this, but I'm still reporting it in hopes of a proper fix.
https://stackoverflow.com/questions/78635741/drop-in-replacement-for-the-cgi-module
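For reference, a minimal sketch of the drop-in replacement discussed in that Stack Overflow thread, assuming only `parse_header` is needed (it rebuilds the same behavior from the stdlib `email.message` module, which is the commonly suggested substitute; this is not LiteLLM's official fix):

```python
from email.message import Message


def parse_header(line: str) -> tuple[str, dict[str, str]]:
    """Approximate drop-in for cgi.parse_header, removed in Python 3.13.

    Parses a header value like 'text/html; charset=utf-8' into the main
    value and a dict of parameters, using email.message under the hood.
    """
    msg = Message()
    msg["content-type"] = line
    # get_params() returns [(main_value, ''), (key, value), ...]
    params = msg.get_params()
    return params[0][0], dict(params[1:])


print(parse_header("text/html; charset=utf-8"))
```

The `legacy-cgi` package on PyPI is another option mentioned in the thread if a full `cgi` module replacement is required.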

Relevant log output

Traceback (most recent call last):
  File "/home/noobi/.local/bin/heygpt", line 5, in <module>
    from heygpt.cli import app
  File "/home/noobi/.local/lib/python3.13/site-packages/heygpt/cli.py", line 16, in <module>
    from heygpt.core import (
    ...<6 lines>...
    )
  File "/home/noobi/.local/lib/python3.13/site-packages/heygpt/core.py", line 8, in <module>
    from litellm import completion
  File "/home/noobi/.local/lib/python3.13/site-packages/litellm/__init__.py", line 779, in <module>
    from .llms.bedrock.chat.converse_transformation import AmazonConverseConfig
  File "/home/noobi/.local/lib/python3.13/site-packages/litellm/llms/bedrock/chat/__init__.py", line 1, in <module>
    from .converse_handler import BedrockConverseLLM
  File "/home/noobi/.local/lib/python3.13/site-packages/litellm/llms/bedrock/chat/converse_handler.py", line 20, in <module>
    from .invoke_handler import AWSEventStreamDecoder, MockResponseIterator, make_call
  File "/home/noobi/.local/lib/python3.13/site-packages/litellm/llms/bedrock/chat/invoke_handler.py", line 34, in <module>
    from litellm.litellm_core_utils.prompt_templates.factory import (
    ...<7 lines>...
    )
  File "/home/noobi/.local/lib/python3.13/site-packages/litellm/litellm_core_utils/prompt_templates/factory.py", line 2156, in <module>
    from cgi import parse_header
ModuleNotFoundError: No module named 'cgi'

Are you a ML Ops Team?

No

What LiteLLM version are you on ?

v1.60.0

Twitter / LinkedIn details

No response

@MitchelHsu

+1, have the same issue here.


jax0m commented Feb 3, 2025

Same issue here; I'll go with the solution in the aforementioned Stack Overflow link. It looks like the `cgi` module was removed in Python 3.13. https://stackoverflow.com/a/78641139

@krrishdholakia
Contributor

Hey everyone, the `cgi` import was removed in this PR: #8160

This should be live in v1.60.4+.
