
[Bug]: Error type: APIConnectionError #7067

Open
GrahamboJangles opened this issue Mar 3, 2025 · 3 comments
Labels: bug (Something isn't working)

Comments

@GrahamboJangles

Is there an existing issue for the same bug?

  • I have checked the existing issues.

Describe the bug and reproduction steps

RuntimeError: There was an unexpected error while running the agent. Please report this error to the developers. Your session ID is b024e0c130f444dd8851e5f2911643dd. Error type: APIConnectionError

Windows 11 Pro
OpenHands 0.27.0

WSL version: 2.4.11.0
Kernel version: 5.15.167.4-1
WSLg version: 1.0.65
MSRDC version: 1.2.5716
Direct3D version: 1.611.1-81528511
DXCore version: 10.0.26100.1-240331-1435.ge-release
Windows version: 10.0.26100.3194

Docker version 27.5.1, build 9f9e405

OpenHands Installation

Docker command in README

OpenHands Version

main

Operating System

WSL on Windows

Logs, Errors, Screenshots, and Additional Context

I can manually curl the model from the terminal, but I cannot get the local LM Studio API to work with OpenHands.
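
For reference, the manual check I mean is roughly this (a sketch: LM Studio's default OpenAI-compatible endpoint on port 1234 is assumed, and the model name is a placeholder):

    # Sketch: LM Studio's OpenAI-compatible chat endpoint on its default port 1234.
    # "your-model-name" stands in for whatever model is loaded in LM Studio.
    curl http://localhost:1234/v1/chat/completions \
      -H "Content-Type: application/json" \
      -d '{"model": "your-model-name", "messages": [{"role": "user", "content": "Hello"}]}'

A request along these lines returns a normal chat completion from the terminal, so the server itself is reachable on the host side.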

These are my settings in OpenHands (see the attached screenshot).


GrahamboJangles added the bug label on Mar 3, 2025
@enyst
Collaborator

enyst commented Mar 3, 2025

Please try using:

  • the prefix "openai/" with the model name, because afaik LM Studio has an OpenAI-compatible API, and that's how LiteLLM, which we use to support models, knows how to format the request (see the sketch below).
  • port 1234, which is the default for LM Studio afaik (or whatever port you have configured in LM Studio). 11434 is the default port for Ollama.
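
As a rough sketch of both points (the model name and API key are placeholders, and host.docker.internal is assumed here because it is Docker Desktop's usual alias for the Windows host; localhost inside the OpenHands container refers to the container itself): with the model set to openai/your-model-name and the base URL set to http://host.docker.internal:1234/v1, LiteLLM strips the openai/ prefix and ends up sending an OpenAI-style request roughly like this:

    # Roughly what LiteLLM sends for model "openai/your-model-name" with base URL
    # http://host.docker.internal:1234/v1 (model name, host, and key are placeholders).
    curl http://host.docker.internal:1234/v1/chat/completions \
      -H "Content-Type: application/json" \
      -H "Authorization: Bearer dummy-key" \
      -d '{"model": "your-model-name", "messages": [{"role": "user", "content": "Hello"}]}'

Afaik LM Studio doesn't check the key, but LiteLLM wants a non-empty value, so any dummy string should do.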

@GrahamboJangles
Author

@enyst Thank you for your reply.

After applying those changes in OpenHands, I get this error:
BadRequestError: litellm.BadRequestError: OpenAIException -

@enyst
Collaborator

enyst commented Mar 4, 2025

Can you please tell us more about that exception? From the logs, perhaps?
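
If it helps, something along these lines should dump the relevant output (the container name is a placeholder; docker ps shows the actual one):

    # List running containers to find the OpenHands app container,
    # then print its most recent log lines.
    docker ps
    docker logs --tail 200 <openhands-app-container>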
