
[Bug]: JSON Decode Error with Deepseek Model in OpenHands #7070

Open
1 task done
jatinkrmalik opened this issue Mar 3, 2025 · 1 comment

Labels
bug Something isn't working

Comments

@jatinkrmalik
Is there an existing issue for the same bug?

  • I have checked the existing issues.

Describe the bug and reproduction steps

When running OpenHands, I'm encountering a JSON decode error with the Deepseek model. The agent fails to run with the following error: litellm.APIError: APIError: DeepseekException - Unable to get json response - Expecting value: line 1 column 1 (char 0), Original Response:.

The error occurs during the agent execution process, specifically when the CodeActAgent attempts to get a completion from the LLM. The JSON parser is unable to parse the empty response returned from the Deepseek API.
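
A minimal way to check whether Deepseek itself is returning the empty body, independent of OpenHands, would be a direct litellm call along these lines (sketch only; the model string and API key are placeholders, not my exact settings):

```python
# Sketch: call Deepseek through litellm directly, outside OpenHands.
# "deepseek/deepseek-chat" and the key are placeholder assumptions.
import litellm

try:
    resp = litellm.completion(
        model="deepseek/deepseek-chat",
        api_key="sk-...",
        messages=[{"role": "user", "content": "Say hello"}],
    )
    print(resp.choices[0].message.content)
except litellm.APIError as e:
    # An empty HTTP body from the provider surfaces here as
    # "Unable to get json response - Expecting value: line 1 column 1 (char 0)"
    print(f"APIError: {e}")
```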

OpenHands Installation

Docker command in README

OpenHands Version

0.27.0

Operating System

WSL on Windows

Logs, Errors, Screenshots, and Additional Context

The error occurs in the agent controller when trying to run the agent. The key part of the error is:

litellm.APIError: APIError: DeepseekException - Unable to get json response - Expecting value: line 1 column 1 (char 0), Original Response:

This suggests that the Deepseek API is returning an empty response that can't be parsed as JSON. The error propagates through several layers:

  1. JSON decode error in litellm's response parsing
  2. Handled by litellm's exception mapping
  3. Bubbles up to the agent controller
  4. Agent state changes from RUNNING to ERROR
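
Since the APIError is what eventually reaches the agent controller, one possible mitigation would be to retry on this exception type around the completion call. This is a sketch only, under the assumption that the existing tenacity wrapper visible in the traceback (openhands/llm/llm.py) does not currently retry on APIError; it is not the actual OpenHands code:

```python
# Sketch only: a tenacity retry around a litellm completion call that retries
# when the provider returns an unparseable/empty body (surfaces as APIError).
import litellm
from tenacity import retry, retry_if_exception_type, stop_after_attempt, wait_exponential

@retry(
    retry=retry_if_exception_type(litellm.APIError),
    stop=stop_after_attempt(3),
    wait=wait_exponential(multiplier=1, max=10),
)
def complete_with_retry(**params):
    # params mirrors what CodeActAgent passes to self.llm.completion(**params)
    return litellm.completion(**params)
```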

Full error log showing the complete traceback through the various components:

06:14:40 - openhands:ERROR: agent_controller.py:242 - [Agent Controller fb308a4b857f4a8bb8e21757b6307641] Error while running the agent (session ID: fb308a4b857f4a8bb8e21757b6307641): litellm.APIError: APIError: DeepseekException - Unable to get json response - Expecting value: line 1 column 1 (char 0), Original Response: . Traceback: Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai/chat/gpt_transformation.py", line 235, in transform_response
    completion_response = raw_response.json()
                          ^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/httpx/_models.py", line 832, in json
    return jsonlib.loads(self.content, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 1404, in completion
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 1379, in completion
    response = base_llm_http_handler.completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 372, in completion
    return provider_config.transform_response(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai/chat/gpt_transformation.py", line 238, in transform_response
    raise OpenAIError(
litellm.llms.openai.common_utils.OpenAIError: Unable to get json response - Expecting value: line 1 column 1 (char 0), Original Response:

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/openhands/controller/agent_controller.py", line 240, in _step_with_exception_handling
    await self._step()
  File "/app/openhands/controller/agent_controller.py", line 677, in _step
    action = self.agent.step(self.state)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/openhands/agenthub/codeact_agent/codeact_agent.py", line 130, in step
    response = self.llm.completion(**params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 336, in wrapped_f
    return copy(f, *args, **kw)
           ^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 475, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 376, in iter
    result = action(retry_state)
             ^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 398, in <lambda>
    self._add_action_func(lambda rs: rs.outcome.result())
                                     ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 478, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "/app/openhands/llm/llm.py", line 242, in wrapper
    resp: ModelResponse = self._completion_unwrapped(*args, **kwargs)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 1190, in wrapper
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 1068, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 3085, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2202, in exception_type
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 452, in exception_type
    raise APIError(
litellm.exceptions.APIError: litellm.APIError: APIError: DeepseekException - Unable to get json response - Expecting value: line 1 column 1 (char 0), Original Response:

06:14:40 - openhands:INFO: agent_controller.py:466 - [Agent Controller fb308a4b857f4a8bb8e21757b6307641] Setting agent(CodeActAgent) state from AgentState.RUNNING to AgentState.ERROR
06:14:40 - openhands:INFO: agent_controller.py:466 - [Agent Controller fb308a4b857f4a8bb8e21757b6307641] Setting agent(CodeActAgent) state from AgentState.ERROR to AgentState.ERROR
06:14:40 - OBSERVATION
[Agent Controller fb308a4b857f4a8bb8e21757b6307641] AgentStateChangedObservation(content='', agent_state='running', observation='agent_state_changed')
06:14:40 - OBSERVATION
[Agent Controller fb308a4b857f4a8bb8e21757b6307641] AgentStateChangedObservation(content='', agent_state='error', observation='agent_state_changed')
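
To confirm that the response body really is empty (the "Original Response:" part of the error is blank), the endpoint could also be hit directly with httpx, which is what litellm uses under the hood here. This is a sketch; the endpoint URL and payload shape are assumptions based on Deepseek's OpenAI-compatible API, and the key is a placeholder:

```python
# Sketch: hit the Deepseek endpoint directly to inspect the raw response.
import httpx

resp = httpx.post(
    "https://api.deepseek.com/chat/completions",   # assumed endpoint
    headers={"Authorization": "Bearer sk-..."},
    json={
        "model": "deepseek-chat",
        "messages": [{"role": "user", "content": "ping"}],
    },
    timeout=60,
)
print(resp.status_code)
print(repr(resp.text))  # an empty string here would match the JSONDecodeError at char 0
```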

Here's my config:

[Image: screenshot of my LLM configuration]

@jatinkrmalik jatinkrmalik added the bug Something isn't working label Mar 3, 2025
@jatinkrmalik
Author

Update: On retry, I saw the following error in chat:


RuntimeError: There was an unexpected error while running the agent. Please report this error to the developers. Your session ID is fb308a4b857f4a8bb8e21757b6307641. Error type: APIError
