
[Bug] The lobehub/lobe-chat-database image is broken: the Ollama configuration check keeps failing and keeps requesting 127.0.0.1 #5694

Open
yangjianchun opened this issue Feb 3, 2025 · 8 comments
Labels
🐛 Bug Something isn't working | 缺陷 ollama Relative to Ollama Provider and ollama models

Comments

@yangjianchun

📦 Deployment Environment

Docker

📌 Software Version

v1.49.14

💻 System Environment

Other Linux

🌐 Browser

Chrome

🐛 Problem Description

[screenshot] Filling in the Ollama address doesn't help either. Ollama is also started as a container, so it's definitely not the problem.

📷 Reproduction Steps

No response

🚦 Expected Result

No response

📝 Additional Information

No response

@yangjianchun yangjianchun added the 🐛 Bug Something isn't working | 缺陷 label Feb 3, 2025
@github-project-automation github-project-automation bot moved this to Roadmap - Chat 1.x in Lobe Chat Routine Feb 3, 2025
@lobehubbot
Member

👀 @yangjianchun

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.

@dosubot dosubot bot added the ollama Relative to Ollama Provider and ollama models label Feb 3, 2025

dosubot bot commented Feb 3, 2025

@yangjianchun Hello! I'm Dosu, a bot that can help you fix bugs, answer questions, and become a contributor. While you wait for a human maintainer, I'll do my best to help you resolve this.

There is a known issue with the lobehub/lobe-chat-database Docker image where it incorrectly makes requests to 127.0.0.1 despite providing the correct Ollama address. This is due to a bug that causes the Ollama URL to remain fixed at localhost:11434. To resolve this, ensure you are using Lobe Chat version 1.49.11 or later, which contains a fix for this issue.

To properly configure the service address, use the following environment variables in your Docker command:

# Expose LobeChat on port 3210 and point it at the host's Ollama instance
docker run -d -p 3210:3210 \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 \
  -e OLLAMA_HOST=0.0.0.0:11434 \
  -e OLLAMA_ORIGINS="*" \
  lobehub/lobe-chat
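
Since this report concerns lobehub/lobe-chat-database rather than lobehub/lobe-chat, the same proxy variable presumably applies to the database image as well; a minimal sketch, where the DATABASE_URL value is a placeholder and other required variables (e.g. auth secrets) are elided:

# Sketch only: lobe-chat-database additionally needs a reachable Postgres instance;
# the DATABASE_URL below is a placeholder, not a working credential
docker run -d -p 3210:3210 \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 \
  -e DATABASE_URL=postgres://user:pass@postgres:5432/lobechat \
  lobehub/lobe-chat-database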

Additionally, ensure that the DEFAULT_BASE_URL is correctly set to 'http://127.0.0.1:11434' in the code. If you experience CORS issues, configure the OLLAMA_ORIGINS environment variable to allow cross-origin requests. On macOS, where Ollama is launched via launchd, run:

launchctl setenv OLLAMA_ORIGINS "*"
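
If Ollama itself runs in Docker, as the reporter describes, the launchctl approach does not apply; a sketch of passing the equivalent variables to the Ollama container instead (the port mapping and container name are assumptions):

# Run Ollama in a container, listening on all interfaces with CORS open
docker run -d --name ollama -p 11434:11434 \
  -e OLLAMA_HOST=0.0.0.0 \
  -e OLLAMA_ORIGINS="*" \
  ollama/ollama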

If the issue persists, you can manually set the 'Interface proxy address' to 'http://127.0.0.1:11434' as a workaround [1][2][3].

To continue talking to Dosu, mention @dosu.



@arvinxx
Contributor

arvinxx commented Feb 3, 2025

Have you tried pulling the model locally first?
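
For reference, pulling a model into a container-based Ollama instance usually looks something like this (the container name `ollama` and the model name are assumptions):

# Pull a model inside the running Ollama container
docker exec -it ollama ollama pull llama3
# Or, with Ollama installed directly on the host:
ollama pull llama3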


@yangjianchun
Author

Have you tried pulling the model locally first?

I was using the lobehub/lobe-chat image before and that worked fine. I switched to the official lobehub/lobe-chat-database image for a complete deployment. You can test it yourselves; Ollama is indeed running locally.
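
One quick way to verify whether the lobe-chat-database container can reach Ollama at all is to query the Ollama API from inside it, assuming curl is available in the image (the container name and URL are assumptions):

# List the models Ollama exposes, as seen from inside the LobeChat container
docker exec -it lobe-chat-database curl http://host.docker.internal:11434/api/tags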


@cy948
Contributor

cy948 commented Feb 5, 2025

Turn on client-side request mode first, then try the request again.

