[Bug] The Ollama service address is broken; it always points to http://127.0.0.1:11434 #5409
Comments
👀 @llkllk Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
@llkllk Hi! I'm Dosu, a bot that can help you fix bugs, answer questions, and become a contributor. I'll help while you wait for a human maintainer.

The issue you're experiencing, with the Ollama service always pointing to http://127.0.0.1:11434, can be worked around by setting the appropriate environment variables. For Docker, you can pass them on the command line:

```shell
docker run -d -p 3210:3210 \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 \
  -e OLLAMA_HOST=0.0.0.0:11434 \
  -e OLLAMA_ORIGINS=* \
  lobehub/lobe-chat
```

This configuration points lobe-chat at the Ollama server running on the Docker host, makes Ollama listen on all interfaces, and allows cross-origin requests.

To continue talking to Dosu, mention @dosu.
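For those deploying with docker-compose instead of `docker run`, the same environment variable can be expressed as follows. This is a sketch, not an official compose file: the service name is an assumption, and the `extra_hosts` entry is only needed on Linux, where `host.docker.internal` does not resolve by default.

```yaml
services:
  lobe-chat:
    image: lobehub/lobe-chat
    ports:
      - "3210:3210"
    environment:
      # Point lobe-chat's server-side proxy at the host's Ollama
      - OLLAMA_PROXY_URL=http://host.docker.internal:11434
    extra_hosts:
      # On Linux, map host.docker.internal to the Docker host gateway
      - "host.docker.internal:host-gateway"
```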
The issue you're experiencing, with the Ollama service always pointing to http://127.0.0.1:11434, can be resolved by configuring the service to listen on all interfaces. For example, you can set these environment variables when starting a Docker container:

```shell
docker run -d -p 3210:3210 -e OLLAMA_HOST=0.0.0.0 -e OLLAMA_ORIGINS=* lobehub/lobe-chat
```

This makes the service accessible from other machines and avoids CORS errors [1][2].

To continue talking to Dosu, mention @dosu.
docker image
Same here: when I test, "Ollama is running" is returned, but the connectivity check in LobeChat reports an error.
@1341797533syh-cell If you run Ollama as a service, you need to configure it accordingly.
I'm using Docker, started with:

```shell
docker run -d --gpus=all -v ollama:/root/.ollama -e OLLAMA_ORIGINS="*" -e OLLAMA_HOST=0.0.0.0 -p 11434:11434 --name ollama ollama/ollama
```

Doesn't running it this way work?
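The underlying gotcha in this thread is that, inside the lobe-chat container, 127.0.0.1 is the container's own loopback, not the machine running Ollama, so a loopback URL can never reach the host's Ollama server. The hypothetical helper below (not part of lobe-chat's code) sketches the rewrite that users in this thread are doing by hand with OLLAMA_PROXY_URL:

```python
from urllib.parse import urlparse, urlunparse

# Hostnames that resolve to the container's own loopback interface
LOOPBACK_HOSTS = {"127.0.0.1", "localhost", "::1"}

def rewrite_for_docker(url: str, docker_host: str = "host.docker.internal") -> str:
    """If the URL targets loopback, point it at the Docker host instead.

    Inside a container, 127.0.0.1 refers to the container itself, so an
    Ollama server running on the host machine is unreachable through it.
    """
    parts = urlparse(url)
    if parts.hostname in LOOPBACK_HOSTS:
        netloc = docker_host
        if parts.port:
            netloc += f":{parts.port}"
        return urlunparse(parts._replace(netloc=netloc))
    return url

print(rewrite_for_docker("http://127.0.0.1:11434"))
# → http://host.docker.internal:11434
```

A non-loopback address (e.g. a LAN IP or a remote host, as in the AWS deployment mentioned later in the thread) is left untouched.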
Using the latest image, it still doesn't work:
I have the same problem too.
I have the same problem too. +1
I have the same problem too. +1
Same issue here. Whether I add OLLAMA_PROXY_URL in the .env file or in docker-compose, it always points to 127.0.0.1.
I have the same problem too. +1
I have the same problem too. +1
Ollama works in the Docker version of lobe-chat, but not in lobe-chat-database.
The recent changes here are a mess. The latest lobe-chat doesn't work either. There's only a lobehubbot replying mindlessly, and it can't solve any problems. For issues that are clearly bugs rather than questions, there's no need for this useless bot.
Having the same issue. |
Same here. |
I have an embedding issue with Ollama (#5583). Not sure if this is the root cause. Can someone diagnose and help? Thanks.
I have this problem too. I'm running Ollama with Qwen2.5 deployed on Amazon; it works fine on its own, but LobeChat doesn't recognize it.
Has anyone tried downgrading to an earlier version?
✅ @llkllk This issue is closed. If you have any questions, feel free to comment and we will reply.
🎉 This issue has been resolved in version 1.49.11 🎉 The release is available on: Your semantic-release bot 📦🚀 |
📦 Environment
Docker
📌 Version
1.45.6
💻 Operating System
Ubuntu
🌐 Browser
Chrome
🐛 Bug Description
No matter what Ollama service address I enter, the connectivity check error always reports the host as http://127.0.0.1:11434.
📷 Recurrence Steps
No response
🚦 Expected Behavior
No response
📝 Additional Information
No response