[Bug] The Ollama service address is broken: it always points to http://127.0.0.1:11434 #5409

Closed
llkllk opened this issue Jan 13, 2025 · 38 comments · Fixed by #5665
Labels
🐛 Bug Something isn't working | 缺陷 · ollama Relative to Ollama Provider and ollama models · released

Comments

@llkllk

llkllk commented Jan 13, 2025

📦 Environment

Docker

📌 Version

1.45.6

💻 Operating System

Ubuntu

🌐 Browser

Chrome

🐛 Bug Description

[screenshot]
No matter what the Ollama service address is set to, the connectivity check error always reports the host as http://127.0.0.1:11434

📷 Recurrence Steps

No response

🚦 Expected Behavior

No response

📝 Additional Information

No response

@llkllk llkllk added the 🐛 Bug Something isn't working | 缺陷 label Jan 13, 2025
@github-project-automation github-project-automation bot moved this to Roadmap - Chat 1.x in Lobe Chat Routine Jan 13, 2025
@lobehubbot
Member

👀 @llkllk

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.


dosubot bot commented Jan 13, 2025

@llkllk 你好!我是Dosu,一个可以帮助你解决 bug、回答问题,并成为贡献者的机器人。我会在等待人类维护者的过程中帮助你。

The issue you're experiencing with the Ollama service always pointing to http://127.0.0.1:11434 is due to the DEFAULT_BASE_URL being set to this address in the code. If no base URL is specified in the configuration, it defaults to this local address. To resolve this, you can adjust the environment variables OLLAMA_HOST and OLLAMA_ORIGINS to allow access from non-local locations.

For Docker, you can set these environment variables using the -e flag in your docker run command. Here's an example:

docker run -d -p 3210:3210 \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 \
  -e OLLAMA_HOST=0.0.0.0:11434 \
  -e OLLAMA_ORIGINS=* \
  lobehub/lobe-chat

This configuration sets OLLAMA_HOST to 0.0.0.0:11434, making the service accessible from any machine, and OLLAMA_ORIGINS to *, allowing for cross-origin requests [1][2].
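
One nuance worth noting: OLLAMA_HOST and OLLAMA_ORIGINS are read by the ollama server process itself, while OLLAMA_PROXY_URL is read by lobe-chat, so in a two-container setup they belong to different services. A minimal docker-compose sketch of that split (service names and addresses here are illustrative):

services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    environment:
      - OLLAMA_HOST=0.0.0.0:11434   # listen on all interfaces
      - OLLAMA_ORIGINS=*            # allow cross-origin requests
  lobe-chat:
    image: lobehub/lobe-chat
    ports:
      - "3210:3210"
    environment:
      - OLLAMA_PROXY_URL=http://ollama:11434   # the compose service name resolves inside the network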


@dosubot dosubot bot added ollama Relative to Ollama Provider and ollama models labels Jan 13, 2025

dosubot bot commented Jan 13, 2025


For example, you can set these environment variables within a Docker container using the following command:

docker run -d -p 3210:3210 -e OLLAMA_HOST=0.0.0.0 -e OLLAMA_ORIGINS=* lobehub/lobe-chat

This will make the service accessible from any machine and avoid CORS errors [1][2].


@elvizlai

I have the same issue.


@ysicing

ysicing commented Jan 14, 2025

It seems to work fine on my end.

@elvizlai

The docker image lobehub/lobe-chat works well, but NOT lobehub/lobe-chat-database.

@1341797533syh-cell

Same for me: testing directly shows "Ollama is running", but checking the connection in lobechat fails with Show Details:

{
  "host": "http://127.0.0.1:11434",
  "message": "please check whether your ollama service is available or set the CORS rules",
  "provider": "ollama"
}
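
A quick way to check whether ollama itself is reachable and has CORS open before blaming lobechat — a sketch with a placeholder address; /api/tags is ollama's model-list endpoint:

curl -i -H "Origin: http://localhost:3210" http://your-ollama-host:11434/api/tags
# with OLLAMA_ORIGINS set correctly, the response carries an
# Access-Control-Allow-Origin header along with the JSON model list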

@ysicing

ysicing commented Jan 14, 2025

@1341797533syh-cell If you run ollama as a systemd service, you need to configure:

# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"


@1341797533syh-cell

@1341797533syh-cell If you run ollama as a systemd service, you need to configure:

# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"

I'm running ollama in docker with:

docker run -d --gpus=all -v ollama:/root/.ollama \
  -e OLLAMA_ORIGINS="*" -e OLLAMA_HOST=0.0.0.0 \
  -p 11434:11434 --name ollama ollama/ollama

Shouldn't that work?


@aibeishu

Configuring this didn't help either. After upgrading to v1.45.7 I can no longer connect: the model check detects the models, but the connection check still errors.


@elvizlai

Using the latest image still doesn't work:

lobehub/lobe-chat-database   latest                         b0528444e3c1   7 hours ago   395MB

@DoctorDeng

I have the same problem too

@Iceber

Iceber commented Jan 15, 2025

I have the same problem too +1

@Interstellar2

I have the same problem too +1

@bmyhelpcode

Using the latest image still doesn't work:

lobehub/lobe-chat-database   latest                         b0528444e3c1   7 hours ago   395MB

I'm also using the latest lobehub/lobe-chat-database, and no address setting takes effect. Exhausting...


@QianJue-CN

Same issue. Whether I set OLLAMA_PROXY_URL in the .env file or in docker-compose, it always points to 127.0.0.1.
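
For reference, the two forms being described look like this — OLLAMA_PROXY_URL is the variable documented by lobe-chat, and the address is a placeholder:

# .env
OLLAMA_PROXY_URL=http://your-ollama-host:11434

# docker-compose.yml, under the lobe-chat service
environment:
  - OLLAMA_PROXY_URL=http://your-ollama-host:11434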

@Lmn001

Lmn001 commented Jan 22, 2025

I have the same problem too +1
lobehub/lobe-chat-database

@chen569756

I have the same problem too +1

@kangli

kangli commented Jan 23, 2025

ollama works with the docker version of lobe-chat, but not with lobe-chat-database.


@elvizlai

elvizlai commented Jan 23, 2025

The changes here are a mess right now... The latest lobe-chat doesn't work either... There's only a lobehubbot replying mindlessly, and it can't solve anything. For issues that are clearly bugs rather than questions, there's no need for this useless bot.


@rukh-debug

Having the same issue.

@yaleh
Contributor

yaleh commented Jan 24, 2025

Same here. lobehub/lobe-chat-database

@Interstellar2

I found a way to fix it! If you deploy lobe-chat with docker, this may help: change the ollama address in src/services/ollama.ts

[screenshot of the edited source]

then build your own image and deploy it; it should then send requests to your own address. (See the sketch below.)
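
A rough sketch of this workaround, assuming the hardcoded default literal sits in src/services/ollama.ts as described above (the replacement address is a placeholder):

# swap the hardcoded default for your own Ollama address
sed -i 's#http://127.0.0.1:11434#http://your-ollama-host:11434#' src/services/ollama.ts
# rebuild from the repo root and redeploy the patched image
docker build -t my-lobe-chat .
docker run -d -p 3210:3210 my-lobe-chat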

@J4gQBqqR

I have an embedding issue with Ollama from #5583. Not sure if this is the root cause. Can someone diagnose and help? Thanks.

@yaleh
Contributor

yaleh commented Jan 26, 2025

I found a way to fix it! If you deploy lobe-chat with docker, this may help: change the ollama address in src/services/ollama.ts

[screenshot of the edited source]

then build your own image and deploy it; it should then send requests to your own address.

I saw that too. But it's really dirty. 😞

@Mqlhaha

Mqlhaha commented Jan 27, 2025

It seems this issue only affects part of the code. When I test the connection, the same error shows:

[screenshot of the connection error]

However, when I sync the model list, it succeeds:

[screenshot of the synced model list]

And chatting with ollama also works fine for me:

[screenshot of a working chat]

@dentistfrankchen

I have the same problem. I'm using ollama with qwen2.5 deployed on Amazon; it works fine on its own, but lobechat doesn't recognize it.


@dentistfrankchen

Has anyone tried downgrading to an older version? Does that work?


@lobehubbot
Member

@llkllk

This issue is closed. If you have any questions, you can comment and reply.

@lobehubbot
Member

🎉 This issue has been resolved in version 1.49.11 🎉

The release is available on:

Your semantic-release bot 📦🚀
