
[Bug] Ollama连通性检查失败 #2520

Open
anrgct opened this issue May 15, 2024 · 6 comments
Labels
🐛 Bug Something isn't working | 缺陷

Comments


anrgct commented May 15, 2024

💻 System environment

macOS

📦 Deployment environment

Docker

🌐 Browser

Chrome

🐛 Problem description

(screenshot attached)

The Ollama connectivity check fails. I configured lobe-chat in Docker to reach the host's Ollama service and turned off "Use client request mode", but the "connectivity check" is still issued directly from the browser. Chat works normally in the conversation interface, so the failure prompt is misleading.

🚦 Expected results

No response

📷 Steps to reproduce

No response

📝 Supplementary information

No response
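For context, the reported setup can be sketched as follows. This is an assumption about the configuration, not taken from the report: the container name, port, and the `OLLAMA_PROXY_URL` variable follow lobe-chat's published Docker instructions, and `host.docker.internal` is the usual way for a container to reach a service on the host.

```shell
# Sketch of the reported setup (illustrative; adjust names/ports to your install):
# lobe-chat in Docker, pointed at an Ollama server running on the host machine.
docker run -d --name lobe-chat -p 3210:3210 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 \
  lobehub/lobe-chat
```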

@anrgct anrgct added the 🐛 Bug Something isn't working | 缺陷 label May 15, 2024
@lobehubbot
Member

👀 @anrgct

Thank you for raising an issue. We will investigate into the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.


@rennai

rennai commented May 19, 2024

Starting ollama requires setting an environment variable:

OLLAMA_ORIGINS=* ollama serve

or

OLLAMA_ORIGINS=* ollama run llama3
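Setting the variable inline only affects that one invocation. A sketch of making it persistent across restarts, assuming the default service/unit names used by stock Ollama installs (they may differ on your machine):

```shell
# macOS (Ollama launched by the desktop app reads launchctl variables):
launchctl setenv OLLAMA_ORIGINS "*"
launchctl setenv OLLAMA_HOST "0.0.0.0"

# Linux (Ollama installed as a systemd service):
sudo systemctl edit ollama.service
# then add under [Service]:
#   Environment="OLLAMA_ORIGINS=*"
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl restart ollama
```

Note that `OLLAMA_ORIGINS=*` disables CORS origin checks entirely, which is convenient for local testing but worth tightening to the actual origin in longer-lived setups.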


@anrgct
Author

anrgct commented May 19, 2024

Accessing http from an https page produces a mixed-content error, and "Use client request mode" does not control the "connectivity check". On Windows I created two user environment variables (Control Panel → System Properties → Environment Variables): OLLAMA_HOST=0.0.0.0 and OLLAMA_ORIGINS=*. After setting these, Immersive Translate connects to ollama successfully, but the "connectivity check" in lobe-chat always fails, because I serve lobe-chat from a local https domain. In this state, with "Use client request mode" turned off, chat works normally.
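The failure described above reduces to the browser's mixed-content rule: a page served over https may not issue plain-http requests (ignoring the localhost exemption some browsers apply), while a request made server-side is unaffected. A minimal sketch of that decision, with an illustrative function name:

```shell
# Sketch of the mixed-content rule: an https page may not fetch plain-http
# resources, so an in-browser connectivity check against http://host:11434
# is blocked, while the same request made from the server succeeds.
mixed_content() {
  page_scheme=$1; target_scheme=$2
  if [ "$page_scheme" = "https" ] && [ "$target_scheme" = "http" ]; then
    echo blocked
  else
    echo allowed
  fi
}

mixed_content https http    # browser-side check from an https page: blocked
mixed_content https https   # same scheme, or a server-side request: allowed
```

This is why chat works with "Use client request mode" off (requests are proxied through the lobe-chat server) while the connectivity check, still fired from the browser, fails.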


Projects
Status: Roadmap - Chat 1.x