
[FEATURE] Can a manually downloaded GGUF model be installed? #343

Closed
wing0210 opened this issue Apr 30, 2024 · 3 comments

@wing0210

MaxKB version

v1.1.0

Please describe your requirement or improvement suggestion

Is it possible to install a GGUF model that I downloaded myself?
For example, I downloaded "Llama3-8B-Chinese-Chat-GGUF-8bit", which has very good Chinese support,
but unfortunately I cannot install it in MaxKB.
The stock Llama3 is usable, but it often replies in English,
even when I explicitly ask it to use Chinese.

Please describe your proposed implementation

No response

Additional information

No response

@tbdavid2019

@wing0210
You can run `ollama run wangshenzhi/llama3-8b-chinese-chat-ollama-q8`
and then call it through MaxKB.
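
A minimal sketch of that workflow, assuming Ollama is already installed locally (the prompt text is just an illustrative example):

```bash
# Pull the quantized Chinese-tuned Llama3 model from the Ollama registry
ollama pull wangshenzhi/llama3-8b-chinese-chat-ollama-q8

# Quick interactive check that the model replies in Chinese before wiring it into MaxKB
ollama run wangshenzhi/llama3-8b-chinese-chat-ollama-q8 "请用中文简单介绍一下你自己"
```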

@learnin9

learnin9 commented May 5, 2024

`ollama pull wangshenzhi/llama3-8b-chinese-chat-ollama-q8` is all you need.

@shaohuzhang1
Collaborator

In MaxKB, add a model under the Ollama provider and set the base model to wangshenzhi/llama3-8b-chinese-chat-ollama-q8; the system will download it automatically.
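
As a rough way to confirm the model is actually reachable by MaxKB, you can query Ollama's HTTP API directly; the sketch below assumes a default local install listening on port 11434:

```bash
# The pulled model should show up in the local model list
ollama list

# Call the chat endpoint the same way a client such as MaxKB would
curl http://localhost:11434/api/chat -d '{
  "model": "wangshenzhi/llama3-8b-chinese-chat-ollama-q8",
  "messages": [{"role": "user", "content": "你好，请用中文回答"}],
  "stream": false
}'
```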
