@wing0210 You can run `ollama run wangshenzhi/llama3-8b-chinese-chat-ollama-q8` and then call it through MaxKB.
Just run `ollama pull wangshenzhi/llama3-8b-chinese-chat-ollama-q8`.
In MaxKB, add a model under the Ollama provider and set the base model to wangshenzhi/llama3-8b-chinese-chat-ollama-q8; the system will then download it automatically.
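The steps above can be sketched as a short shell session. This assumes a local Ollama install; the model tag is the one given in the comments, and the MaxKB UI labels in the final comment are an assumption, not verified against a specific MaxKB release:

```shell
# Pull the community Chinese-tuned Llama3 build into the local Ollama store.
ollama pull wangshenzhi/llama3-8b-chinese-chat-ollama-q8

# Optional smoke test from the CLI before wiring it into MaxKB:
ollama run wangshenzhi/llama3-8b-chinese-chat-ollama-q8 "请用中文简单自我介绍。"

# Then in MaxKB (UI labels are an assumption): add a model, choose the
# "Ollama" provider, and enter "wangshenzhi/llama3-8b-chinese-chat-ollama-q8"
# as the base model; MaxKB will pull it automatically if it is not present.
```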
MaxKB version
v1.1.0
Please describe your requirement or improvement suggestion
Is it possible to install a GGUF model that I downloaded myself?
For example, I downloaded "Llama3-8B-Chinese-Chat-GGUF-8bit", which has very good Chinese support,
but I cannot install it into MaxKB, which is a pity.
After all, the stock Llama3 is usable, but it often replies in English,
even when I emphasize that it should answer in Chinese.
Please describe your proposed implementation
No response
Additional information
No response