How to add the llama3 model on the dify online platform #4040
Unanswered
tom-leader
asked this question in Feedbacks
Replies: 1 comment
Hi @tom-leader, I suggest you read this document in detail; it explains how to integrate Ollama. The one difference is that if Ollama and Dify are not running on the same server, the API address must be the IP address of the server where Ollama is running. Note that in that case you also need to expose the Ollama service on your network; see this section: https://docs.dify.ai/tutorials/model-configuration/ollama#how-can-i-expose-ollama-on-my-network
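A minimal sketch of what this means for the Base URL field in Dify's Ollama provider settings, assuming Ollama's default port 11434; the IP address below is a placeholder for your actual Ollama server, and the helper function is hypothetical:

```python
# Sketch, assuming Ollama listens on its default port 11434.
# On the Ollama server itself you would also set OLLAMA_HOST=0.0.0.0
# (per the linked docs) so the service is reachable from other machines.

def ollama_base_url(host: str, port: int = 11434) -> str:
    """Build the Base URL to enter in Dify's Ollama provider settings.

    When Dify and Ollama run on different servers, use the Ollama
    server's network IP here, not localhost.
    """
    return f"http://{host}:{port}"

# 203.0.113.10 is a documentation placeholder IP, not a real address.
print(ollama_base_url("203.0.113.10"))  # prints http://203.0.113.10:11434
```

The point is simply that `localhost` would resolve to the Dify container or host itself, so a cross-server setup needs the real IP of the Ollama machine.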