Docker image with Ollama installed very slow #2309
PietFourie
started this conversation in
General
Replies: 2 comments
-
Here is the code generated:
-
GPU memory usage in the combined version was lower than with the separate Ollama Docker image, and it also used CPU cores.
-
Hi
My system has an NVIDIA 1070 8 GB GPU and runs Linux Mint.
I used the latest Docker images as installed on the 15th of May.
The prompt was:
Write me a python code for the game centipede
Model: Llama 3:latest
The error is that the combined Ollama/Open WebUI Docker image generates garbage output and is a lot slower.
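For comparison, running Ollama and Open WebUI as two separate containers can be sketched as below. The image names, ports, volumes, and the `--gpus=all` flag follow the two projects' published defaults, so treat this as a starting point rather than a verified fix for the slowdown:

```shell
# Sketch of the separate-container setup (assumed defaults from the
# Ollama and Open WebUI READMEs; adjust volumes/ports for your system).

# Ollama with GPU access (requires the NVIDIA Container Toolkit on the host)
docker run -d --gpus=all \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama ollama/ollama

# Open WebUI, pointed at the Ollama container via the host gateway
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui ghcr.io/open-webui/open-webui:main
```

With this layout the GPU is claimed only by the Ollama container, which may explain the different GPU-memory and CPU usage seen between the two setups.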
Error with Modelfiles
Also, I could not import the modelfiles from the website; it simply does nothing. In addition, when you import a model the picker looks for ".json" files, but the downloaded modelfiles have a ".txt" extension. I would presume this mismatch is significant. All the modelfiles are on the recommended home page of the website and should be compatible.
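A possible workaround for the extension mismatch, assuming the import dialog only filters on the file extension, is simply renaming the downloaded files. Note that if the importer also parses the file contents as JSON, a plain rename will not be enough and the modelfile would need converting:

```shell
# Workaround sketch: rename downloaded .txt modelfiles to .json so the
# import picker can see them. The directory and file names below are
# stand-ins for demonstration only.
cd "$(mktemp -d)"                                  # scratch directory for the demo
printf 'FROM llama3\n' > centipede-modelfile.txt   # stand-in downloaded modelfile

for f in *.txt; do
  mv -- "$f" "${f%.txt}.json"
done

ls   # the file now carries a .json extension
```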