WebUI homepage hangs for 20+ minutes after docker-compose down and back up #2111
sebsquire-rowden started this conversation in General
Bug Report
Description
WebUI hangs for 20+ minutes after reloading Docker Compose on a remote SSH server
Bug Summary:
Running Open WebUI and Ollama on an SSH server works fine: I can connect through http://ssh_server_ip:3010 and the auth page loads. However, when I run docker-compose down (to make changes) and then bring it back up, the request hangs and can take over 20 minutes to complete. If I restart my laptop (not the SSH server), it works fine again. How can I get it to come up reliably every time?
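Not part of the original report, but since restarting the laptop (not the server) clears the hang, it helps to establish whether the port answers at all from both machines, with a short timeout instead of waiting 20+ minutes. A minimal sketch; the helper `port_responds` is hypothetical and `ssh_server_ip` is a placeholder for your server's address:

```python
import socket

def port_responds(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within `timeout` seconds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeouts, and unreachable hosts.
        return False

# Run on the laptop (same network path the browser takes):
#   port_responds("ssh_server_ip", 3010)
# Run on the server itself (bypasses the laptop's network stack entirely):
#   port_responds("localhost", 3010)
```

If the server-side check succeeds immediately while the laptop-side one times out, the containers are fine and the problem is on the client or network path, which would be consistent with a laptop restart fixing it.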
Steps to Reproduce:
1. docker-compose up using this file: https://gist.github.com/sebsquire-rowden/3b2535319df658a951985251daec8a06
2. Log in as normal to test.
3. docker-compose down
4. docker-compose up
5. A request to http://ssh_server_ip:3010 hangs.
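A hedged side note, not from the report: adding a healthcheck to the webui service in the gist's compose file would make it visible in `docker compose ps` whether the container itself is slow to become ready after a restart, or whether it is healthy and the hang lies elsewhere. A sketch, assuming the service name `ollama-webui` and the internal port 8080 shown in the logs, and assuming `curl` exists inside the image:

```yaml
services:
  ollama-webui:
    healthcheck:
      # Marks the container unhealthy if the WebUI stops answering within 5s
      test: ["CMD-SHELL", "curl -fsS --max-time 5 http://localhost:8080/ || exit 1"]
      interval: 30s
      timeout: 10s
      retries: 3
```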
Expected Behavior:
Request does not hang
Actual Behavior:
Request hangs
Environment
Ubuntu 20.04 laptop, Ubuntu 22.04 server running:
Docker version 26.1.1
Docker Compose version v2.24.7
Open WebUI Version: 0.1.124
Browser (if applicable): Tested on Chrome and Firefox
Logs and Screenshots
Docker Container Logs:
ollama-webui | No WEBUI_SECRET_KEY provided
ollama-webui | Generating WEBUI_SECRET_KEY
ollama-webui | Loading WEBUI_SECRET_KEY from .webui_secret_key
ollama-server-1 | 2024/05/08 15:51:37 routes.go:989: INFO server config env="map[OLLAMA_DEBUG:false OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:1 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:*] OLLAMA_RUNNERS_DIR: OLLAMA_TMPDIR:]"
ollama-server-1 | time=2024-05-08T15:51:37.569Z level=INFO source=images.go:897 msg="total blobs: 8"
ollama-server-1 | time=2024-05-08T15:51:37.569Z level=INFO source=images.go:904 msg="total unused blobs removed: 0"
ollama-server-1 | time=2024-05-08T15:51:37.569Z level=INFO source=routes.go:1034 msg="Listening on [::]:11434 (version 0.1.34)"
ollama-server-1 | time=2024-05-08T15:51:37.569Z level=INFO source=payload.go:30 msg="extracting embedded files" dir=/tmp/ollama901894506/runners
ollama-server-1 | time=2024-05-08T15:51:40.664Z level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu cpu_avx cpu_avx2 cuda_v11 rocm_v60002]"
ollama-server-1 | time=2024-05-08T15:51:40.664Z level=INFO source=gpu.go:122 msg="Detecting GPUs"
ollama-server-1 | time=2024-05-08T15:51:40.675Z level=INFO source=gpu.go:127 msg="detected GPUs" count=1 library=/usr/lib/x86_64-linux-gnu/libcuda.so.535.54.03
ollama-server-1 | time=2024-05-08T15:51:40.675Z level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
ollama-webui | INFO: Started server process [1]
ollama-webui | INFO: Waiting for application startup.
ollama-webui | INFO: Application startup complete.
ollama-webui | INFO: Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)
ollama-webui | INFO: 100.65.122.110:46568 - "GET /_app/immutable/nodes/14.b1fb0248.js HTTP/1.1" 200 OK
ollama-webui | INFO: 100.65.122.110:44502 - "GET /api/v1/ HTTP/1.1" 200 OK
ollama-webui | INFO: 100.65.122.110:50186 - "GET / HTTP/1.1" 200 OK