
Bug: https://nextjs-ollama-xi.vercel.app/ is not responding. #47

Closed · kevin-zkc opened this issue May 14, 2024 · 6 comments

@kevin-zkc
Description

When I try to ask a question on the web hosted version, an error occurs and I don't get a response.

Screenshot: [image attachment]
I have Ollama installed and running on my local machine and I have pulled the Gemma:2b model.

@jakobhoeg (Owner)

Hey, @kevin-zkc.

Apologies for the late response. Have you set your OLLAMA_ORIGINS as instructed here?
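For reference, a sketch of what that setup might look like (the origin value below is assumed from the hosted URL in the issue title):

```shell
# Hypothetical: allow the Vercel-hosted frontend's origin to reach the local
# Ollama API. Quote the value, then restart Ollama so it takes effect.
export OLLAMA_ORIGINS='https://nextjs-ollama-xi.vercel.app'
echo "OLLAMA_ORIGINS=$OLLAMA_ORIGINS"
```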

@Shapur1234

Shapur1234 commented Jun 3, 2024

I have a similar issue. The website seems to error out in the same way for me.

I installed nextjs-ollama-llm-ui and ollama through nix.
ollama is running (on the default address):

$ curl http://localhost:11434
Ollama is running

I then ran

$ nextjs-ollama-llm-ui
  ▲ Next.js 14.2.3
  - Local:        http://jirka-nixos:3000
  - Network:      http://127.0.0.2:3000

 ✓ Starting...
 ✓ Ready in 33ms
 ⨯ Failed to write image to cache B0Z3FnWKAAGjZHJa171iie+KPfLVSPJajqE+M39emDk= [Error: ENOENT: no such file or directory, mkdir '/nix/store/cjaly7lliqp1i0bb1mjd6m0rsah96w9b-nextjs-ollama-llm-ui-1.0.1/share/homepage/.next/cache'] {
  errno: -2,
  code: 'ENOENT',
  syscall: 'mkdir',
  path: '/nix/store/cjaly7lliqp1i0bb1mjd6m0rsah96w9b-nextjs-ollama-llm-ui-1.0.1/share/homepage/.next/cache'
}

Upon loading the page, the terminal prints the cache error above, but I don't know if that is relevant.

I have tried to load the page (http://127.0.0.2:3000/) in both librewolf and chromium. In both, the console errors out with CORS policy violations. Perhaps Cross-Origin Isolation is set incorrectly on the HTTP server, causing the errors?

# Librewolf
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at http://127.0.0.1:11434/api/tags. (Reason: CORS header ‘Access-Control-Allow-Origin’ missing). Status code: 403.

Uncaught (in promise) TypeError: NetworkError when attempting to fetch resource. 
# Chromium
Access to fetch at 'http://127.0.0.1:11434/api/tags' from origin 'http://127.0.0.2:3000' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
127.0.0.1:11434/api/tags:1  Failed to load resource: net::ERR_FAILED

2834-5199a8214b3bedf0.js:1  Uncaught (in promise) TypeError: Failed to fetch
    at 2834-5199a8214b3bedf0.js:1:18891
    at 2834-5199a8214b3bedf0.js:1:18981
    at aW (fd9d1056-89ea21c682989a8a.js:1:73244)
    at oe (fd9d1056-89ea21c682989a8a.js:1:84685)
    at ol (fd9d1056-89ea21c682989a8a.js:1:85323)
    at or (fd9d1056-89ea21c682989a8a.js:1:85207)
    ...

127.0.0.2/:1 Access to fetch at 'http://127.0.0.1:11434/api/chat' from origin 'http://127.0.0.2:3000' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.

127.0.0.1:11434/api/chat:1  Failed to load resource: net::ERR_FAILED

P.S.: I have also tried the oterm ollama client, and it works, so Ollama should be set up correctly on my machine.
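The blocked request above can be reproduced outside the browser (a sketch; the origin value is taken from the console logs, and a running Ollama instance is assumed):

```shell
# Send the same cross-origin request the browser makes and print only the
# HTTP status code. '|| true' keeps the command from aborting a script if
# Ollama is not reachable (curl then prints 000).
curl -s -o /dev/null -w '%{http_code}\n' \
  -H 'Origin: http://127.0.0.2:3000' \
  http://127.0.0.1:11434/api/tags || true
# Per the logs above, you'd expect 403 while the origin is blocked,
# and 200 once OLLAMA_ORIGINS covers it.
```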

@jakobhoeg (Owner) commented Jun 3, 2024

Hey, @Shapur1234.

The CORS error is probably due to your frontend being on http://127.0.0.2:3000 and not http://127.0.0.1:3000.
As stated in the Ollama documentation: Ollama allows cross-origin requests from 127.0.0.1 and 0.0.0.0 by default. Additional origins can be configured with OLLAMA_ORIGINS.

Can you try setting OLLAMA_ORIGINS=http://127.0.0.2:3000 or OLLAMA_ORIGINS=*?
Link to docs
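A minimal sketch of the wildcard variant (fine for local testing, though overly broad for anything exposed to a network):

```shell
# Hypothetical: permit any browser origin to call the local Ollama API.
# Quote the asterisk so the shell does not glob-expand it, then restart
# Ollama (e.g. `ollama serve`) so the variable takes effect.
export OLLAMA_ORIGINS='*'
echo "OLLAMA_ORIGINS=$OLLAMA_ORIGINS"
```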

@Shapur1234

Setting OLLAMA_ORIGINS=* fixed it.
Thanks!

@Shapur1234

Maybe it would be worthwhile to add an explanation of setting OLLAMA_ORIGINS=* for "local" use to the readme.

Since OLLAMA_ORIGINS is only mentioned in the readme under Vercel or Netlify, I thought it was specific to those hosts and didn't apply to my situation.

@jakobhoeg (Owner) commented Jun 3, 2024

Yes, you're correct.
The only reason I didn't include it there is that the default local environment is usually http://127.0.0.1.

But I will update the readme, thanks for the suggestion 😄

3 participants