
Cannot connect to ollama server on remote #1145

Open · pmatos opened this issue Apr 17, 2024 · 18 comments
Labels: bug Something isn't working

@pmatos

pmatos commented Apr 17, 2024

Before submitting your bug report

Relevant environment info

- OS: Arch Linux
- Continue: 0.8.24
- IDE: VS Code 1.88.1
- Model: Ollama 0.1.31

Description

I have a laptop ("butterfly") where ollama serve is running and the models are pulled, but my project is on a remote machine. I configured tab autocompletion with Ollama as suggested (this config lives on butterfly):

  "tabAutocompleteModel": {
    "title": "Tab Autocomplete Model",
    "provider": "ollama",
    "model": "starcoder2:3b",
    "apiBase": "http://butterfly:11434",
    "num_thread": 1
  },

and yet I get failures connecting to http://butterfly:11434, which I assume happens from the remote since that's where the code lives. But from the remote this works:

curl http://butterfly:11434/api/generate -d '{ "model": "starcoder2:3b", "prompt": "if x ==" }'             
{"model":"starcoder2:3b","created_at":"2024-04-17T12:26:12.424072856Z","response":" ","done":false}
{"model":"starcoder2:3b","created_at":"2024-04-17T12:26:12.439451787Z","response":"3","done":false}
...

Unfortunately I haven't found a way to see debug logs for Continue.

To reproduce

No response

Log output

console.ts:137 [Extension Host] Error generating autocompletion:  FetchError: request to http://butterfly:11434/api/generate failed, reason: connect EHOSTUNREACH 192.168.178.71:11434
	at ClientRequest.<anonymous> (/home/pmatos/.vscode-server/extensions/continue.continue-0.8.24-linux-arm64/out/extension.js:25975:14)
	at ClientRequest.emit (node:events:529:35)
	at Socket.socketErrorListener (node:_http_client:501:9)
	at Socket.emit (node:events:517:28)
	at emitErrorNT (node:internal/streams/destroy:151:8)
	at emitErrorCloseNT (node:internal/streams/destroy:116:3)
	at process.processTicksAndRejections (node:internal/process/task_queues:82:21)
@pmatos pmatos added the bug Something isn't working label Apr 17, 2024
@ostapiuk

I'm having a very similar problem, but with IntelliJ: #1136.

@readmodifywrite

+1

@gokulkgm

+1

@sestinj
Contributor

sestinj commented Apr 26, 2024

which I assume is happening from the remote since that's where the code lives. But from the remote this works:

@pmatos I want to make sure I understand what you're referring to as the remote. Does this mean that you are using Remote SSH in VS Code? If so, Continue runs by default as a UI extension, so it runs on your laptop rather than in the remote with the code. So in this case you do not need to connect to butterfly, but just to localhost.
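
For reference, a minimal sketch of what the tab-autocomplete entry would look like in that case, assuming Ollama is serving on the laptop's default port 11434:

  "tabAutocompleteModel": {
    "title": "Tab Autocomplete Model",
    "provider": "ollama",
    "model": "starcoder2:3b",
    "apiBase": "http://localhost:11434"
  },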

@sestinj
Contributor

sestinj commented Apr 26, 2024

@readmodifywrite @gokulkgm Can you share more about what exactly your setups look like? Given that I wasn't perfectly clear on the original issue's setup, I want to make sure there aren't different details about your situations. Looking to debug this as soon as I have more information!

@sestinj sestinj self-assigned this Apr 26, 2024
@gokulkgm

@sestinj I had an issue connecting to the Ollama host (I forget what the exact error was).

Then in another issue it was mentioned to use the pre-release version of the extension. With that I was able to connect to the Ollama API.

@pmatos
Author

pmatos commented Apr 26, 2024

which I assume is happening from the remote since that's where the code lives. But from the remote this works:

@pmatos I want to make sure I understand what you're referring to as the remote. Does this mean that you are using Remote SSH in VS Code? If so, Continue runs by default as a UI extension, so it runs on your laptop rather than in the remote with the code. So in this case you do not need to connect to butterfly, but just to localhost.

I see - that was not my understanding. My understanding was that when you're in a remote project, extensions are installed on the remote.

@xndpxs

xndpxs commented May 1, 2024

+1

@ahoplock

ahoplock commented May 9, 2024

0.8.27 fixed this for me... previously anything newer than 0.8.23 would not work, as I outlined here -> #1215 (comment)

@sestinj
Contributor

sestinj commented May 9, 2024

@xndpxs @pmatos is this solved now for y'all in 0.8.27 as for ahoplock?

Also just for reference of what I was mentioning about running on local vs remote: https://code.visualstudio.com/api/advanced-topics/remote-extensions

This honestly is a point of discussion right now, so if you have strong feelings about where the extension ought to run, I'm open to hearing! The primary concern with moving to the remote is that if you wanted to run an LLM locally, you'd have to go through extra trouble to expose your machine's localhost to the remote server.
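
In the meantime, the VS Code doc linked above describes a user-setting override for where an extension runs. A sketch of forcing Continue to run on the UI (local) side, assuming the extension ID is Continue.continue, would be to add this to the local settings.json:

  // settings.json (user settings): run the Continue extension locally (UI side)
  "remote.extensionKind": {
    "Continue.continue": ["ui"]
  },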

@xndpxs

xndpxs commented May 10, 2024

@sestinj Yes, in my case it was an Ollama environment variable problem.
I checked with:

OLLAMA_HOST=ip:port ollama list

and found just one model, so I proceeded to pull the other one like this:

OLLAMA_HOST=ip:port ollama pull starcoder2

and it worked.

@pmatos
Author

pmatos commented May 11, 2024

@xndpxs @pmatos is this solved now for y'all in 0.8.27 as for ahoplock?

Also just for reference of what I was mentioning about running on local vs remote: https://code.visualstudio.com/api/advanced-topics/remote-extensions

This honestly is a point of discussion right now, so if you have strong feelings about where the extension ought to run, I'm open to hearing! The primary concern with moving to the remote is that if you wanted to run an LLM locally, you'd have to go through extra trouble to expose your machine's localhost to the remote server.

It is still not working here.
It feels like it's trying to run the extension remotely, because I definitely have Ollama running locally, and with:

  "tabAutocompleteModel": {
    "title": "Tab Autocomplete Model - Starcoder2:3b",
    "provider": "ollama",
    "model": "starcoder2:3b",
    "apiBase": "https://localhost:11434"
  },
  "tabAutocompleteOptions": {
    "useSuffix": true,
    "useCache": true,
    "multilineCompletions": "never",
    "useOtherFiles": true
  },

I keep getting:

FetchError: request to https://127.0.0.1:11434/api/generate failed, reason: connect ECONNREFUSED

If I try a curl request on the command line, it works:

~ curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?"
}'
{"model":"llama3","created_at":"2024-05-11T06:58:04.886581958Z","response":"What","done":false}
{"model":"llama3","created_at":"2024-05-11T06:58:04.917646086Z","response":" a","done":false}
{"model":"llama3","created_at":"2024-05-11T06:58:04.948962271Z","response":" great","done":false}
{"model":"llama3","created_at":"2024-05-11T06:58:04.98032007Z","response":" question","done":false}
...

@pmatos
Author

pmatos commented May 13, 2024

Here's a screenshot of what I see as a problem:

[Screenshot from 2024-05-13 16-58-59]

If I don't even set up the apiBase, i.e. localhost:11434 for Ollama, it offers to download Ollama. But I don't need that, because it's already running, as you can see in the terminal. It feels like there's some confusion between what's running locally and remotely.

@pmatos
Author

pmatos commented May 13, 2024

which I assume is happening from the remote since that's where the code lives. But from the remote this works:

@pmatos I want to make sure I understand what you're referring to as the remote. Does this mean that you are using Remote SSH in VS Code? If so, Continue runs by default as a UI extension, so it runs on your laptop rather than in the remote with the code. So in this case you do not need to connect to butterfly, but just to localhost.

I don't understand how this is true. I just did some experimentation. If I remove Continue from the remote, it doesn't even show the Continue button in the sidebar when I have my remote project open.
Once I do install it on the remote, the config it reads is the config.json file on the remote. And, as shown above, everything points to the extension expecting the Ollama server to be on the remote, not locally.

@sestinj
Contributor

sestinj commented May 13, 2024

@pmatos It does sound like you've installed in the remote. In that case you'll just need to find a way to forward the Ollama endpoint.

Also (and this is less relevant assuming you have installed in the remote) I notice an https:// instead of http:// in your apiBase; does changing this have any effect? localhost should generally use http.
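
One way to do that forwarding, sketched under the assumption that the remote board is reachable over SSH and that Ollama listens on the laptop's default port 11434 (user@remote is a placeholder), is an SSH remote forward started from the laptop:

# run on the laptop ("butterfly"): expose local port 11434 on the remote's localhost:11434
ssh -R 11434:localhost:11434 user@remote

With that tunnel up, the config.json on the remote can keep apiBase pointed at http://localhost:11434, and connections made there are carried back to the Ollama instance on the laptop.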

@pmatos
Author

pmatos commented May 13, 2024

@pmatos It does sound like you've installed in the remote. In that case you'll just need to find a way to forward the Ollama endpoint.

Also (and this is less relevant assuming you have installed in the remote) I notice an https:// instead of http:// in your apiBase; does changing this have any effect? localhost should generally use http.

Now you are saying I have installed it in the remote. To be honest, I am confused. At some point it was said that it always runs locally, even in remote projects, but as mentioned I don't think that's true.

The remote is a tiny board where I don't want to run LLMs. So if Continue runs on the remote, I will have to find a way for it to reach the Ollama running locally on my PC. I will do more testing tomorrow.

@sestinj
Contributor

sestinj commented May 14, 2024

Yes, sorry for the confusion. What I said before was misleading: it runs locally "by default", but it is also possible to install it in the remote. And perhaps something even more odd is happening and it got installed in the remote by default for you. That last part I am unsure of.

@pmatos
Author

pmatos commented May 15, 2024

Yes, sorry for the confusion. What I said before was misleading: it runs locally "by default", but it is also possible to install it in the remote. And perhaps something even more odd is happening and it got installed in the remote by default for you. That last part I am unsure of.

Ah yes, that makes sense. Maybe I misunderstood. Apologies from my side as well. :)
