Cannot connect to ollama server on remote #1145
Comments
I'm having a very similar problem, but with IntelliJ: #1136.
+1
+1
@pmatos I want to make sure I understand what you're referring to as the remote. Does this mean that you are using Remote SSH in VS Code? If so, Continue runs by default as a UI extension, so it runs on your laptop rather than on the remote with the code. In that case you don't need to connect to butterfly, just to localhost.
@readmodifywrite @gokulkgm Can you share more about what exactly your setups look like? Given that I wasn't perfectly clear on the original issue's setup, I want to make sure there aren't different details about your situations. Looking to debug this as soon as I have more information!
@sestinj I had an issue connecting to the Ollama host (I forget the exact error). Then on another issue it was mentioned to use the pre-release version of the extension. With that I was able to connect to the Ollama API.
I see - that was not my understanding. My understanding was that when you're in a remote project, extensions are installed on the remote.
+1
0.8.27 fixed this for me. Previously, anything newer than 0.8.23 would not work, as I outlined here -> #1215 (comment)
@xndpxs @pmatos Is this solved now for y'all in 0.8.27, as it was for ahoplock? Also, just for reference on what I was mentioning about running locally vs. on the remote: https://code.visualstudio.com/api/advanced-topics/remote-extensions
This is honestly a point of discussion right now, so if you have strong feelings about where the extension ought to run, I'm open to hearing them! The primary concern with moving to the remote is that if you wanted to run an LLM locally, you'd have to go through extra trouble to expose your machine's localhost to the remote server.
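For anyone who wants to pin down where the extension runs: VS Code lets you override an extension's location via the remote.extensionKind setting in your local settings.json (documented at the link above). A sketch, assuming Continue's extension ID is Continue.continue:

```json
{
  "remote.extensionKind": {
    "Continue.continue": ["ui"]
  }
}
```

With ["ui"] the extension is forced to run locally on your laptop; ["workspace"] would force it onto the remote.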
@sestinj Yes, in my case it was an Ollama variable problem.
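For readers hitting the same thing: one common "Ollama variable problem" is that ollama serve binds only to 127.0.0.1 by default, so other machines can't reach it. A minimal sketch using Ollama's OLLAMA_HOST environment variable, assuming the default port:

```shell
# Make Ollama listen on all interfaces instead of loopback only,
# so a remote machine can reach it (default port 11434 assumed).
export OLLAMA_HOST=0.0.0.0:11434
# Then restart the server (run by hand, not here):
#   ollama serve
echo "$OLLAMA_HOST"   # -> 0.0.0.0:11434
```

Note that exposing Ollama on 0.0.0.0 makes it reachable from your whole network, so only do this behind a firewall or a tunnel.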
It is still not working here.
I keep getting:
If I try a curl request on the command line, it works:
Here's a screenshot of what I see as a problem: if I don't even set up the apiBase, i.e. localhost:11434 for Ollama, it offers to download Ollama. But I don't need that, because it's already running, as you can see in the terminal. It feels like there's some confusion between what's running locally and remotely.
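A quick way to separate "Ollama is running" from "Ollama is reachable from where Continue runs" (a sketch, assuming the default port; butterfly is this thread's laptop hostname):

```shell
# On the laptop, the root endpoint answers "Ollama is running":
#   curl http://localhost:11434/
# On the remote, run the same check against the laptop's hostname:
#   curl http://butterfly:11434/
# If the first works but the second doesn't, the extension (running
# remotely) can't see Ollama and will offer to install its own copy.
host=butterfly
port=11434
echo "curl http://${host}:${port}/"
```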
I don't understand how this is true. I just did some experimentation. If I remove Continue from the remote, it doesn't even show the Continue button on the sidebar when I have my remote project open.
@pmatos It does sound like you've installed it in the remote. In that case you'll just need to find a way to forward the Ollama endpoint. Also (and this is less relevant, assuming you have installed in the remote), I notice an https:// instead of http://; does changing this have any effect? localhost should generally use http.
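A sketch of both suggestions, assuming OpenSSH and the default Ollama port (user@remote-host is a placeholder):

```shell
# 1) From the laptop, reverse-forward the local Ollama so the remote
#    can reach it as localhost:11434 (run by hand, not here):
printf '%s\n' 'ssh -R 11434:localhost:11434 user@remote-host'

# 2) And use plain http for a local endpoint, not https:
apiBase='https://localhost:11434'
echo "http://${apiBase#https://}"   # -> http://localhost:11434
```

With the tunnel up, an extension installed on the remote can point its apiBase at http://localhost:11434 and reach the laptop's Ollama.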
Now you are saying I have it installed in the remote. To be honest, I am confused. At some point it was said that it always runs locally, even in remote projects, but as mentioned, I don't think that's true. The remote is a tiny board where I don't want to run LLMs. So if Continue runs on the remote, I will have to find a way for it to reach the Ollama running locally on my PC. I will do more testing tomorrow.
Yes, sorry for the confusion. What I said before was misleading: it runs locally "by default", but it is also possible to install it in the remote. And perhaps something even odder is happening and it was installed in the remote by default for you. That last part I am unsure of.
Ah yes, that makes sense. Maybe I misunderstood. Apologies from my side as well. :)
Before submitting your bug report
Relevant environment info
Description
I have a laptop ("butterfly") where ollama serve is running and models are pulled, but my project is remote. I configured tab completions with Ollama as suggested (config on butterfly), and yet I get failures connecting to http://butterfly:11434, which I assume is happening from the remote, since that's where the code lives. But from the remote this works:
Unfortunately, I haven't found a way to see debug logs for Continue.
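For reference, the tab-completion setup described above usually looks something like this in Continue's config.json (a sketch; the model name is an assumption, and the apiBase matches the endpoint from this report):

```json
{
  "tabAutocompleteModel": {
    "title": "Tab Autocomplete",
    "provider": "ollama",
    "model": "starcoder2:3b",
    "apiBase": "http://butterfly:11434"
  }
}
```

Whether that apiBase is resolved from the laptop or from the remote depends on where the extension is installed, which is the crux of this issue.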
To reproduce
No response
Log output