Replies: 1 comment 1 reply
-
Welcome back! Which blog post led you back? Yes, since v2.0, a local embedding model is the default. And in v2.1, you can configure local chat model providers like Ollama. With these two updates, you can run Smart Connections with 100% local models, without anything sent to a remote API service 🌼
-
Hi!
I did some experimentation with Smart Connections during the fall. Since I have a mix of notes that can and cannot be shared with an LLM/embedding model running in the cloud, I never could put it to use in my main vault and eventually forgot about the plugin. But I was just reminded of it in a blog post I read, and came back here to see what has happened since. :)
Is it correct that Smart Connections can now be run with local models only, without anything being sent to a remote API/service? Is that even the default setting?
/Anders