Replies: 2 comments 3 replies
-
We can try to add this as an option, but we need a reference to the algorithm that performs the dimensionality reduction.
-
This is how the embeddings are resized using Sentence Transformers or HF Transformers: https://huggingface.co/nomic-ai/nomic-embed-text-v1.5#sentence-transformers

I wrote a C++ implementation of this for GPT4All (which is built on llama.cpp); the important part is here: https://github.com/nomic-ai/gpt4all/blob/ac498f79ac67640281bbf85137a5c632c59400ef/gpt4all-backend/llamamodel.cpp#L871-L885

The easiest way to leverage that implementation is to …
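As a rough illustration of what that C++ code does, the resize boils down to three steps: layer-normalize the full embedding (without learned scale/bias), truncate to the first N components, then L2-normalize the result. The sketch below is a standalone approximation of those steps, not the GPT4All code itself; the function name and the epsilon value are illustrative assumptions.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Matryoshka-style resize (sketch): layer norm over the full vector,
// truncate to target_dim, then L2-normalize the truncated vector.
std::vector<float> resize_embedding(const std::vector<float> &emb, size_t target_dim) {
    const size_t n = emb.size();

    // 1. Layer norm over all n components (no learned gamma/beta).
    double mean = 0.0;
    for (float v : emb) mean += v;
    mean /= double(n);

    double var = 0.0;
    for (float v : emb) var += (double(v) - mean) * (double(v) - mean);
    var /= double(n);

    const double denom = std::sqrt(var + 1e-5); // epsilon is an assumption

    // 2. Keep only the first target_dim normalized components.
    std::vector<float> out(target_dim);
    for (size_t i = 0; i < target_dim; i++) {
        out[i] = float((double(emb[i]) - mean) / denom);
    }

    // 3. L2-normalize the truncated vector so cosine similarity still works.
    double norm = 0.0;
    for (float v : out) norm += double(v) * double(v);
    norm = std::sqrt(norm);
    if (norm > 0.0) {
        for (float &v : out) v = float(double(v) / norm);
    }
    return out;
}
```

Note that the layer-norm statistics are computed over the full 768 dimensions before truncation; slicing first and normalizing after would give different (worse) results.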
-
I'm using nomic-embed-text-v1.5 as my embedding model. It works, but it returns the full 768 dimensions. I'd like to reduce this to 128 dimensions, but I don't see a way to do that via llama.cpp.