
[FEATURE] LOCAL EMBEDDING MODEL RUN #1117

Closed
oguzhanglb opened this issue May 16, 2024 · 1 comment
Labels
enhancement New feature or request

Comments

@oguzhanglb

Is your feature request related to a problem? Please describe.

Describe the solution you'd like

Hello, first of all, thank you for creating such a great library.

I couldn't find an example here that shows how to take an arbitrary embedding model and run it locally.

There are a few ONNX examples, but when I tried to implement them, I couldn't get them working properly.
Example:
https://github.com/langchain4j/langchain4j-examples/blob/main/other-examples/src/main/java/embedding/model/InProcessEmbeddingModelExamples.java
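
Roughly what I attempted, based on that example (a minimal sketch; it assumes the langchain4j-embeddings-all-minilm-l6-v2 dependency is on the classpath, and exact class/package names may differ between versions):

```java
import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.model.embedding.AllMiniLmL6V2EmbeddingModel;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.model.output.Response;

public class InProcessEmbeddingExample {

    public static void main(String[] args) {
        // Runs entirely in-process via ONNX Runtime; no external service or API key needed.
        // Requires the langchain4j-embeddings-all-minilm-l6-v2 dependency (class/package
        // names may differ between langchain4j versions).
        EmbeddingModel model = new AllMiniLmL6V2EmbeddingModel();

        Response<Embedding> response = model.embed("Hello, local embeddings!");
        System.out.println(response.content().vector().length); // 384 for all-MiniLM-L6-v2
    }
}
```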

If local execution is supported, do you have a more detailed example showing how to run a model locally?

Thank you in advance for your help.

Describe alternatives you've considered

Additional context

oguzhanglb added the enhancement label on May 16, 2024
oguzhanglb changed the title from [FEATURE] to [FEATURE] LOCAL EMBEDDING MODEL RUN on May 16, 2024
@langchain4j
Owner

Hi, what was the problem with ONNX?

You can run embedding models locally with Ollama, here is an example:
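
Roughly, it looks like this (a sketch, assuming the langchain4j-ollama module is on the classpath and a local Ollama server is running with an embedding model pulled, e.g. via `ollama pull nomic-embed-text`):

```java
import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.model.ollama.OllamaEmbeddingModel;
import dev.langchain4j.model.output.Response;

public class OllamaEmbeddingExample {

    public static void main(String[] args) {
        // The model runs locally inside the Ollama server (default port 11434);
        // langchain4j just talks to it over HTTP.
        EmbeddingModel model = OllamaEmbeddingModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("nomic-embed-text") // any embedding model pulled with `ollama pull`
                .build();

        Response<Embedding> response = model.embed("Hello, local embeddings!");
        System.out.println(response.content().vector().length);
    }
}
```

The model weights are downloaded and executed by Ollama on your machine, so nothing leaves your environment.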

I am closing this issue; next time, please open a discussion here or on Discord.
But feel free to respond here.
