
[FAQ] Ollama not showing Chat Models on LLM preferences #22

Closed
ShadowArcanist opened this issue May 15, 2024 · 1 comment

Comments

@ShadowArcanist
Sponsor Contributor

On our Discord, people often say they visited http://localhost:11434/ and it shows that Ollama is running, but AnythingLLM is not showing any Ollama models.

The issue is that they entered http://localhost:11434/ in AnythingLLM instead of http://127.0.0.1:11434/. We mention this in our docs at https://docs.useanything.com/anythingllm-setup/llm-configuration/local/ollama, but that page is nested several folders deep: Setup > LLM Config > Local > Ollama.

New users might not search inside folders, so it would be good to have a page in the FAQ section about this. A quick way to check the right URL is sketched below.
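For anyone running into this, a quick sanity check is to query Ollama's model list directly on the loopback IP before entering the URL in AnythingLLM. This is a minimal sketch, assuming Ollama's standard `GET /api/tags` endpoint on its default port 11434:

```python
# Minimal sketch: verify Ollama is reachable at http://127.0.0.1:11434/
# and list the locally installed models (the ones AnythingLLM should show).
# Assumes Ollama's standard GET /api/tags endpoint on the default port.
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434"  # use 127.0.0.1, not localhost

with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
    data = json.load(resp)

for model in data.get("models", []):
    print(model["name"])  # e.g. "llama3:latest" (example output)
```

If this lists your models but AnythingLLM still shows none, the base URL entered in LLM preferences is most likely the http://localhost:11434/ form rather than http://127.0.0.1:11434/.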

@timothycarambat
Member

resolved by #33
