On our Discord, people often say: "I visited http://localhost:11434/ and it shows Ollama is running, but AnythingLLM is not showing any Ollama models."

The issue is that they entered http://localhost:11434/ in AnythingLLM instead of http://127.0.0.1:11434/. We mention this in our docs (https://docs.useanything.com/anythingllm-setup/llm-configuration/local/ollama), but that page is nested several folders deep: Setup > LLM Config > Local > Ollama.

New users might not search inside folders, so it would be good to have a page in the FAQ section about this.
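A likely reason the two URLs behave differently (an assumption, not confirmed in this issue): `localhost` can resolve to the IPv6 loopback `::1` as well as `127.0.0.1`, so if Ollama is bound only to the IPv4 loopback, a client that tries `::1` first may fail to connect. The sketch below shows how to inspect the resolution with Python's standard `socket` module:

```python
import socket

def resolve(host, port=11434):
    """Return the sorted set of IP addresses a hostname resolves to."""
    infos = socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP)
    return sorted({info[4][0] for info in infos})

# "localhost" may resolve to ::1 (IPv6) in addition to 127.0.0.1,
# depending on the OS hosts file; which address a client tries first
# varies, so connecting via "localhost" can fail intermittently if
# the server listens only on the IPv4 loopback.
print(resolve("localhost"))

# "127.0.0.1" is unambiguous: it is always the IPv4 loopback address,
# which is why the docs recommend it for the AnythingLLM base URL.
print(resolve("127.0.0.1"))
```

Running this on an affected machine should show whether `localhost` includes `::1`, which would explain the mismatch users report.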
resolved by #33