Phi3 fails to correctly use TEMPLATES #621
Ollama needs to be sent the `raw` flag set to `true`. The catch is that LangChain will not send it, so here is what I do, which seems to work for passing the template. Modify the `ollama.py` file in `langchain_community.llms` to this:
Then call it like so:
I am importing the model details from a JSON file, so for reference here is that:
Use at your own risk, though. Let me know how it works out.
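For reference, the raw-mode behavior described above can be sketched by building the request payload Ollama expects directly (a minimal sketch: `build_raw_request` and the hard-coded template are illustrative names, though the `/api/generate` fields `model`, `prompt`, `raw`, and `stream` are part of Ollama's documented API):

```python
import json

# Default local Ollama endpoint (assumption: a stock install).
OLLAMA_GENERATE_URL = "http://localhost:11434/api/generate"

def build_raw_request(model: str, prompt: str, template: str) -> dict:
    """Build an /api/generate payload with raw=True so Ollama does not
    apply its own prompt template on top of the one rendered here."""
    return {
        "model": model,
        # With raw=True the prompt is sent to the model verbatim,
        # so the template must be rendered client-side.
        "prompt": template.format(prompt=prompt),
        "raw": True,
        "stream": False,
    }

# Example: a hypothetical Phi-3-style template rendered client-side.
payload = build_raw_request(
    model="phi3",
    prompt="Why is the sky blue?",
    template="<|user|>\n{prompt}<|end|>\n<|assistant|>\n",
)
print(json.dumps(payload, indent=2))
```

The payload can then be POSTed to the endpoint with any HTTP client; the point is that `raw: true` must reach Ollama for the client-side template to take effect.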
I'm trying to make Phi3 (3.8B) work. The model seems to have no problems using tools, but uses (different?) system messages, which leads to the chain not ending when it is supposed to:
This is the Modelfile:
Using crewai 0.30.8
I copied the system/prompt/response templates from #554, but it seems that they are either not used, or just don't work with Phi3.
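For comparison, Phi-3's published instruct format uses `<|system|>`, `<|user|>`, `<|assistant|>`, and `<|end|>` tokens, so a Modelfile `TEMPLATE` along these lines would be expected (a sketch based on that published prompt format, not the poster's actual Modelfile):

```
TEMPLATE """{{ if .System }}<|system|>
{{ .System }}<|end|>
{{ end }}{{ if .Prompt }}<|user|>
{{ .Prompt }}<|end|>
{{ end }}<|assistant|>
"""
PARAMETER stop <|end|>
```

If the model emits `<|end|>` but the stop parameter is missing, generation can run past the intended turn boundary, which would match the "chain not ending" symptom described above.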