For model providers like Groq, Local, OpenRouter, etc., each model you might want to use through these services can have its own function-calling/tool-calling mechanism and quirks. Specifically, I just tried generating something with Groq + Llama3 and the XML response came back in a different format.

For now, I'm going to move away from providing direct solutions for things like Groq and instead make this similar to what we're doing with output_adapters: you can specify a custom AI provider in your code and implement the specific quirks of that model in it. Over time we'll find patterns that work with models like Llama3, Mistral, or Hermes2 that we can provide as bases.
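To make the idea concrete, here is a minimal sketch of what such a custom provider hook could look like. All names here (`CustomProvider`, `parse_tool_call`, the `<function_call>` tag shape) are hypothetical illustrations, not the library's actual API; the point is that each provider normalizes its model's idiosyncratic tool-call output into one common shape.

```python
import re
import xml.etree.ElementTree as ET

class CustomProvider:
    """Hypothetical custom AI provider: owns the parsing quirks of one
    model/service combo and returns tool calls in a common shape."""

    def parse_tool_call(self, raw: str):
        # Some Llama3-style replies wrap the call in an XML block mixed
        # with prose; pull out the first <function_call>...</function_call>.
        match = re.search(r"<function_call>.*?</function_call>", raw, re.DOTALL)
        if not match:
            raise ValueError("no tool call found in model output")
        root = ET.fromstring(match.group(0))
        name = root.findtext("name")
        args = {child.tag: child.text for child in root.find("arguments")}
        return name, args

# Usage: a chatty Groq + Llama3 reply still normalizes cleanly.
reply = """Sure, calling it now:
<function_call><name>get_weather</name>
<arguments><city>Paris</city></arguments></function_call>"""
print(CustomProvider().parse_tool_call(reply))
```

A model with different quirks would get its own subclass overriding `parse_tool_call`, which is what makes recurring patterns (Llama3, Mistral, Hermes2) easy to later extract into shared base providers.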