
Internal server error for role orders in LLM inference #113

Open
kaxap opened this issue Apr 30, 2024 · 0 comments
kaxap commented Apr 30, 2024

Hi,

I am getting 500 errors with the description

chat messages must alternate roles between 'user' and 'assistant'.  Message may have a leading 'system' role message

and

"Internal Server Error\",\"status\":500,\"detail\":\"Last message role should be 'user'

I think this order validation is unnecessary. Most models (e.g. Mixtral, Llama 3, Gemma) handle any order of roles without issue. This can be verified on Groq's playground: https://console.groq.com/playground

This validation currently breaks some existing patterns, such as continuations (where the assistant's own message is the last one, without the user explicitly saying "continue") and, in some cases, agent loops that append observation messages.
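For reference, the rule implied by the two error messages can be sketched as follows. This is a hypothetical reconstruction of the server-side check, not the project's actual code; the function name `validate_roles` and the message shapes are assumptions.

```python
def validate_roles(messages):
    """Hypothetical sketch of the validation implied by the error messages:
    an optional leading 'system' message, then strict user/assistant
    alternation, ending with a 'user' message."""
    roles = [m["role"] for m in messages]

    # An optional leading 'system' message is allowed.
    if roles and roles[0] == "system":
        roles = roles[1:]

    # Remaining roles must strictly alternate, starting with 'user'.
    for i, role in enumerate(roles):
        expected = "user" if i % 2 == 0 else "assistant"
        if role != expected:
            raise ValueError(
                "chat messages must alternate roles between "
                "'user' and 'assistant'"
            )

    # The final message must come from the user.
    if roles and roles[-1] != "user":
        raise ValueError("Last message role should be 'user'")


# A continuation pattern (last message from the assistant) fails this check,
# which is exactly the pattern the validation breaks:
continuation = [
    {"role": "user", "content": "Write a long story."},
    {"role": "assistant", "content": "Once upon a time"},
]
```

Under this rule, the `continuation` list above is rejected with the "Last message role should be 'user'" error, even though the underlying models accept it.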
