Inconsistent Model Specification Behavior #269

Open
zdaar opened this issue May 10, 2024 · 0 comments

zdaar commented May 10, 2024

Title: Inconsistent Model Specification Behavior in Open Interpreter

Describe the bug
When specifying a model with the --model flag in Open Interpreter, the 'groq/' provider prefix is not recognized; the default 'openai/' prefix is applied instead, which is confusing. Provider-prefix handling should be clearer.
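For illustration, the handling I would expect is roughly the following (a minimal sketch in Python, not Open Interpreter's actual code; the provider set and function name are made up for the example):

# Hypothetical sketch of the expected prefix handling, not Open Interpreter's real code.
KNOWN_PROVIDER_PREFIXES = {"openai", "groq", "anthropic"}

def normalize_model(model: str) -> str:
    # Keep an explicit, known provider prefix such as "groq/..." untouched.
    provider = model.split("/", 1)[0] if "/" in model else None
    if provider in KNOWN_PROVIDER_PREFIXES:
        return model  # e.g. "groq/llama3-8b-8192" stays as-is
    # Only fall back to the default "openai/" prefix when no provider is given.
    return f"openai/{model}"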

To Reproduce
Steps to reproduce the behavior:

  1. Open the terminal.
  2. Run the command interpreter --api_base "https://api.groq.com/openai/v1" --model "groq/Mixtral-8x7b-32768".
  3. Observe that the system appends 'openai/' to the model path instead of recognizing 'groq/'.

Expected behavior
The provider prefix should be handled correctly: Groq should be supported without having to specify its api_base, as in the examples below.

# This doesn't work, but is the desired usage
interpreter --model "groq/llama3-8b-8192" --api_key "key_here"
# This doesn't work either
interpreter --api_base "https://api.groq.com/openai/v1" --model "groq/llama3-8b-8192" --api_key "key_here"
# This works
interpreter --api_base "https://api.groq.com/openai/v1" --model "llama3-8b-8192" --api_key "key_here"
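For comparison, calling LiteLLM (which Open Interpreter uses under the hood) directly with the groq/ prefix resolves the provider without an explicit api_base. A minimal sketch, assuming an installed LiteLLM version with Groq support and a real key in place of "key_here":

# Sketch: the "groq/" prefix alone should be enough for LiteLLM to route to Groq.
import litellm

response = litellm.completion(
    model="groq/llama3-8b-8192",
    messages=[{"role": "user", "content": "Say hi"}],
    api_key="key_here",  # replace with a real Groq API key
)
print(response.choices[0].message.content)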

Screenshots
[screenshots attached to the original issue]
Desktop (please complete the following information):

  • OS: Windows 11
  • Python Version: 3.12

Additional context
