
Ability to add more models + Compare outputs #76

Open
arnavmehta7 opened this issue Jun 8, 2023 · 1 comment


@arnavmehta7

Proposal

  • Compare the outputs of a single prompt across many models
  • Ability to add more models, especially custom ones, via something as simple as a plain API request to a hosted endpoint (see the sketch after this list)
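
A minimal sketch of what I have in mind, assuming a custom provider is nothing more than a hosted HTTP endpoint that takes a prompt and returns a completion. The config shape, request body, and `{ completion }` response below are all hypothetical, not an existing API:

```ts
// Hypothetical shape for registering a custom model: just a name,
// a hosted endpoint, and an optional auth token. Only illustrates
// the "plain API request to a hosted endpoint" idea.
interface CustomModelConfig {
  name: string;      // display name, e.g. "my-finetuned-model"
  endpoint: string;  // hosted inference endpoint (assumed to accept POST)
  apiKey?: string;   // optional bearer token
}

async function callCustomModel(
  config: CustomModelConfig,
  prompt: string
): Promise<string> {
  const response = await fetch(config.endpoint, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      ...(config.apiKey ? { Authorization: `Bearer ${config.apiKey}` } : {}),
    },
    body: JSON.stringify({ prompt }),
  });
  if (!response.ok) {
    throw new Error(`${config.name} failed with status ${response.status}`);
  }
  // Assumes the endpoint responds with { completion: string }.
  const data = (await response.json()) as { completion: string };
  return data.completion;
}
```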

Use-Case

New prompt engineers and companies have to iterate constantly, switching between models to test the outputs of similar prompts. Instead, a single interface could be developed in which new models can be added and their results compared in one go, as in the sketch below.
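
A rough sketch of the comparison side, assuming every model (built-in or custom, e.g. `callCustomModel` above) is exposed as a plain prompt-to-string function; all names here are illustrative:

```ts
// Fan one prompt out to every registered model in parallel and
// collect the outputs side by side for a comparison view.
type ModelCaller = (prompt: string) => Promise<string>;

interface ComparisonRow {
  model: string;
  output?: string;
  error?: string;
}

async function comparePrompt(
  prompt: string,
  models: Record<string, ModelCaller>
): Promise<ComparisonRow[]> {
  const entries = Object.entries(models);
  const settled = await Promise.allSettled(
    entries.map(([, call]) => call(prompt))
  );
  // One row per model, whether it succeeded or failed.
  return settled.map((result, i) => ({
    model: entries[i][0],
    ...(result.status === "fulfilled"
      ? { output: result.value }
      : { error: String(result.reason) }),
  }));
}
```

Using `Promise.allSettled` means one slow or failing model never blocks the rest, so the comparison view can still render every other output.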

Is this a feature you are interested in implementing yourself?

No

@arielweinberger
Member

Thanks for following up, @arnavmehta7. As discussed, I'll leave this open to see if the community shows appetite for this feature.

My bet is that we'll see more users interested in this, with new AI models popping up all the time (including self-hosted ones).
