
Show tokens consumption and estimated cost for chat and multi-chat playground #410

Open
brnaba-aws opened this issue Mar 8, 2024 · 1 comment
Labels: enhancement (New feature or request), help wanted (Extra attention is needed)

Comments


brnaba-aws commented Mar 8, 2024

Currently, there is no indication of the cost associated with calling an LLM.
It would be useful to see how much each message contributes to the cost of invoking an LLM, and even more useful when comparing models with one another.
In the mockup below, the cost could be displayed at the bottom, but it could also be part of the metadata section.

[Mockup screenshot: chat playground response with the estimated cost shown at the bottom]
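For context, here is a minimal sketch of what such a feature could compute, assuming the backend reports input/output token counts per invocation. The model IDs and per-1K-token prices below are placeholders for illustration only; real values would come from the provider's price list.

```typescript
// Hypothetical per-1K-token prices (USD). Real values should be looked up
// from the model provider's current price list, not hard-coded like this.
interface ModelPricing {
  inputPer1kTokens: number;
  outputPer1kTokens: number;
}

const EXAMPLE_PRICING: Record<string, ModelPricing> = {
  "example-model-a": { inputPer1kTokens: 0.003, outputPer1kTokens: 0.015 },
  "example-model-b": { inputPer1kTokens: 0.0005, outputPer1kTokens: 0.0015 },
};

// Token usage reported for a single request/response exchange.
interface MessageUsage {
  modelId: string;
  inputTokens: number;
  outputTokens: number;
}

// Estimated cost of a single message exchange.
function estimateMessageCost(
  usage: MessageUsage,
  pricing: Record<string, ModelPricing> = EXAMPLE_PRICING
): number {
  const price = pricing[usage.modelId];
  if (!price) return 0; // unknown model: no estimate rather than a wrong one
  return (
    (usage.inputTokens / 1000) * price.inputPer1kTokens +
    (usage.outputTokens / 1000) * price.outputPer1kTokens
  );
}

// Running total for a whole conversation, e.g. shown alongside the chat.
function estimateConversationCost(messages: MessageUsage[]): number {
  return messages.reduce((total, m) => total + estimateMessageCost(m), 0);
}
```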
bigadsoleiman changed the title from "add execution cost for chat and multi-chat playground" to "Show tokens consumption and estimated cost for chat and multi-chat playground" on Mar 18, 2024
bigadsoleiman added the enhancement and help wanted labels on Mar 18, 2024
@ystoneman

Great idea, @brnaba-aws!

  1. Do you think it would be useful to toggle this "cost insights" feature off and on in the settings?
  2. Do you think it would be useful to show both the individual message cost (like in your awesome mockup) and the overall conversation cost?
  3. Do you have any favorite frameworks/libraries that you'd recommend for implementing components of this feature?

Please feel free to open a draft pull request if you'd like to work on this; all contributions are welcome, and I'm happy to support you in that process.
