
Add more settings to CompletionOptions #13

Open
sestinj opened this issue Sep 4, 2023 · 0 comments
Labels: enhancement (New feature or request), good first issue (Good for newcomers)

sestinj commented Sep 4, 2023

Every LLM completion is passed a set of parameters in the CompletionOptions object.

We currently support common settings like max_tokens, temperature, top_p, top_k, frequency_penalty, and presence_penalty, but are missing things like tail-free sampling or certain mirostat parameters.

Some model providers, like llama.cpp, accept these parameters, so supporting them is only a matter of allowing them to be passed through.

  1. Update CompletionOptions to include the new parameter.
  2. Many model providers (all are in the core/llm/llms folder) have a function called _convertArgs that turns the CompletionOptions object into the request body expected by their API. For the providers that support this parameter, make sure it gets passed in the request. For the other providers, check that this extraneous parameter does not get sent in the request.
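The two steps above can be sketched as follows. This is a hedged illustration, not the actual Continue source: the field names (tfsZ, mirostat, mirostatTau) and the standalone convertArgs function are assumptions chosen to mirror llama.cpp-style request bodies.

```typescript
// Step 1 (sketch): extend CompletionOptions with the new sampling settings.
// The tail-free sampling and mirostat fields below are hypothetical additions.
interface CompletionOptions {
  maxTokens?: number;
  temperature?: number;
  topP?: number;
  topK?: number;
  tfsZ?: number;        // hypothetical: tail-free sampling cutoff
  mirostat?: number;    // hypothetical: 0 = off, 1 = Mirostat, 2 = Mirostat 2.0
  mirostatTau?: number; // hypothetical: target entropy for mirostat
}

// Step 2 (sketch): a _convertArgs-style mapper for a provider that supports
// these parameters. Undefined fields are stripped before the request is built,
// so providers that lack support never receive the extraneous parameter.
function convertArgs(options: CompletionOptions): Record<string, unknown> {
  const body: Record<string, unknown> = {
    n_predict: options.maxTokens,
    temperature: options.temperature,
    top_p: options.topP,
    top_k: options.topK,
    tfs_z: options.tfsZ,
    mirostat: options.mirostat,
    mirostat_tau: options.mirostatTau,
  };
  // Drop keys whose value is undefined so they are not sent in the request.
  return Object.fromEntries(
    Object.entries(body).filter(([, value]) => value !== undefined),
  );
}
```

With this shape, adding a parameter for one provider is a local change to its mapper; other providers are unaffected because they simply never read the new field.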
@sestinj sestinj added the enhancement New feature or request label Sep 4, 2023
@TyDunn TyDunn added the good first issue Good for newcomers label Jan 3, 2024
Projects: Status: Good First Issues (Code)
Development: no branches or pull requests

2 participants