Since the Ollama API returns the raw tokens of the context, counting these to display a raw context length in the generation info could be useful for anyone testing large-context models or wanting to see exactly how many tokens the model used.

Edit: Misunderstood the Ollama API.
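As a rough sketch of the idea: a non-streaming Ollama `/api/generate` response includes a `context` field (an array of token IDs) alongside counters like `prompt_eval_count` and `eval_count`, so the displayed context length could simply be the length of that array. The helper and sample payload below are illustrative, not the project's actual implementation:

```python
# Hypothetical sketch: deriving a context length from an Ollama
# /api/generate JSON response. `context` holds raw token IDs.

def context_length(response: dict) -> int:
    """Return the number of tokens in the returned context, if present."""
    return len(response.get("context", []))

# Payload shaped like a truncated /api/generate response
# (token ID values are made up for illustration):
sample = {
    "model": "llama2",
    "response": "Hello!",
    "context": [1, 15043, 29991, 2],
    "prompt_eval_count": 2,
    "eval_count": 2,
}

print(context_length(sample))  # 4
```

Note that `context` is intended for feeding back into the next request to keep a conversation going, so its length may not map one-to-one onto "tokens the model used"; the `prompt_eval_count`/`eval_count` fields may be the more direct measure.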