
How do you use large context? #6956

Closed Answered by segmond
segmond asked this question in Q&A

I'm running out of memory. After lots of experiments, I observed that besides the model itself using up RAM, we also need space for the KV cache and especially plenty for compute buffers. Command R+ also seems to be on the high side with its demands. It looks like I would need 200+ GB of VRAM to get 128k context.
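For rough planning, the KV cache of a typical transformer grows linearly with context length: about 2 (K and V) × n_layers × n_kv_heads × head_dim × n_ctx × bytes per element, with model weights and compute buffers on top of that. Below is a minimal sketch of that arithmetic; the Command R+-like shape numbers (64 layers, 8 KV heads, head dim 128, fp16 cache) are illustrative assumptions, not values read from the model file.

```python
# Rough KV-cache size estimate for a transformer with grouped-query attention.
# The architecture numbers below are illustrative assumptions, not read from any GGUF file.

def kv_cache_bytes(n_ctx: int, n_layers: int, n_kv_heads: int, head_dim: int,
                   bytes_per_elem: int = 2) -> int:
    """Bytes needed to hold K and V for every layer at the given context length."""
    return 2 * n_layers * n_kv_heads * head_dim * n_ctx * bytes_per_elem

# Assumed Command R+-like shape: 64 layers, 8 KV heads, head_dim 128, fp16 cache (2 bytes/elem).
for n_ctx in (8_192, 32_768, 131_072):
    gib = kv_cache_bytes(n_ctx, n_layers=64, n_kv_heads=8, head_dim=128) / 2**30
    print(f"n_ctx={n_ctx:>7}: ~{gib:.1f} GiB KV cache")
```

Whatever the exact architecture, the KV term scales in direct proportion to context length and to cache precision, so halving either halves that part of the budget; the compute buffers and the weights themselves are separate terms in the total.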

Replies: 1 comment

Answer selected by segmond