I had success running TheBloke's Mistral-7B-v0.1-AWQ and CodeLlama-7B-AWQ on an A6000 with 48 GB VRAM, restricted to ~8 GB VRAM with the following parameters:
nvidia-smi then shows around 8 GB of memory consumed by the Python process, so I hope it runs on the 3060 as well (you would need to omit the --gpu-memory-utilization flag, obviously).
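The exact parameters from the original run aren't shown above; for reference, a vLLM launch along these lines might look like the sketch below. The memory fraction (0.17 of 48 GB ≈ 8 GB) and the max sequence length are assumptions, not the poster's actual values:

```shell
# Serve an AWQ-quantized model via vLLM's OpenAI-compatible API server.
# --gpu-memory-utilization 0.17 caps usage at ~8 GB on a 48 GB A6000
# (drop this flag entirely on a 12 GB card like the 3060).
python -m vllm.entrypoints.openai.api_server \
  --model TheBloke/Mistral-7B-v0.1-AWQ \
  --quantization awq \
  --gpu-memory-utilization 0.17 \
  --max-model-len 4096
```

On the 3060, vLLM would then default to using 90% of available VRAM instead of the fixed 8 GB cap.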
Looking for help from 2 communities 😄 thx!