Backends Supported with llama.cpp #6998
Unanswered
lalith1403 asked this question in Q&A
In addition to BLAS, llama.cpp supports various backends such as Vulkan, Kompute, and SYCL. I'd like to understand which backend is most suitable for low-precision (quantized) inference on CPUs.
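For context, this is a sketch of how those backends are typically selected at build time. The flag names assume a recent llama.cpp CMake setup (`GGML_*` options; older revisions used `LLAMA_*` prefixes), and the model filenames are placeholders:

```shell
# Default build: the CPU backend is always compiled in, and it is
# what executes low-precision (quantized) inference on CPUs.
cmake -B build
cmake --build build --config Release

# GPU/accelerator backends are opt-in via CMake flags, e.g.:
cmake -B build -DGGML_VULKAN=ON    # Vulkan backend
cmake -B build -DGGML_SYCL=ON \
      -DCMAKE_C_COMPILER=icx -DCMAKE_CXX_COMPILER=icpx   # SYCL (Intel oneAPI)

# Quantize a model to 4-bit and run it on the CPU backend
# (model filenames here are placeholders):
./build/bin/llama-quantize model-f16.gguf model-q4_0.gguf q4_0
./build/bin/llama-cli -m model-q4_0.gguf -t 8 -p "Hello"
```

As I understand it, the Vulkan, Kompute, and SYCL backends mainly target GPUs/accelerators, so for pure CPU inference the relevant path is the built-in CPU backend with a quantized model; I'd appreciate confirmation or correction.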