Issues: mlc-ai/mlc-llm
[Question] When running the mlc_chat command, it reports that the tvm module cannot be found.
question · #2356 · opened May 17, 2024 by wangmiaojun
[Question] How do you convert .bin files to wasm? Also, where are TVM_HOME and MLC_HOME located?
question · #2355 · opened May 17, 2024 by justrach
[Question] Single forward pass through ChatModule
question · #2354 · opened May 17, 2024 by caenopy
[Feature Request] Implement AttentionStore
feature request · #2353 · opened May 16, 2024 by kripper
[Doc] Can't install MLC
documentation · #2352 · opened May 16, 2024 by abpani
[Question] mlc_llm serve fails with --speculative-mode; does it require certain hardware?
question · #2350 · opened May 16, 2024 by 0xDEADFED5
[Question] Can MLC quantize multimodal models?
question · #2349 · opened May 16, 2024 by LJ-Hao
[Bug] Can't finish the build process on Windows
bug · #2346 · opened May 15, 2024 by jeanhubdesv
[Question] Cannot get the chat CLI working; it throws an error after cloning the model
question · #2339 · opened May 14, 2024 by BeytoA
[Question] Deployment of Pruned Models
question · #2338 · opened May 14, 2024 by qianjyM
Could not find org.apache.tvm:tvm-android:0.1.0.
question · #2333 · opened May 13, 2024 by viaowp
[Question] Parallel computations using multiple streams?
question · #2332 · opened May 13, 2024 by taegeonum
[Bug] InternalError: Check failed: (res == VK_SUCCESS) is false: Vulkan Error, code=-4: VK_ERROR_DEVICE_LOST
bug · #2328 · opened May 11, 2024 by aaaaaad333
[Tracking] Create a CPU-Compatible PagedKVCache
status: tracking · #2325 · opened May 11, 2024 by tqchen · 1 task
[Tracking] Sentence Embedding Model
status: tracking · #2324 · opened May 11, 2024 by tqchen · 1 task
[Bug] mlc_llm package failed once, and I can't run it again
bug · #2323 · opened May 11, 2024 by CallMeTkt
[Feature Request] Medusa support
feature request · #2319 · opened May 10, 2024 by EmilioZhao
[Bug] Support multiple "system" messages in REST API
bug · #2311 · opened May 10, 2024 by bayley
[Bug] mlc-llm not working; tvm check returns None
bug · #2301 · opened May 9, 2024 by CallMeTkt
[Bug] REST server doesn't work on V100 (SM70): cudaErrorNoKernelImageForDevice (but chat works)
bug · #2296 · opened May 8, 2024 by bayley