internvl-chat-v1.5-int8: error raised during inference — how should this be handled? #949
Comments
What is the full error message?
Here is the full error.
It looks like a problem with the bnb (bitsandbytes) library. Check the related bnb issues, e.g. TimDettmers/bitsandbytes#538 and oobabooga/text-generation-webui#379.
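Not part of the original thread, but if the failure does trace back to bitsandbytes, one quick sanity check (before digging into the linked issues) is whether the library even imports cleanly in the environment — import failures here usually point to a CUDA-runtime mismatch rather than a model problem. A minimal sketch; the helper name `bnb_status` is hypothetical:

```python
# Hypothetical diagnostic: report whether bitsandbytes is installed
# and whether it imports without CUDA-related errors.
import importlib.util


def bnb_status() -> str:
    """Return a short status string for the bitsandbytes installation."""
    spec = importlib.util.find_spec("bitsandbytes")
    if spec is None:
        return "bitsandbytes is not installed"
    try:
        import bitsandbytes as bnb
        return f"bitsandbytes {bnb.__version__} imported OK"
    except Exception as exc:  # e.g. missing or mismatched libcudart
        return f"bitsandbytes failed to import: {exc}"


if __name__ == "__main__":
    print(bnb_status())
```

If the import itself fails, fixing the bitsandbytes/CUDA setup (as discussed in the linked issues) would be the first step before retrying `swift infer`.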
Describe the bug
What the bug is and how to reproduce it, ideally with screenshots.
The model starts normally:
CUDA_VISIBLE_DEVICES=0 swift infer --model_type internvl-chat-v1_5-int8 --model_id_or_path /home/tione/notebook/community/scan/InternVL-Chat-V1-5-int8/ --dtype bf16
But an error is raised during inference.
internvl-chat-v1_5 starts and runs inference normally:
CUDA_VISIBLE_DEVICES=0 swift infer --model_type internvl-chat-v1_5 --model_id_or_path /home/tione/notebook/community/scan/InternVL-Chat-V1-5/ --dtype bf16
Your hardware and system info
Write your system info here, such as CUDA version, OS, GPU model, and torch version.
torch 2.1
Python 3.10
CUDA 12.1
Additional context
Add any other context about the problem here.