是否已有关于该错误的issue或讨论? | Is there an existing issue / discussion for this?
我已经搜索过已有的issues和讨论 | I have searched the existing issues / discussions
该问题是否在FAQ中有解答? | Is there an existing answer for this in FAQ?
我已经搜索过FAQ | I have searched FAQ
当前行为 | Current Behavior
From the script found in the repo:
```python
elif quant_type == "int4":
    # please install AutoGPTQ following the readme to use quantization
    from auto_gptq import AutoGPTQForCausalLM
    model = AutoGPTQForCausalLM.from_quantized(
        "Qwen/Qwen-VL-Chat-Int4",
        device="cuda:0",
        trust_remote_code=True,
        use_safetensors=True,
        use_flash_attn=use_flash_attn,
    ).eval()
```
Running the test fails with:

```
FileNotFoundError: Could not find a model in Qwen-VL-Chat-Int4 with a name in model.safetensors. Please specify the argument model_basename to use a custom file name.
```
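The error message suggests passing `model_basename` explicitly. A possible workaround, assuming the checkpoint is already downloaded to a local directory and contains a single `.safetensors` file with a non-default name (the helper below is my own sketch, not part of AutoGPTQ):

```python
from pathlib import Path

def infer_model_basename(model_dir: str) -> str:
    """Guess the value to pass as `model_basename` by looking for a
    single *.safetensors file in the local checkpoint directory."""
    candidates = sorted(Path(model_dir).glob("*.safetensors"))
    if len(candidates) != 1:
        raise FileNotFoundError(
            f"expected exactly one .safetensors file in {model_dir}, "
            f"found {len(candidates)}"
        )
    # filename without the .safetensors suffix
    return candidates[0].stem

# Untested sketch of the adjusted call:
# model = AutoGPTQForCausalLM.from_quantized(
#     "Qwen-VL-Chat-Int4",
#     model_basename=infer_model_basename("Qwen-VL-Chat-Int4"),
#     device="cuda:0",
#     trust_remote_code=True,
#     use_safetensors=True,
# ).eval()
```

If the directory holds several shards, the helper raises instead of guessing, since `from_quantized` expects one basename.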
I found someone who ran into the same problem:
AutoGPTQ/AutoGPTQ#319
I looked at the source code: it loads a single model file. Why not 5? And why does the author's own test script work? Loading the model and running inference with transformers works fine for me.
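My reading of why it fails (a simplified sketch of single-file lookup logic, not AutoGPTQ's actual code): only an exact `{model_basename}.safetensors` match is accepted, so a checkpoint sharded into several files is never found.

```python
from pathlib import Path

def resolve_quantized_checkpoint(model_dir: str, model_basename: str = "model") -> Path:
    """Simplified sketch: accept only an exact `{model_basename}.safetensors`
    file, so sharded checkpoints (model-00001-of-00005.safetensors, ...)
    trigger a FileNotFoundError like the one reported above."""
    candidate = Path(model_dir) / f"{model_basename}.safetensors"
    if not candidate.is_file():
        raise FileNotFoundError(
            f"Could not find a model in {model_dir} with a name in "
            f"{candidate.name}. Please specify the argument model_basename "
            f"to use a custom file name."
        )
    return candidate
```

If the real resolution works like this, that would explain why transformers (which reads the shard index) loads the model while `from_quantized` does not.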
期望行为 | Expected Behavior
None
复现方法 | Steps To Reproduce
None
运行环境 | Environment
备注 | Anything else?
None