
Fix the error "module 'torch._C' has no attribute '_cuda_resetPeakMemoryStats'" in CPU environments #914

Merged: 1 commit into modelscope:main on May 22, 2024

Conversation

tiandiweizun
Contributor

PR type

  • Bug Fix
  • New Feature
  • Document Updates
  • More Models or Datasets Support

PR information

In a CPU-only environment there is no CUDA, but the `get_max_cuda_memory` method is still called, which raises the error above.
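
Background: `torch.cuda.reset_peak_memory_stats()` dispatches to `torch._C._cuda_resetPeakMemoryStats`, which does not exist in a CPU-only build, so the call has to be guarded. A minimal sketch of the idea, assuming a helper of roughly this shape (the actual function in swift's utils may have a different signature):

```python
import torch


def get_max_cuda_memory(device=None) -> float:
    """Return peak CUDA memory usage in GiB and reset the peak counter."""
    if not torch.cuda.is_available():
        # CPU-only runtime: torch._C has no _cuda_resetPeakMemoryStats,
        # so skip the CUDA bookkeeping instead of raising AttributeError.
        return 0.0
    mem = torch.cuda.max_memory_reserved(device=device)
    torch.cuda.reset_peak_memory_stats(device=device)
    return mem / 1024 ** 3
```

With the guard in place, CPU-only runs simply report 0 instead of crashing.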

Jintao-Huang merged commit c8f6153 into modelscope:main May 22, 2024
2 checks passed
tastelikefeet added a commit to tastelikefeet/swift that referenced this pull request on May 24, 2024
…3_paligemma

* commit '20bc628746772836fe3838e16e87fb27c39b5ec8':
  fix val_dataset (modelscope#992)
  update custom_val_dataset (modelscope#991)
  [TorchAcc][Experimental] Integrate more model in torchacc (modelscope#683)
  fix cpu 'torch._C' has no attribute '_cuda_resetPeakMemoryStats' (modelscope#914)
  refactor readme web-ui (modelscope#983)
support transformers==4.41 (modelscope#979)
  support more models (modelscope#971)
tastelikefeet added a commit to tastelikefeet/swift that referenced this pull request on May 28, 2024
* main: (23 commits)
  fix gr limit (modelscope#1016)
  fix minicpm-v (modelscope#1010)
  fix cogvlm2 history (modelscope#1005)
  Updated a link in Command-line-parameters.md (modelscope#1001)
  fix template example copy (modelscope#1003)
  Feat/phi3 paligemma (modelscope#998)
  fix pt deploy lora (modelscope#999)
  fix args (modelscope#996)
  fix val_dataset (modelscope#992)
  update custom_val_dataset (modelscope#991)
  [TorchAcc][Experimental] Integrate more model in torchacc (modelscope#683)
  fix cpu 'torch._C' has no attribute '_cuda_resetPeakMemoryStats' (modelscope#914)
  refactor readme web-ui (modelscope#983)
  support transformers==4.41 (modelscope#979)
  support more models (modelscope#971)
  Fix minicpm device map (modelscope#978)
  fix typing (modelscope#974)
  fix vllm eos_token (modelscope#973)
  Support minicpm-v-v2_5-chat (modelscope#970)
  support cogvlm2-en-chat-19b (modelscope#967)
  ...
Labels: None yet
Projects: None yet
Linked issues that merging may close: None yet
2 participants