
Upgrade the CUDA version to support the Windows build of flash-attention #78

Open
1 task
SkyblueMr opened this issue Mar 12, 2024 · 1 comment

@SkyblueMr

Describe the feature

The Windows builds of flash-attention that can currently be compiled depend on cu121 + py310 + torch2.1, while InternEvo only supports cu118. The two libraries therefore conflict, and training on Windows is not possible. Are there plans to upgrade to cu121 in the future? Thanks!
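To make the version conflict concrete, here is a minimal sketch of an environment check (a hypothetical helper, not part of InternEvo or flash-attention; it assumes the cu121 wheel targets CUDA 12.1) that compares the installed PyTorch CUDA build against what the flash-attention Windows wheel expects:

```python
# Hypothetical pre-flight check: confirm the installed PyTorch CUDA build
# matches the CUDA version the flash-attention Windows wheel was built against.
import torch

# Assumption: the Windows wheel described above (cu121+py310+torch2.1)
# targets CUDA 12.1.
EXPECTED_CUDA = "12.1"

cuda_version = torch.version.cuda  # e.g. "11.8" for a cu118 build of torch
if cuda_version != EXPECTED_CUDA:
    raise RuntimeError(
        f"PyTorch was built against CUDA {cuda_version}, but the "
        f"flash-attention wheel targets CUDA {EXPECTED_CUDA}; "
        f"its binary kernels will fail to load."
    )
print("CUDA versions match; the flash-attention wheel should import cleanly.")
```

With a cu118 build of PyTorch this raises immediately, which is the conflict described above: the flash-attention binary kernels and the torch runtime must be built against the same CUDA toolkit.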

Would you like to implement this feature yourself?

  • I would like to implement this feature myself and contribute the code to InternLM!
@SkyblueMr added the enhancement (New feature or request) label Mar 12, 2024
@sunpengsdu
Contributor

We'll plan for this.
