
How to run inference on Mac studio (M2) #367

Closed
ngzk-tsubasa opened this issue May 6, 2024 · 1 comment

@ngzk-tsubasa

Hi guys,
I tried to run inference on my Mac Studio, which has an M2 chip, but the installation fails while building xformers:

File "/Users/user/anaconda3/envs/opensora/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 1783, in _write_ninja_file_and_compile_objects
_run_ninja_build(
File "/Users/user/anaconda3/envs/opensora/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 2123, in _run_ninja_build
raise RuntimeError(message) from e
RuntimeError: Error compiling objects for extension
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for xformers
Running setup.py clean for xformers
Failed to build xformers
ERROR: Could not build wheels for xformers, which is required to install pyproject.toml-based projects

Are there any other ways that I can perform inference on my device?
Thank you!
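For context, pip fails here because xformers compiles native C++/CUDA extension code during installation, which Apple-silicon Macs cannot build. Until the project supports M2, the usual pattern in other PyTorch codebases is to skip such extensions and select PyTorch's MPS (Metal) backend where available. A minimal sketch of that device selection — the `pick_device` helper and its torch-module parameter are hypothetical, not part of this repo:

```python
from types import SimpleNamespace


def pick_device(torch_mod) -> str:
    """Pick the best available inference device, given a torch-like module.

    Taking the module as a parameter keeps this sketch runnable even
    where PyTorch itself is not installed.
    """
    if torch_mod.cuda.is_available():
        return "cuda"
    # torch.backends.mps was added in PyTorch 1.12; absent on older builds
    mps = getattr(torch_mod.backends, "mps", None)
    if mps is not None and mps.is_available():
        return "mps"
    return "cpu"


# Example: a stand-in for a machine with an Apple-silicon GPU but no CUDA
fake_torch = SimpleNamespace(
    cuda=SimpleNamespace(is_available=lambda: False),
    backends=SimpleNamespace(mps=SimpleNamespace(is_available=lambda: True)),
)
print(pick_device(fake_torch))  # mps
```

With real PyTorch you would call `pick_device(torch)` and pass the result to `model.to(device)`. Note that MPS only lets plain PyTorch ops run on the Apple GPU; it does not make xformers itself build or run.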

@zhengzangw
Collaborator

M2 is currently not supported.
