I tried starting with the conda install from installation.md, but it's incomplete. I noticed this while trying to run one of the examples from the README:
```
(hf) [host:~] python
Python 3.11.9 | packaged by conda-forge | (main, Apr 19 2024, 18:36:13) [GCC 12.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from intel_extension_for_transformers.neural_chat import build_chatbot
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/Anaconda3-2024.02-1/envs/hf/lib/python3.11/site-packages/intel_extension_for_transformers/neural_chat/__init__.py", line 18, in <module>
    from .config import PipelineConfig
  File "/opt/Anaconda3-2024.02-1/envs/hf/lib/python3.11/site-packages/intel_extension_for_transformers/neural_chat/config.py", line 24, in <module>
    from .utils.common import get_device_type
  File "/opt/Anaconda3-2024.02-1/envs/hf/lib/python3.11/site-packages/intel_extension_for_transformers/neural_chat/utils/common.py", line 18, in <module>
    import torch
ModuleNotFoundError: No module named 'torch'
```
It seems you also have to install PyTorch yourself (and specifically the one from the intel channel, otherwise libomp.so gets borked). Even then, more things seem to be missing:
```
(hf) [host:~] conda install -c intel pytorch
...
(hf) [host:~] python
Python 3.11.9 | packaged by conda-forge | (main, Apr 19 2024, 18:36:13) [GCC 12.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from intel_extension_for_transformers.neural_chat import build_chatbot
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/Anaconda3-2024.02-1/envs/hf/lib/python3.11/site-packages/intel_extension_for_transformers/neural_chat/__init__.py", line 29, in <module>
    from .server.neuralchat_server import NeuralChatServerExecutor
  File "/opt/Anaconda3-2024.02-1/envs/hf/lib/python3.11/site-packages/intel_extension_for_transformers/neural_chat/server/__init__.py", line 22, in <module>
    from .neuralchat_server import NeuralChatServerExecutor
  File "/opt/Anaconda3-2024.02-1/envs/hf/lib/python3.11/site-packages/intel_extension_for_transformers/neural_chat/server/neuralchat_server.py", line 26, in <module>
    import uvicorn
ModuleNotFoundError: No module named 'uvicorn'
```
I kept installing missing things (yacs, fastapi, shortuuid) but gave up.
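In case it helps anyone else triage this, here's a small sketch that checks which of the modules I hit are actually importable before attempting `build_chatbot`. The candidate list is just what I ran into by trial and error, not an official dependency list:

```python
import importlib.util

# Modules the import chain complained about so far (torch first, then
# uvicorn, then the ones found one-by-one); almost certainly incomplete.
candidates = ["torch", "uvicorn", "yacs", "fastapi", "shortuuid"]

# find_spec returns None for a top-level module that isn't installed,
# without actually importing (and thus without triggering side effects).
missing = [name for name in candidates if importlib.util.find_spec(name) is None]

if missing:
    print("still missing:", ", ".join(missing))
else:
    print("all candidate modules found")
```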
Is there something I'm missing?
Ah... I'm starting to understand. It wasn't obvious to me that I needed the separate neural-chat stuff. I wish there were a map, because it looks like much of this is interrelated (neural-chat, neural-speed, extension-for-pytorch, extension-for-transformers, etc.).
It also looks like there's no way to get everything installed purely from the conda/intel channels; eventually you have to resort to pip.
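For the record, the mixed conda + pip sequence that got me furthest looked roughly like this. The pip package names are just the modules that failed to import above, not a vetted requirements list:

```shell
# Sketch only: pytorch from the intel channel (to avoid the libomp.so
# breakage), then pip for the rest discovered by trial and error.
conda install -c intel pytorch
pip install uvicorn yacs fastapi shortuuid
```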