
Testing in a macbook pro #2

Open
alonsoir opened this issue Jul 1, 2023 · 2 comments

Comments


alonsoir commented Jul 1, 2023

Thanks for sharing the model. I have been able to test it on my MacBook Pro (i9, 32 GB of RAM). I notice that the CPU goes to 400% while inferring the answer, while the GPU stays at 0%. Is it possible to make the model use the GPU? It is a Radeon Pro Vega 20 with 4 GB.

@TouchstoneTheDev

The required package is available as sentence-transformers; note that the package name uses a hyphen instead of an underscore.
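
For illustration (a small sketch on my side, not something from this thread, showing the install name versus the import name):

# On the command line the distribution name uses a hyphen:
#   pip install sentence-transformers
# In Python the module name uses an underscore:
from sentence_transformers import SentenceTransformer

embeddings = SentenceTransformer("all-MiniLM-L6-v2")  # same embeddings model named in the logs below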
Currently the model is the quantized version of mpt-30b-chat, which is loaded with CTransformers. You can use the original Hugging Face LLM with this code:

import transformers
llm = transformers.AutoModelForCausalLM.from_pretrained(
  'mosaicml/mpt-30b-chat',
  trust_remote_code=True
)

You should replace lines 73 to 78 in question_answer_docs.py with the above code. It should work fine (I haven't tried it) on GPU, provided you have a GPU that can hold the 30B-parameter model.
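
If you do have such a GPU, a minimal sketch of moving the model onto it could look like this (the device selection and the float16 dtype are assumptions on my side, not something the repository currently does):

import torch
import transformers

# Pick an accelerator if one is available (MPS on macOS, CUDA on NVIDIA cards),
# otherwise fall back to the CPU.
if torch.backends.mps.is_available():
    device = torch.device("mps")
elif torch.cuda.is_available():
    device = torch.device("cuda")
else:
    device = torch.device("cpu")

llm = transformers.AutoModelForCausalLM.from_pretrained(
    'mosaicml/mpt-30b-chat',
    trust_remote_code=True,
    torch_dtype=torch.float16,  # half precision to roughly halve the memory footprint (assumption)
)
llm.to(device)  # note: a 30B-parameter model will not fit in a 4 GB Radeon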

Refer to #1 (comment).

@alonsoir (Author)

Hi! I have tried your suggestion, but I am getting this error:

pip --version
pip 23.2.1 from /usr/local/lib/python3.11/site-packages/pip (python 3.11)
python3.11 -m pip install --upgrade pip
Requirement already satisfied: pip in /usr/local/lib/python3.11/site-packages (23.2.1)
python3.11 -m pip install einops
Requirement already satisfied: einops in /usr/local/lib/python3.11/site-packages (0.6.1)
make server
poetry run python question_answer_docs_server.py
embeddings_model_name is all-MiniLM-L6-v2
persist_directory is db
model_path is models/mpt-30b-chat.ggmlv0.q4_1.bin
target_source_chunks is 4
Initializing model for the first time...
Loading model... models/mpt-30b-chat.ggmlv0.q4_1.bin
This modeling file requires the following packages that were not found in your environment: einops. Run pip install einops
Traceback (most recent call last):
File "/Users/aironman/git/private-chatbot-mpt30b-langchain/question_answer_docs_server.py", line 139, in
init()
File "/Users/aironman/git/private-chatbot-mpt30b-langchain/question_answer_docs_server.py", line 127, in init
llm=load_model(),
^^^^^^^^^^^^
File "/Users/aironman/git/private-chatbot-mpt30b-langchain/question_answer_docs_server.py", line 58, in load_model
llm = transformers.AutoModelForCausalLM.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/aironman/Library/Caches/pypoetry/virtualenvs/private-chatbot-mpt30b-aREpNEbB-py3.11/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 475, in from_pretrained
model_class = get_class_from_dynamic_module(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/aironman/Library/Caches/pypoetry/virtualenvs/private-chatbot-mpt30b-aREpNEbB-py3.11/lib/python3.11/site-packages/transformers/dynamic_module_utils.py", line 431, in get_class_from_dynamic_module
final_module = get_cached_module_file(
^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/aironman/Library/Caches/pypoetry/virtualenvs/private-chatbot-mpt30b-aREpNEbB-py3.11/lib/python3.11/site-packages/transformers/dynamic_module_utils.py", line 306, in get_cached_module_file
get_cached_module_file(
File "/Users/aironman/Library/Caches/pypoetry/virtualenvs/private-chatbot-mpt30b-aREpNEbB-py3.11/lib/python3.11/site-packages/transformers/dynamic_module_utils.py", line 268, in get_cached_module_file
modules_needed = check_imports(resolved_module_file)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/aironman/Library/Caches/pypoetry/virtualenvs/private-chatbot-mpt30b-aREpNEbB-py3.11/lib/python3.11/site-packages/transformers/dynamic_module_utils.py", line 151, in check_imports
raise ImportError(
ImportError: This modeling file requires the following packages that were not found in your environment: einops. Run pip install einops
make: *** [server] Error 1

I have tried to install einops, but, as you can see, I am using Poetry, so pip put it into the system Python 3.11 rather than into the Poetry virtualenv the script runs in.
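
In case it helps, one possible fix (an assumption on my side, not verified against this repo) is to install einops into the Poetry virtualenv rather than the system Python:

poetry add einops
poetry run python -c "import einops; print(einops.__version__)"  # confirm it resolves inside the virtualenv
make server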
