
KeyError: 'chatglm6b-text-generation is already registered in pipelines[chat]' #135

Open
Jingzhenzxz opened this issue Nov 16, 2023 · 1 comment
Labels
bug Something isn't working

Comments


Jingzhenzxz commented Nov 16, 2023

Hi all, after running

cd modelscope
python3 app.py

I got the error below. How can I fix it?

Traceback (most recent call last):
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/gradio/queueing.py", line 427, in call_prediction
    output = await route_utils.call_process_api(
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/gradio/route_utils.py", line 232, in call_process_api
    output = await app.get_blocks().process_api(
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/gradio/blocks.py", line 1525, in process_api
    result = await self.call_function(
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/gradio/blocks.py", line 1147, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/anyio/to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 807, in run
    result = context.run(func, *args)
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/gradio/utils.py", line 672, in wrapper
    response = f(*args, **kwargs)
  File "/opt/ai/Langchain-ChatGLM-Webui/LangChain-ChatGLM-Webui/modelscope/app.py", line 154, in predict
    resp = get_knowledge_based_answer(
  File "/opt/ai/Langchain-ChatGLM-Webui/LangChain-ChatGLM-Webui/modelscope/app.py", line 126, in get_knowledge_based_answer
    result = knowledge_chain({"query": query})
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/chains/base.py", line 310, in __call__
    raise e
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/chains/base.py", line 304, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/chains/retrieval_qa/base.py", line 139, in _call
    answer = self.combine_documents_chain.run(
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/chains/base.py", line 510, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/chains/base.py", line 310, in __call__
    raise e
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/chains/base.py", line 304, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/chains/combine_documents/base.py", line 122, in _call
    output, extra_return_dict = self.combine_docs(
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/chains/combine_documents/stuff.py", line 171, in combine_docs
    return self.llm_chain.predict(callbacks=callbacks, **inputs), {}
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/chains/llm.py", line 298, in predict
    return self(kwargs, callbacks=callbacks)[self.output_key]
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/chains/base.py", line 310, in __call__
    raise e
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/chains/base.py", line 304, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/chains/llm.py", line 108, in _call
    response = self.generate([inputs], run_manager=run_manager)
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/chains/llm.py", line 120, in generate
    return self.llm.generate_prompt(
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/llms/base.py", line 507, in generate_prompt
    return self.generate(prompt_strings, stop=stop, callbacks=callbacks, **kwargs)
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/llms/base.py", line 656, in generate
    output = self._generate_helper(
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/llms/base.py", line 544, in _generate_helper
    raise e
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/llms/base.py", line 531, in _generate_helper
    self._generate(
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/langchain/llms/base.py", line 1055, in _generate
    else self._call(prompt, stop=stop, **kwargs)
  File "/opt/ai/Langchain-ChatGLM-Webui/LangChain-ChatGLM-Webui/modelscope/chatglm_llm.py", line 55, in _call
    pipe = self.pipe()
  File "/opt/ai/Langchain-ChatGLM-Webui/LangChain-ChatGLM-Webui/modelscope/chatglm_llm.py", line 68, in pipe
    pipe = pipeline(task=Tasks.chat,
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/modelscope/pipelines/builder.py", line 164, in pipeline
    return build_pipeline(cfg, task_name=task)
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/modelscope/pipelines/builder.py", line 67, in build_pipeline
    return build_from_cfg(
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/modelscope/utils/registry.py", line 184, in build_from_cfg
    LazyImportModule.import_module(sig)
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/modelscope/utils/import_utils.py", line 463, in import_module
    importlib.import_module(module_name)
  File "/root/anaconda3/lib/python3.9/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 850, in exec_module
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/modelscope/pipelines/nlp/text_generation_pipeline.py", line 194, in <module>
    class ChatGLM6bTextGenerationPipeline(Pipeline):
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/modelscope/utils/registry.py", line 125, in _register
    self._register_module(
  File "/opt/ai/Langchain-ChatGLM-Webui/.env/lib/python3.9/site-packages/modelscope/utils/registry.py", line 75, in _register_module
    raise KeyError(f'{module_name} is already registered in '
KeyError: 'chatglm6b-text-generation is already registered in pipelines[chat]'

My environment: CentOS 7, Python 3.9, CUDA 11.8, torch 2.1.0
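For context on the final KeyError: this is the standard registry pattern, where a decorator maps a module name to a class under a task group and refuses duplicate names. A minimal sketch (not the actual modelscope code) of why importing `text_generation_pipeline.py` a second time, or having two modelscope installs on `sys.path`, triggers exactly this error:

```python
# Minimal sketch of a pipeline registry. Registering the same module
# name twice under one task group raises KeyError -- which is what
# happens when the module defining ChatGLM6bTextGenerationPipeline
# gets imported/executed more than once.
class Registry:
    def __init__(self, name):
        self.name = name
        self._modules = {}  # task group -> {module_name: cls}

    def register_module(self, group, module_name, cls):
        group_map = self._modules.setdefault(group, {})
        if module_name in group_map:
            raise KeyError(f'{module_name} is already registered in '
                           f'{self.name}[{group}]')
        group_map[module_name] = cls


pipelines = Registry('pipelines')
pipelines.register_module('chat', 'chatglm6b-text-generation', object)
try:
    # A second registration of the same name reproduces the error shape
    # seen at the bottom of the traceback above.
    pipelines.register_module('chat', 'chatglm6b-text-generation', object)
except KeyError as e:
    print(e)
```

If this is the cause, the usual remedies are a clean reinstall of modelscope in a fresh virtualenv, or checking that the pipeline module is not shadowed/imported twice via duplicate copies on the import path.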

thomas-yanxin added the bug label on Jan 8, 2024

qy-70 commented Feb 26, 2024

(chatglm1) D:\project\python-xm\LangChain-ChatGLM-Webui\modelscope>python app.py
2024-02-26 15:18:55,166 - modelscope - INFO - PyTorch version 2.2.1 Found.
2024-02-26 15:18:55,167 - modelscope - INFO - Loading ast index from C:\Users\qyy\.cache\modelscope\ast_indexer
2024-02-26 15:18:55,348 - modelscope - INFO - Loading done! Current index file version is 1.12.0, with md5 975d57f028ccfb59f897d1a43145be7b and a total number of 964 components indexed
D:\Anaconda\Anaconda\envs\chatglm1\Lib\site-packages\langchain\document_loaders\__init__.py:36: LangChainDeprecationWarning: Importing document loaders from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

from langchain_community.document_loaders import UnstructuredFileLoader.

To install langchain-community run pip install -U langchain-community.
warnings.warn(
D:\Anaconda\Anaconda\envs\chatglm1\Lib\site-packages\langchain\vectorstores\__init__.py:35: LangChainDeprecationWarning: Importing vector stores from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

from langchain_community.vectorstores import FAISS.

To install langchain-community run pip install -U langchain-community.
warnings.warn(
D:\Anaconda\Anaconda\envs\chatglm1\Lib\site-packages\pydantic\_internal\_fields.py:151: UserWarning: Field "model_id" has conflict with protected namespace "model_".

You may be able to resolve this warning by setting model_config['protected_namespaces'] = ().
warnings.warn(
Traceback (most recent call last):
  File "D:\project\python-xm\LangChain-ChatGLM-Webui\modelscope\app.py", line 223, in <module>
    chatbot = gr.Chatbot(label='ChatLLM').scale(height=400)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: 'NoneType' object is not callable
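This second error is unrelated to the original KeyError: in recent Gradio versions, `scale` is a constructor parameter stored as an attribute (defaulting to `None`), not a method, so `gr.Chatbot(...).scale(height=400)` calls `None`. A hypothetical mimic (not Gradio's real class) of the attribute-versus-method confusion, and the likely fix of passing `height` directly to the constructor:

```python
# Hypothetical stand-in for how newer Gradio stores `scale` and `height`
# as constructor parameters rather than exposing a .scale() method.
class Chatbot:
    def __init__(self, label=None, height=None, scale=None):
        self.label = label
        self.height = height  # sizing is now a constructor argument
        self.scale = scale    # plain attribute, defaults to None


broken = Chatbot(label='ChatLLM')
# broken.scale is None, so broken.scale(height=400) would raise
# TypeError: 'NoneType' object is not callable -- the error above.

fixed = Chatbot(label='ChatLLM', height=400)  # pass height directly
```

So the line in `app.py` would become `gr.Chatbot(label='ChatLLM', height=400)` on current Gradio; on the old 3.x API the repo was written for, the equivalent was `.style(height=400)`. Pinning the Gradio version from the project's requirements is the safer route.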
