
ollama._types.ResponseError: Unsupported type: 'bool' #157

Closed
guitmonk-1290 opened this issue May 15, 2024 · 2 comments

Comments

@guitmonk-1290

I have this code for generating a SQL query based on my SQL database:

import ollama  # assumes the ollama Python package is installed

def run(self, query: str):
    self.vector_index_dict = self.index_all_tables()

    # Retrieve the table schemas most relevant to the user query
    table_schema_objs = self.obj_retriever.retrieve(query)
    for table in table_schema_objs:
        print(f"[matched_table]: {table.table_name}")

    context_str = self.get_table_context_and_rows_str(query_str=query, table_schema_objs=table_schema_objs)
    context_str += f"Write a SQL query for this user query and nothing else: '{query}'"
    print(f"[LLM_CONTEXT_STR]: {context_str}")

    response = ollama.chat(
        model='stablelm-zephyr',
        messages=[{'role': 'user', 'content': context_str}],
        stream=False,
        keep_alive=True
    )

However, I am encountering this error:

response = ollama.chat(
               ^^^^^^^^^^^^
  File "C:\Users\aditya\source\repos\Spectra Tech\Chatbot\flask\.venv\Lib\site-packages\ollama\_client.py", line 177, in chat
    return self._request_stream(
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\aditya\source\repos\Spectra Tech\Chatbot\flask\.venv\Lib\site-packages\ollama\_client.py", line 97, in _request_stream
    return self._stream(*args, **kwargs) if stream else self._request(*args, **kwargs).json()
                                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\aditya\source\repos\Spectra Tech\Chatbot\flask\.venv\Lib\site-packages\ollama\_client.py", line 73, in _request
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: Unsupported type: 'bool'

There was no such error before I updated ollama, so I'm not sure what the problem is.

Thanks.


ruaanviljoen commented Jun 2, 2024


I had the same issue. The problem is your keep_alive argument: if you look at the chat function definition, bool is not among the supported types. keep_alive now configures how long the model stays loaded after the request; it isn't a flag (anymore?), which may be why it worked before the update. Valid values appear to be a float or a duration string for now; see faq.md here: https://github.com/ollama/ollama/blob/d4a86102fd5f84cca50757af00296606ac191890/docs/faq.md?plain=1#L237

I updated mine and it works just fine now.
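For anyone landing here, a minimal sketch of the corrected arguments, assuming the ollama Python package is installed; the snippet only builds the argument dict (the actual call is commented out) so it runs without an Ollama server, and the prompt text is a placeholder:

```python
# keep_alive takes a duration, not a bool: a duration string like '5m',
# or a number of seconds.
chat_kwargs = dict(
    model='stablelm-zephyr',
    messages=[{'role': 'user', 'content': 'Write a SQL query for ...'}],
    stream=False,
    keep_alive='5m',  # keep the model loaded for five minutes after the call
)

# bool is a subclass of int in Python, so a stray True can slip past a
# naive numeric check; verify the value explicitly.
assert not isinstance(chat_kwargs['keep_alive'], bool)
assert isinstance(chat_kwargs['keep_alive'], (str, int, float))

# With a local Ollama server running, the call would then be:
# response = ollama.chat(**chat_kwargs)
```

A negative float like -1 (or the string '-1m') is documented in the FAQ as keeping the model loaded indefinitely, which is roughly what keep_alive=True presumably intended.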

@guitmonk-1290
Author

@ruaanviljoen ahh, I see. So it's a duration setting now instead of a flag. Thanks a bunch.
