-
Hi, everyone. I set up the AI API settings and started using the chat. When I send a simple request like "How many columns are in the table user?", it answers correctly. But when I ask "How many rows are in the table user?", it doesn't work as expected: the wrenai-service raises a `KeyError: 'table_contents'`:

```
File "/src/pipelines/retrieval/retrieval.py", line 314, in construct_retrieval_results
    reformated_json[table["table_name"]] = table["table_contents"]
                                           ~~~~~^^^^^^^^^^^^^^^^^^
KeyError: 'table_contents'
```

At first I wondered whether the machine had run out of memory, so I redeployed the service on a machine with 64 GB of RAM and about 30 GB of memory free. The same error occurred. The table `user` has only 700 rows in total and is 14 KB in size. So here I am, looking for some help and advice. The AI service image version I pulled is 0.15.7.
-
@tatuke, what LLM are you using? I suspect the issue is that the LLM is not correctly generating JSON output.
-
I have found, and can confirm, that this is an error caused by a difference in the return format of the LLM API. Waiting for a more generic LLM protocol or mapping?
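Until that lands, one workaround is to access the key defensively so a malformed LLM response degrades gracefully instead of crashing. This is a minimal sketch, not the actual wren-ai-service code: the function shape and the empty-string fallback are assumptions, with only the field names (`table_name`, `table_contents`) taken from the traceback.

```python
def construct_retrieval_results(tables: list[dict]) -> dict:
    """Map table names to table contents, tolerating LLM responses
    that omit the 'table_contents' key (the cause of the KeyError)."""
    reformatted_json = {}
    for table in tables:
        name = table.get("table_name")
        if name is None:
            continue  # skip entries with no table name at all
        # dict.get() returns the fallback instead of raising KeyError
        # when the LLM's JSON lacks 'table_contents'
        reformatted_json[name] = table.get("table_contents", "")
    return reformatted_json
```

Whether an empty string, a `None`, or a logged warning is the right fallback depends on how the downstream pipeline consumes these results.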