Error with LM Studio local models
Hi there,
Following the tutorial in this video, I'm connecting the model to Agent Zero (A0) via LM Studio, but I keep getting the same error with different models.
Does anyone know the solution?
Error:

litellm.exceptions.MidStreamFallbackError: litellm.ServiceUnavailableError: litellm.MidStreamFallbackError: litellm.APIConnectionError: APIConnectionError: OpenAIException - Cannot truncate prompt with n_keep (9041) >= n_ctx (4096)
Original exception: APIConnectionError: litellm.APIConnectionError: APIConnectionError: OpenAIException - Cannot truncate prompt with n_keep (9041) >= n_ctx (4096)

Traceback (most recent call last):
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/litellm_core_utils/streaming_handler.py", line 1812, in __anext__
    async for chunk in self.completion_stream:
  File "/opt/venv-a0/lib/python3.12/site-packages/openai/_streaming.py", line 147, in __aiter__
    async for item in self._iterator:
  File "/opt/venv-a0/lib/python3.12/site-packages/openai/_streaming.py", line 193, in __stream__
    raise APIError(
openai.APIError: Cannot truncate prompt with n_keep (9041) >= n_ctx (4096)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/litellm_core_utils/streaming_handler.py", line 1996, in __anext__
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2328, in exception_type
    raise e  # it's already mapped
    ^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 569, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: APIConnectionError: OpenAIException - Cannot truncate prompt with n_keep (9041) >= n_ctx (4096)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/a0/agent.py", line 454, in monologue
    agent_response, _reasoning = await self.call_chat_model(
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/a0/agent.py", line 808, in call_chat_model
    response, reasoning = await model.unified_call(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/a0/models.py", line 511, in unified_call
    async for chunk in _completion:  # type: ignore
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/litellm_core_utils/streaming_handler.py", line 2006, in __anext__
    raise MidStreamFallbackError(
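For context, the key part of the error is `n_keep (9041) >= n_ctx (4096)`: the prompt Agent Zero sends needs roughly 9041 tokens kept, but the model was loaded in LM Studio with only a 4096-token context window, so the server refuses to truncate it. A minimal sketch of the guard implied by the message (the function name and the 16384 value are my own illustration, not taken from LM Studio or Agent Zero):

```python
def can_truncate(n_keep: int, n_ctx: int) -> bool:
    """Sketch of the check behind the error message: the server can only
    truncate a prompt when the tokens it must keep (n_keep) still fit
    inside the loaded context window (n_ctx)."""
    return n_keep < n_ctx

# Values from the traceback: 9041 kept tokens vs. a 4096-token context.
assert not can_truncate(9041, 4096)   # this is the failing case above

# Loading the model with a larger context length in LM Studio's model
# settings (e.g. 16384) makes the same prompt fit.
assert can_truncate(9041, 16384)
```

In other words, the error is about the server-side context length the model was loaded with, not about which model you pick, which would explain why it repeats across different models.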
Hic Hic