Activity
[Contribution activity heatmap, Apr–Mar]

Memberships

Agent Zero

2.4k members • Free

5 contributions to Agent Zero
Why is this always coming up in v0.98?
litellm.exceptions.MidStreamFallbackError: litellm.ServiceUnavailableError: litellm.MidStreamFallbackError: litellm.APIConnectionError: APIConnectionError: OpenAIException - Context size has been exceeded. Original exception: APIConnectionError: litellm.APIConnectionError: APIConnectionError: OpenAIException - Context size has been exceeded.

Traceback (most recent call last):
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/litellm_core_utils/streaming_handler.py", line 1812, in __anext__
    async for chunk in self.completion_stream:
  File "/opt/venv-a0/lib/python3.12/site-packages/openai/_streaming.py", line 147, in __aiter__
    async for item in self._iterator:
  File "/opt/venv-a0/lib/python3.12/site-packages/openai/_streaming.py", line 193, in __stream__
    raise APIError(
openai.APIError: Context size has been exceeded.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/litellm_core_utils/streaming_handler.py", line 1996, in __anext__
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2328, in exception_type
    raise e  # it's already mapped
    ^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 569, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: APIConnectionError: OpenAIException - Context size has been exceeded.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/a0/agent.py", line 454, in monologue
    agent_response, _reasoning = await self.call_chat_model(
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/a0/agent.py", line 808, in call_chat_model
    response, reasoning = await model.unified_call(
Memory issue for 0.98
litellm.exceptions.BadRequestError: litellm.BadRequestError: Lm_studioException - Error code: 400 - {'error': 'Context size has been exceeded.'}

Traceback (most recent call last):
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 823, in acompletion
    headers, response = await self.make_openai_chat_completion_request(
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/litellm_core_utils/logging_utils.py", line 190, in async_wrapper
    result = await func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 454, in make_openai_chat_completion_request
    raise e
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 436, in make_openai_chat_completion_request
    await openai_aclient.chat.completions.with_raw_response.create(
  File "/opt/venv-a0/lib/python3.12/site-packages/openai/_legacy_response.py", line 381, in wrapped
    return cast(LegacyAPIResponse[R], await func(*args, **kwargs))
                                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 2589, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/openai/_base_client.py", line 1794, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv-a0/lib/python3.12/site-packages/openai/_base_client.py", line 1594, in request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': 'Context size has been exceeded.'}
0 likes • Feb 13
Tell me how to fix this, or does a dev need to look at it?
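Both tracebacks above end in the same root cause: the OpenAI-compatible backend (LM Studio here) rejects a prompt longer than the context window the model was loaded with. Besides raising the context length in LM Studio's model settings, a generic workaround is to trim the oldest conversation turns before each call. A minimal sketch, independent of Agent Zero's internals, using a crude 4-characters-per-token estimate (the function names are illustrative, not from Agent Zero):

```python
# Sketch: drop the oldest non-system messages until the estimated
# prompt size fits the model's context window. The 4-chars-per-token
# ratio is a rough heuristic, not an exact tokenizer count.

def estimate_tokens(messages):
    """Very rough token estimate: ~4 characters per token."""
    return sum(len(m["content"]) for m in messages) // 4

def trim_history(messages, max_context_tokens, reserve_for_reply=1024):
    """Drop the oldest non-system turns until the prompt fits the budget."""
    budget = max_context_tokens - reserve_for_reply
    trimmed = list(messages)
    while estimate_tokens(trimmed) > budget and len(trimmed) > 1:
        # Keep the system prompt (index 0); drop the oldest turn after it.
        del trimmed[1]
    return trimmed

history = [
    {"role": "system", "content": "You are a helpful agent."},
    {"role": "user", "content": "x" * 40000},  # oversized old turn
    {"role": "user", "content": "Summarize the community call."},
]
fitted = trim_history(history, max_context_tokens=4096)
print(len(fitted))  # the oversized old turn has been dropped
```

The same idea is why simply retrying does not help: the history only grows, so the request must shrink (or the loaded context window must grow) before the 400 error goes away.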
Agent Zero v0.9.8 is LIVE! (Claude Skills, Git Projects, New UI) 🚀
Hey everyone,

The testing phase is over. Agent Zero v0.9.8 is now live on the main branch and available on Docker!

This is a massive update that changes how you interact with the agent and helps you get more from it through Skills and Git projects. We also updated the documentation stack with better guides and tutorials.

Watch the full breakdown video, and let us know what you think of the new A0 in the comments. If you like the project, subscribe to our YouTube channel to help us grow!

See you in our Community Call to celebrate the release together, in the Agent Zero Skool at 4 PM UTC. Save the link: https://www.skool.com/live/DlyvNKHbyWw — join us today.
1 like • Feb 11
Exciting! Off to test it. Please do a video on MCP usage with Agent Zero.
Do something else! loop after Message misformat?
Hi, I am trialing GLM 4.6 on Venice beta. It was working better than anything else yesterday, but today it seems to be caught in a loop of some sort when I ask it to summarize the community call. See attached video. Anyone have any hacks or workarounds for this situation? Thanks a lot! :D

A0: Message misformat, no valid tool request found.
network_intelligence
A0: Generating...
You have sent the same message again. You have to do something else!
network_intelligence
A0: Generating...
You have sent the same message again. You have to do something else!
0 likes • Jan 20
I have the same thing: "You have sent the same message again. You have to do something else!" What do we do to overcome this issue? I use LM Studio with gpt-oss-20b. Is there some other configuration missing here? Please let me know.
You have sent the same message again. You have to do something else!
Why does the message below come up after a few tries: "You have sent the same message again. You have to do something else!"
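The warning in these threads is the framework refusing to forward a verbatim repeat: when a small local model fails to emit a valid tool request ("Message misformat, no valid tool request found"), it often regenerates the exact same text, and the repeat guard fires each time. The guard's behavior can be sketched roughly like this (an illustrative sketch, not Agent Zero's actual implementation):

```python
# Illustrative sketch of a repeated-message guard, similar in spirit
# to the warning quoted above. Not Agent Zero's actual code.

REPEAT_WARNING = (
    "You have sent the same message again. You have to do something else!"
)

def check_repeat(last_message, new_message):
    """Return the warning if the agent repeats its last message verbatim."""
    if last_message is not None and new_message.strip() == last_message.strip():
        return REPEAT_WARNING
    return None

print(check_repeat("use network_intelligence", "use network_intelligence"))
print(check_repeat("use network_intelligence", "try a different tool"))
```

Because the guard only interrupts the loop rather than fixing its cause, the practical fixes are on the model side: a model that reliably emits the expected tool-call format, or prompt/format settings in LM Studio that stop the malformed output.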
Sanath Weera
Level 2 • 13 points to level up
@sanath-weera-8603
Agent Zero enthusiast

Active 31d ago
Joined Jan 20, 2026