Hello everyone,

I'm having an issue with Agent Zero. My LLM provider is OpenRouter and I still have credits available on my account, but I keep getting an error saying I exceeded my OpenAI quota. It looks like Agent Zero may be calling the OpenAI API directly instead of routing all requests through OpenRouter.

My current configuration:

- Main model: OpenRouter / anthropic/claude-sonnet-4.6
- Utility model: OpenRouter / cognitivecomputations/dolphin-mistral-24b-venice-edition:free

The full error:

```
litellm.exceptions.RateLimitError: litellm.RateLimitError: RateLimitError: OpenAIException - You exceeded your current quota, please check your plan and billing details. For more information on this error, read the docs: https://platform.openai.com/docs/guides/error-codes/api-errors.
Traceback (most recent call last):
  File "/opt/venv-a0/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 991, in async_streaming
    headers, response = await self.make_openai_chat_completion_request(
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
```

Has anyone faced this before or knows how to fix it? Thanks in advance.
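In case it helps narrow things down: as far as I understand the litellm docs, a model is only routed to OpenRouter when its name carries the `openrouter/` prefix and `OPENROUTER_API_KEY` is set; without the prefix, an `anthropic/...` or other model string can fall through to a different provider (which might explain the OpenAI quota error). This is my guess at what a working direct call should look like, not something I've verified against Agent Zero's internals:

```python
import os

# Assumption: litellm routes to OpenRouter only when the model string
# starts with "openrouter/". The model name below is the one from my
# Agent Zero config.
model = "openrouter/anthropic/claude-sonnet-4.6"
print(model.startswith("openrouter/"))  # should be True for OpenRouter routing

# Only attempt a real request if an OpenRouter key is configured;
# OPENROUTER_API_KEY is the environment variable litellm documents
# for the OpenRouter provider.
if os.environ.get("OPENROUTER_API_KEY"):
    import litellm

    resp = litellm.completion(
        model=model,
        messages=[{"role": "user", "content": "ping"}],
    )
    print(resp.choices[0].message.content)
```

If a direct call like this works but Agent Zero still hits OpenAI, that would point at how Agent Zero builds the model name rather than at my OpenRouter account.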