I stopped hitting Claude's usage limits - 10 things I changed
Most people blame Claude for its strict limits. I blamed Claude too. Then I realized that Claude doesn't count messages - it counts tokens. Use tokens wisely and the limits stop being a problem, but not everyone knows how, and they end up burning through tokens (and money) as a result. I dug into this and put together a list of habits that will save you a ton of tokens.

1. Edit your prompt - don't send a follow-up

When Claude doesn't get your meaning right, you might feel tempted to send:

1/ "No, I meant [your message]"
2/ "Ugh, that's not what I wanted [your message]"

and so on. Don't do that! Every follow-up is added to the conversation history, and Claude re-reads ALL of it on every turn - burning tokens on context that didn't even help.

Token cost per message = all previous messages + your new one.
Total = S × N(N+1) / 2 (S = avg tokens per exchange, N = message count)

At ~500 tokens per exchange:
5 messages: 7.5K tokens
10 messages: 27.5K tokens
20 messages: 105K tokens
30 messages: 232K tokens

Message 30 costs ~30x more than message 1.

Instead: click Edit on your original message → fix it → regenerate. The old exchange gets replaced, not stacked.

2. Start a fresh chat every 15-20 messages

In the previous section, I showed how token costs grow with every message. Ideally, you should start a new chat every 15-20 messages.

Now imagine a chat with 100+ messages. At ~500 tokens per exchange, that's over 2.5 million tokens burned - most of it just re-reading old history. One developer tracked his usage and found that 98.5% of his tokens went to re-reading history; only 1.5% went toward actually producing output.

When a chat gets long → ask Claude to summarize everything → copy the summary → open a new chat → paste it as the first message.

3. Batch your questions into one message

Many people believe that splitting questions into separate messages leads to better results. Almost always, the opposite is true. Three separate prompts = three context loads.
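The token arithmetic above is easy to check yourself. Here's a minimal sketch: message k re-sends all k-1 previous exchanges plus the new one, so a conversation of N messages totals S × N(N+1)/2 tokens (the ~500 tokens per exchange is the article's rough estimate, not a measured figure):

```python
def total_tokens(s: int, n: int) -> int:
    # Message k costs k*s tokens (k-1 re-read exchanges + 1 new one),
    # so the running total is s * n(n+1)/2.
    return s * n * (n + 1) // 2

S = 500  # avg tokens per exchange (article's estimate)
for n in (5, 10, 20, 30, 100):
    total = total_tokens(S, n)
    new = S * n                # tokens that are genuinely new content
    reread = 1 - new / total   # share burned on re-reading history
    print(f"{n:>3} messages: {total:>9,} tokens ({reread:.1%} re-read)")
```

At 100 messages this prints roughly 2.5 million tokens with about 98% of them spent re-reading history, which lines up with the developer's tracking mentioned above.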