Batching Lots Of Emails: Avoiding cloud-hosted n8n memory limitations
I’m getting my feet wet with RAG and vector stores, and wanted to create a repository of emails that I’ve written myself, so other agents can draft emails that sound more like me. I have a workflow for grabbing a ton of sent emails, stripping them down to just what I’ve actually written (removing the quoted “On Friday, Feb 21st at 2:00am…” text, etc.), sorting them into emails I’ve sent to my team (an “internal voice”) and emails I’ve sent to clients and agency partners (an “external voice”), and finally dropping that text into two different tables in Supabase.
The problem is that I can only download about 80 emails before I hit a memory limit on my cloud-hosted n8n instance. Batching the job seems like the obvious solution, but a Loop Over Items node requires all emails to be downloaded first before they’re split into batches.
I haven’t been able to work out how to get the Gmail node to download only the first batch, then the next, then the next, and so on. The GPTs I’ve asked seem to be working from an older version of the n8n Gmail node where you could set batch sizes explicitly, but version 1.73.4 doesn’t have those options.
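For context, the underlying Gmail REST API does support cursor-style pagination (users.messages.list returns a nextPageToken you pass back in), so one possible workaround is to call the API directly, e.g. from an HTTP Request node, and manage the token yourself so only one batch is in memory at a time. A minimal sketch of that pattern in Python, where fetch_page is a hypothetical stand-in for the real API call:

```python
# Cursor-style pagination sketch. fetch_page is a stub standing in for a real
# Gmail API call (users.messages.list with maxResults and pageToken).

def fetch_page(all_ids, page_token=None, page_size=25):
    """Return one page of message IDs plus a token for the next page."""
    start = page_token or 0
    page = all_ids[start:start + page_size]
    next_token = start + page_size if start + page_size < len(all_ids) else None
    return page, next_token

def fetch_all_in_batches(all_ids, page_size=25):
    """Pull messages page by page instead of all at once."""
    token = None
    batches = []
    while True:
        page, token = fetch_page(all_ids, token, page_size)
        batches.append(page)  # in a real workflow: process + upsert, then discard
        if token is None:
            return batches

batches = fetch_all_in_batches(list(range(80)), page_size=25)
print([len(b) for b in batches])  # → [25, 25, 25, 5]
```

The point of the pattern is that each loop iteration only ever holds one page, which is what the built-in Gmail node (as I understand it) doesn’t let you do from the node settings alone.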
How would you approach this?
Michael Cummins
AI Automation Society
skool.com/ai-automation-society