I am just about ready to put my computer out in the rain. I've been at this stupid agentic workflow for what feels like an eternity.
n8n with Claude. It should have been a relatively simple JOB:
Client request:
They use Squarespace.
Ecom site.
Wanted ‘quick product data and import’.
Simple, right?
No.
The plan had six parts:
Create an automation to convert the client's various CSVs to match the Squarespace import template ✅ done.
Automate a Google Sheets repository for versioning and logical ordering of client data ✅ done.
Add automated markup parameters for their pricing vs. the supplier's ✅ done.
Create an agentic fuzzy scrape of all the product data and images from their supplier site ❌ NOT done 😡🤬😡. Grrrr.
Create an automation to enrich description content ✅ done.
Create an automation to push products to auto-populate the site and check for changes, stock levels, and errors ✅ done.
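For context, the CSV conversion in step one is fully deterministic, no LLM in the loop. Roughly this shape (the column names here are invented for illustration; the real mapping targets the actual Squarespace import template headers):

```python
import csv
import io

# Hypothetical mapping from the client's CSV headers to the Squarespace
# import-template headers -- swap in the real template columns.
COLUMN_MAP = {
    "Item Name": "Title",
    "Item Code": "SKU",
    "Cost": "Price",
    "Details": "Description",
}

def convert_csv(src_text: str) -> str:
    """Rewrite a client CSV into the Squarespace column layout."""
    reader = csv.DictReader(io.StringIO(src_text))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=list(COLUMN_MAP.values()))
    writer.writeheader()
    for row in reader:
        # Missing source columns become empty cells rather than crashing.
        writer.writerow({dst: row.get(src, "") for src, dst in COLUMN_MAP.items()})
    return out.getvalue()
```

That part worked fine as an n8n Code node. It's only the scraping step that won't behave.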
Claude, Firecrawl, and I have gone round and round with the n8n MCP and the Firecrawl MCP.
Claude can scrape it all just fine and plonk it in the sheet.
But as I explained, if either I or Claude is doing it manually, it's not automated, it uses a tonne of tokens, and it can't be replicated if we're not on hand.
The stupid supplier site is dynamic: a BigCommerce store serving products from a CDN. And somewhere between the Claude node and the Firecrawl parse-then-scrape nodes, it errors.
Every. Single. Time. Fifty million times, it feels like.
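What I'm tempted to try next: a lot of BigCommerce storefront themes embed schema.org Product data as JSON-LD right in the page HTML, so a plain HTTP fetch plus a deterministic parser (in an n8n Code node) might sidestep the Claude/Firecrawl loop entirely. A sketch, assuming the supplier's theme actually emits JSON-LD (worth a view-source check first):

```python
import json
import re

def extract_product_jsonld(html: str) -> list[dict]:
    """Pull schema.org Product objects out of JSON-LD <script> tags.

    If the supplier's BigCommerce theme embeds product data as JSON-LD
    (an assumption -- check the page source), this replaces an
    LLM-driven scrape with a zero-token parse.
    """
    products = []
    # Grab every <script type="application/ld+json"> ... </script> block.
    for m in re.finditer(
        r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>',
        html, re.DOTALL | re.IGNORECASE,
    ):
        try:
            data = json.loads(m.group(1))
        except json.JSONDecodeError:
            continue  # skip malformed blocks instead of crashing the run
        # A page may hold a single object or a list of them.
        for obj in data if isinstance(data, list) else [data]:
            if isinstance(obj, dict) and obj.get("@type") == "Product":
                products.append(obj)
    return products
```

No idea yet whether their theme exposes it, but if anyone has made this (or the BigCommerce Storefront API) work reliably from n8n, I'm all ears.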
Help.