Day 2 of the AIS Challenge! 🚀
Keeping the momentum going! Today was all about data extraction and exploring the synergy between Claude Code and Firecrawl.

- What I scraped: I extracted structured data from the Daily Remote website for job postings matching the keyword "AI Remote Jobs".
- Key takeaway: I was impressed by how intelligently Claude Code selected the right Firecrawl tool for the job. It didn't just guess; it analyzed the page structure first and chose the extraction method that would minimize "noise" and return the cleanest JSON before converting it to my CSV.
- Use case idea: This is a game-changer for competitor analysis and SEO research. I could use it to scrape trending topics or service offerings from top-tier agencies to keep my own business strategies and content ahead of the curve.

Getting the MCP server connected and seeing that first scrape run successfully feels like a massive win. Ready for Day 3! 🛠️ #AISChallenge
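The JSON-to-CSV step above can be sketched in plain Python. This is a minimal sketch, assuming the Firecrawl extraction hands back a flat list of JSON records; the field names (`title`, `company`, `location`) and sample data here are made-up illustrations, not the actual scrape output.

```python
import csv
import io
import json

# Hypothetical sample of the structured JSON an extraction might return
# for job postings (field names and values are assumptions).
extracted = json.loads("""[
  {"title": "AI Engineer", "company": "Acme", "location": "Remote"},
  {"title": "ML Ops Lead", "company": "Globex", "location": "Remote"}
]""")

def records_to_csv(records):
    """Flatten a list of uniform JSON records into CSV text."""
    if not records:
        return ""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

csv_text = records_to_csv(extracted)
print(csv_text.splitlines()[0])  # title,company,location
```

The nice part is that once the extraction returns clean JSON, the CSV conversion is a few lines of standard library code.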
Welcome! Introduce yourself + share a career goal you have 🎉
Let's get to know each other! Comment below sharing where you are in the world, a career goal you have, and something you like to do for fun. 😊
So last month was my best month yet.
I cleared 7k from all my work. But I still have to start at zero this month (I do have one retainer, but still). This is a major problem: if I need to find new clients every single month, I'm kind of cooked, because my earnings will fluctuate from month to month.

I know this is not a real problem per se. But relationship building takes time, and trust takes time. In most cases you'd rather keep building a relationship with the same client for years. The opposite of this is a lot of one-offs, which don't grow the business.

So to mitigate this, this month I'll go over all my past work and figure out what else I can offer the clients I've worked with so far. The fact of the matter is that it's far easier to upsell an existing client than to sell to a new one. But you need to be savvy about how you do it: you need to know what other problems they have, then figure out how you can solve them.

I could also think about what problems will come up once I've solved one for them. If I sell a WhatsApp lead gen machine, what's the next logical problem? Follow-ups. So I could upsell a follow-up system so the business can convert more leads.

I guess these are the things you're not thinking about while you're in it. But it sure helps to think about them now and then.
You don't realize this until it's too late.
Here's the most annoying thing I had to deal with in n8n: scraping data. And I'm not talking about 10 items or even 100. I'm talking about scraping 33,000 US zip codes.

When I first received this project, I thought, easy peasy: use Apify, connect it to n8n, and start scraping, right? How wrong I was.

To set the scene: I built an entire system of 20 flows to scrape the data, clean it, process it, and deliver it to the client by email. And it worked while I was scraping at low volume. But once we increased it? Well, take a look at this.

If you didn't know, each node in n8n holds all of its data until the entire flow is done. So when I brought in 1,000 rows of zip codes, every single node held 1,000 items in memory. And it got worse: each zip code could return anywhere from 10 to 1,000 results. We were not holding back at this point, because we were scraping every detail of every single business. At times the first flow was holding 100,000 items in every single node, and we ended up with so much data that my client's entire n8n instance would tank.

Not knowing any better, I figured: let's lower the volume until it works. We kept lowering it, and I redesigned the entire system until we could run 100 items per run. Which worked fine, until he decided he wanted to add another scraper for Instagram. I'll spare you the details, but long story short, running two massive scrapers was not a good idea, and we could only run one full system at a time.

After some deep digging into the issue, I realized that n8n is simply not built for large-scale scraping. Even upgrading to the best server available would have made no difference, and queuing with Redis would not have helped either; I had already built my own queuing system inside his cloud.
The lesson I learned: use n8n for simple things, because it was not meant to handle large amounts of data. Think of n8n like Zapier or Airtable. You wouldn't try to scrape data with Zapier. So the better option is to use code, something like Python for example.
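To make the contrast concrete, here is a minimal Python sketch of the pattern that avoids the n8n memory problem: stream the zip codes, scrape in small batches, and write each batch out immediately instead of holding everything in memory at once. The `scrape_batch` body is a placeholder; the real scraper call (Apify or otherwise) and the file layout are assumptions.

```python
import csv

def iter_zip_codes(path):
    """Stream zip codes one at a time instead of loading all 33,000."""
    with open(path, newline="") as f:
        for row in csv.reader(f):
            yield row[0]

def batched(iterable, size):
    """Yield lists of at most `size` items from any iterable."""
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

def scrape_batch(zips):
    # Placeholder for the real scraper call (Apify, requests, etc.).
    return [{"zip": z, "businesses": []} for z in zips]

def run(path, batch_size=100, sink=print):
    for batch in batched(iter_zip_codes(path), batch_size):
        # Write each batch out immediately so memory stays flat,
        # instead of accumulating 100k items the way an n8n node does.
        for result in scrape_batch(batch):
            sink(result)
```

Because nothing downstream holds the full dataset, peak memory is bounded by `batch_size` no matter whether you process 100 zip codes or 33,000.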
A new business model?
I saw a video the other day and it kind of broke my brain.

In 2026, AI agents don't browse websites. They call APIs. Cursor, Claude, Lovable, every agent framework: they all resolve to the same thing, API calls. So the businesses that win this shift are the APIs that agents keep calling. Pretty SaaS dashboards don't matter; agents can't click buttons.

A few examples that stuck with me:
→ Screenshot One: a solo-founder API that takes screenshots. Tens of thousands in MRR from doing one thing well.
→ Postiz: an open-source social media API. $60K/month. Just an API.
→ Resend: an email API. Integrated into thousands of codebases. Sticky as hell.

The argument is that API stickiness beats SaaS stickiness. SaaS users churn when a prettier tool comes along; APIs live inside someone's codebase. Ripping them out means rewriting code, retesting, and redeploying, and most people don't bother.

And the kicker: every AI tool I love (Cursor, Claude Code, Lovable) is just calling APIs underneath. The actual money is at the bottom of that stack.

Made me think. Maybe the next wave of solo-founder businesses lives in the boring infrastructure layer. Just one endpoint. One job. Done well.

Honestly, the more I sit with it, the more it tracks. Every tool we use is calling something else. The "calling" is the business.

So what are your thoughts? Do you think APIs are actually about to take over SaaS in 2026, or is this just another AI hype wave?
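For a sense of how small "one endpoint, one job" really is, here is a sketch using only the Python standard library. The job itself (turning a title into a URL slug) is a made-up illustration; a real micro-API would swap in whatever its one job is.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle_job(params):
    """The single job: a hypothetical example that slugifies a title."""
    title = params.get("title", "")
    return {"slug": "-".join(title.lower().split())}

class OneEndpoint(BaseHTTPRequestHandler):
    """A whole 'product': one POST endpoint that does one job."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        params = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps(handle_job(params)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To run: HTTPServer(("127.0.0.1", 8000), OneEndpoint).serve_forever()
```

An agent (or any codebase) would just POST JSON at it, which is exactly why ripping it out later means touching code.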
AI Automation Society
skool.com/ai-automation-society
Learn to get paid for AI solutions, regardless of your background.