
Memberships

NextGen AI

26.6k members • Free

Online Business Friends

91.7k members • Free

The Transurfing Skool

3.8k members • Free

Chase AI+

1.1k members • $98/month

🏛️ Coaching Academy

3k members • Free

Amazon FBA | Start & Grow

910 members • Free

Global Digital Nomads

4.5k members • Free

AI Growth Lab

58 members • Free

AI Video Bootcamp

18.6k members • $9/month

4 contributions to The AI Advantage
Real Estate Lead Scraper for property I'm selling
Hey everyone, I have a property I'm selling in the Nashville area and want to build an automation to scrape for potential buyers and then do a couple of things:
1. Drop those leads into a spreadsheet
2. Do an automated email & SMS follow-up
3. Book qualified leads into a call
Is there an automation here I can use? If not, any ideas on how to do this? The biggest challenge is figuring out which sources the automation should scrape for leads. Thanks!
1 like • 51m
@Liliant Cannon thanks for the help! So… how do I do that?
1 like • 34m
@Chase Aldridge They're unimproved parcels, which is just land with no buildings, water, electric, etc. So the buyer would either be an investor who wants to develop it, or a homeowner who wants an empty plot to build on.
How Claude Code compressed 2 weeks into 2 hours
I build trading systems and need a ton of backtesting data: 10 years each for forex, options, and stocks/ETFs. The fastest way to grab it before was to go to massive.com and download the flat files. And that was the problem, because it meant downloading compressed files for each day over the last 10 years. That's about 6,000 files. Each file contains the price data for every symbol; e.g., for Apr 8, 2026, that .csv file has 1-minute price data for over 5,000 stocks & ETFs. That's over 3 million rows. Which means that to get only the symbols I need, I'd have to go through each file and copy-paste. Instead, I had Claude Code write me a script that:
1. Logged into my account
2. Downloaded all the files
3. Put them in folders by asset/year/month
4. Extracted only the symbols I wanted and put those CSVs into their own folders
5. Combined the daily CSVs into monthly files
6. Resampled them into the timeframes I needed
7. For options, calculated a bunch of other stats I needed and appended them to the CSVs
All in, it took about 2 hours to build the scripts. I just run them one at a time and forget about it.
0 likes • 19d
@AI Advantage Team It got most of it right, but I had to iterate a bit. I've been working with AI for a while, so I've gotten pretty good at using it for stuff like this. The biggest challenge is that it was essentially acting as an agent to log in to my account. It first tried via username and password, but after about 4-5 tries it figured out that it needed my API key. Once it had that, it downloaded everything in about an hour. We're talking about 150-200GB of data. The old way involved using API calls to grab individual symbols, and just grabbing one symbol for the same time range took more time than grabbing 5,000 using this method.
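The combine-and-resample steps described in the post can be sketched in Python with pandas. This is a minimal sketch under assumptions, not the author's actual script: it assumes daily CSVs of 1-minute bars with `timestamp`, `symbol`, and OHLCV columns (all column and function names here are hypothetical).

```python
import pandas as pd

def resample_ohlcv(df: pd.DataFrame, rule: str = "5min") -> pd.DataFrame:
    """Resample 1-minute OHLCV bars to a coarser timeframe.

    Assumes a DatetimeIndex and columns: open, high, low, close, volume.
    """
    agg = {
        "open": "first",   # first trade price in the window
        "high": "max",     # highest high across the window
        "low": "min",      # lowest low across the window
        "close": "last",   # last trade price in the window
        "volume": "sum",   # total volume across the window
    }
    # Drop windows with no bars (e.g. overnight gaps).
    return df.resample(rule).agg(agg).dropna(subset=["open"])

def combine_and_resample(paths, symbol, rule="5min"):
    """Merge a month of daily files, keep one symbol, resample."""
    frames = []
    for p in paths:
        day = pd.read_csv(p, parse_dates=["timestamp"])
        frames.append(day[day["symbol"] == symbol])
    monthly = pd.concat(frames).set_index("timestamp").sort_index()
    return resample_ohlcv(monthly, rule)
```

The download/login and options-math steps from the post aren't shown here; this only illustrates the file-merging and resampling idea.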
Pulling YouTube transcripts for research
Hey guys, I put together a cool automation for those of you who want to use YouTube videos for research. It lets you scrape the transcripts of YouTube videos and then dump them into AI to analyze for you. This isn't about ripping off the video; it's simply a much more efficient way to take in this free content and use AI to give you the key learnings from it. I've attached a few resources here:
1. A Loom video that walks you through how I did it, plus some tips on getting around some of Claude's quirks.
2. A link to the Google Sheet I'm using, which you can copy for yourself.
3. The JSON file, which contains my entire n8n automation.
What I'd recommend is dropping that JSON file into Claude after you watch the video and having it tell you how to build it. It should be set up to run, but if you're not using the exact same parameters as I am, you might have to tweak it a little. I hope this helps everybody. Best of luck, and if you have any questions, absolutely feel free to ask!
Video walkthrough: https://www.loom.com/share/6dabb1b70d594a64ae5273a9f4b9dcbe
Google Sheet: https://docs.google.com/spreadsheets/d/1FvtD_6i5t1ePdYnkZLJJgFgX2RML12UNrZlT9iXjDYA/edit?usp=sharing
My n8n automation (attached JSON)
0 likes • 21d
Bookmark the Loom link
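The actual workflow lives in the attached n8n JSON; as a rough idea of the transcript-to-AI step, here is a minimal Python sketch that flattens transcript segments into timestamped plain text you could paste into an AI prompt. The segment shape (`{"text": ..., "start": seconds}`, as returned by libraries like youtube-transcript-api) and the function name are assumptions, not part of the attached workflow.

```python
def format_transcript(segments):
    """Flatten transcript segments into timestamped plain text.

    Each segment is assumed to look like {"text": ..., "start": seconds}.
    """
    lines = []
    for seg in segments:
        # Convert the start offset in seconds to a MM:SS prefix.
        minutes, seconds = divmod(int(seg["start"]), 60)
        lines.append(f"[{minutes:02d}:{seconds:02d}] {seg['text']}")
    return "\n".join(lines)
```

Keeping the timestamps lets the AI point you back to the exact moment in the video for each key learning.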
🌱 April is here. What are you building this month?
New month, new window. April is a reminder that progress does not come from waiting for the perfect moment. It comes from choosing one thing, committing to it, and building momentum before doubt has a chance to slow us down. This is the month to stop overthinking and start moving.

What are we building? A better system? A new offer? Stronger habits? More confidence with AI? More time back in the week? Whatever it is, April is an opportunity to create real traction, not just more intentions.

The biggest wins rarely come from doing everything at once. They come from picking a clear goal and working it consistently. One focused month can change a lot. It can cut cycle time, reduce procrastination, sharpen skills, and create the kind of progress that compounds fast.

So let's make this month count. Build the workflow. Launch the idea. Learn the tool. Finish the draft. Start the project. Protect the time. Use April to create something future you will thank you for. No drifting. No waiting. No playing small.

April is here. What are you building this month? Comment your April goal.
4 likes • 28d
I'm building a fully agentic-run business. I have the full business framework set up; it took me a few days to do. I did the whole thing in Claude. Now I'm building my Content Researcher and Content Creation agents. These will automatically scrape LinkedIn and X in my niche, find the top accounts, and log both their most viral posts and their post types. Then the Content Creation agents will duplicate those posts for me, in my voice, log them for my approval, and then post them via scheduling.
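The log-for-approval step described above could be modeled as a small queue of drafts. This is a hypothetical Python sketch to illustrate the shape of that pipeline; all class and field names are invented, and the commenter's actual build is in Claude, not shown here.

```python
from dataclasses import dataclass

@dataclass
class DraftPost:
    """An agent-generated draft awaiting human review (hypothetical shape)."""
    source_account: str   # viral account the style was modeled on
    post_type: str        # e.g. "thread", "listicle", "hot take"
    text: str             # draft rewritten in the owner's voice
    approved: bool = False

class ApprovalQueue:
    """Holds agent drafts until a human approves them for scheduling."""

    def __init__(self):
        self._drafts = []

    def log(self, draft: DraftPost):
        # Content Creation agent logs each draft here for review.
        self._drafts.append(draft)

    def approve(self, index: int):
        self._drafts[index].approved = True

    def ready_to_schedule(self):
        # Only approved drafts flow on to the posting scheduler.
        return [d for d in self._drafts if d.approved]
```

The point of the human-approval gate is that the scheduler only ever sees drafts a person has signed off on.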
Pete Davis
3
40 points to level up
@peter-davis-3284
I build AI trading systems and AI-automated companies. On a mission to replace myself 🤣

Active 3m ago
Joined Mar 30, 2026
Boston, MA