How Node.js Turns API Bottlenecks Into Instant Data
As I keep building Gextron.com, I run into new challenges almost every day.
One big problem was loading an entire option chain (all the calls and puts for a stock).
👉 If you know options, you know there are expirations every single day (zero-day expirations) for the biggest tickers like SPY, QQQ, and SPX, and it can take 3–4 seconds just to load one full chain. That means the page feels slow for the user.
The Current Setup (Diagram 1)
Right now, the SvelteKit app does all the work:
  1. It asks the vendor (Intrinio) for dividends.
  2. Then it asks for all the expirations.
  3. Then it loops through every expiration to fetch all the strikes with Greeks.
  4. It crunches the numbers (IV, delta, gamma, etc.) and shows them to the user.
  5. To make it faster next time, it saves the result in Redis for 15 minutes and refreshes every 15 minutes.
This works, but a cold start still means a 3–4 second wait.
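Roughly, that cold path looks something like this — a simplified sketch, assuming ioredis for the cache; the fetch* helpers are placeholders for the actual Intrinio requests, not real SDK methods:

```js
import Redis from 'ioredis';

const redis = new Redis();
const TTL_SECONDS = 15 * 60; // 15-minute cache

// Placeholders for the actual Intrinio requests — swap in the real API calls.
async function fetchDividends(ticker) { /* vendor call */ }
async function fetchExpirations(ticker) { /* vendor call */ }
async function fetchChainForExpiration(ticker, expiration, dividends) { /* vendor call */ }

async function loadOptionChain(ticker) {
  const cacheKey = `chain:${ticker}`;
  const cached = await redis.get(cacheKey);
  if (cached) return JSON.parse(cached); // warm path: instant

  // Cold path: everything below happens while the user waits (the 3–4 seconds).
  const dividends = await fetchDividends(ticker);
  const expirations = await fetchExpirations(ticker);

  const chain = [];
  for (const expiration of expirations) {
    // One round trip per expiration — this loop is the bottleneck.
    chain.push(await fetchChainForExpiration(ticker, expiration, dividends));
  }

  await redis.set(cacheKey, JSON.stringify(chain), 'EX', TTL_SECONDS);
  return chain;
}
```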
The Better Setup (Diagram 2)
Instead of making the UI do all the heavy lifting, I introduced workers.
Workers are like little robots running in the background:
  1. An orchestrator job creates a checklist (manifest) of everything we need.
  2. Per-expiration jobs fetch strikes + Greeks in parallel.
  3. While jobs are running, results are saved in staging (like a draft folder).
  4. Once everything is finished, the worker publishes a complete build to Redis.
  5. Redis keeps a current pointer so the UI always knows which build is the latest finished one.
  6. When the user asks for data, the UI just grabs the published build instantly.
No cold start. No half-finished chains. Always fast.
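Here's a rough sketch of what that pipeline could look like with BullMQ flows — the queue names, Redis keys, and the fetchChainForExpiration helper are assumptions for illustration, not the exact production code:

```js
import { FlowProducer, Worker } from 'bullmq';
import Redis from 'ioredis';

const connection = { host: '127.0.0.1', port: 6379 };
const redis = new Redis(connection);

// Placeholder for the actual per-expiration Intrinio request.
async function fetchChainForExpiration(ticker, expiration) { /* vendor call */ }

// 1. Orchestrator: build the manifest and enqueue one child job per expiration.
//    BullMQ flows make the parent "publish" job wait until every child is done.
async function enqueueChainBuild(ticker, expirations) {
  const buildId = `${ticker}:${Date.now()}`;
  const flow = new FlowProducer({ connection });
  await flow.add({
    name: 'publish-build',
    queueName: 'chain-publish',
    data: { ticker, buildId },
    children: expirations.map((expiration) => ({
      name: 'fetch-expiration',
      queueName: 'chain-fetch',
      data: { ticker, buildId, expiration },
    })),
  });
}

// 2–3. Per-expiration jobs run in parallel and write results into a staging hash.
new Worker('chain-fetch', async (job) => {
  const { ticker, buildId, expiration } = job.data;
  const strikes = await fetchChainForExpiration(ticker, expiration);
  await redis.hset(`staging:${buildId}`, expiration, JSON.stringify(strikes));
}, { connection });

// 4–5. Publish job: promote the finished build and flip the "current" pointer.
new Worker('chain-publish', async (job) => {
  const { ticker, buildId } = job.data;
  await redis.rename(`staging:${buildId}`, `build:${buildId}`);
  await redis.set(`current:${ticker}`, buildId); // the UI always reads this pointer
}, { connection });
```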
Why it matters
  • Before: UI asks Intrinio directly → wait 3–4 seconds.
  • After: UI only asks Redis → instant load, because workers already did the heavy lifting in the background.
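On the UI side the read path becomes two Redis lookups and zero vendor calls — a sketch using the same (assumed) key names as above:

```js
import Redis from 'ioredis';

const redis = new Redis();

// Called from the SvelteKit load function: grab whatever build is published.
async function getPublishedChain(ticker) {
  const buildId = await redis.get(`current:${ticker}`);
  if (!buildId) return null; // nothing published yet — fall back to the old path

  const byExpiration = await redis.hgetall(`build:${buildId}`);
  return Object.fromEntries(
    Object.entries(byExpiration).map(([exp, json]) => [exp, JSON.parse(json)])
  );
}
```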
🚀 Pros of Node.js for this task
  1. Event-driven & non-blocking I/O: Perfect for making parallel requests
  2. BullMQ + Redis integrate very cleanly, and you can easily spin up queues, workers, and rate limiters without fighting the language
  3. Concurrency without complexity
  4. Workers don't need a heavy process, and Node's single-threaded model means you can run lots of them on modest hardware
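Concretely, most of that list comes down to a couple of options on the per-expiration worker — a sketch with made-up numbers (and the same placeholder fetch helper as above), not what Gextron actually runs:

```js
import { Worker } from 'bullmq';

new Worker(
  'chain-fetch',
  async (job) => {
    // Non-blocking I/O: while one request is in flight, other jobs keep running.
    return fetchChainForExpiration(job.data.ticker, job.data.expiration); // placeholder
  },
  {
    connection: { host: '127.0.0.1', port: 6379 },
    concurrency: 10,                      // up to 10 expirations in flight per process
    limiter: { max: 20, duration: 1000 }, // but at most 20 vendor calls per second
  }
);
```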
Node.js just felt like a natural fit for "a lot of async network + job queues + Redis caching". It keeps the system fast, simple, and scalable without adding too much complexity.
🔑 Key Takeaway
This isn’t just about options data — it’s about system architecture.
If you want systems to feel fast:
  • Don't do all the work in one big chunk.
  • Split the job into smaller pieces.
  • Run those pieces in parallel in the background.
  • Only serve results when the full set is ready.
That's how you go from a slow, 3–4 second page load → an instant experience.