Every second of delay means a drop in engagement, conversions, and search ranking.
That's exactly what I faced: a React-based web app taking 5 seconds to load. After a full performance overhaul, I brought it down to ~500ms, and no, it wasn't just about lazy loading.
In this post, I'll walk you through real-world optimizations backed by metrics, with code snippets that go beyond the basics.
🧪 Step 1: Measure Like a Pro — Not Just Lighthouse
Most devs run Lighthouse once and call it a day. But real optimization starts with real user data and repeatable insights.
✅ Tools I Used:
- Lighthouse CI: for running automated perf tests in CI/CD
- WebPageTest.org: for multi-device, multi-region testing
- Real User Monitoring (RUM) with the web-vitals library
- Chrome Performance Panel: for flame charts & render-blocking scripts
🎯 Now I had field data, not just lab data. That changed everything.
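Here's a minimal sketch of the RUM wiring above. The `/analytics` endpoint and the payload shape are my assumptions; the `onCLS`/`onINP`/`onLCP`/`onTTFB` names are the web-vitals v3+ API.

```javascript
// Pure helper: shape a web-vitals metric into a compact payload.
function toPayload(metric) {
  return JSON.stringify({
    name: metric.name,     // e.g. 'LCP'
    value: Math.round(metric.value * 1000) / 1000,
    rating: metric.rating, // 'good' | 'needs-improvement' | 'poor'
  });
}

// sendBeacon survives page unload, unlike a plain fetch()
function sendToAnalytics(metric) {
  navigator.sendBeacon('/analytics', toPayload(metric));
}

// Browser-only wiring; guarded so it is a no-op outside the browser
if (typeof window !== 'undefined') {
  import('web-vitals').then(({ onCLS, onINP, onLCP, onTTFB }) => {
    onCLS(sendToAnalytics);
    onINP(sendToAnalytics);
    onLCP(sendToAnalytics);
    onTTFB(sendToAnalytics);
  });
}
```

The beacon fires on every metric finalization, so the backend sees real field numbers per session rather than one synthetic lab run.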
🔥 Step 2: Replace Lazy Loading with Code-Splitting by Priority
Yes, lazy loading is basic. But loading routes and components granularly, by priority, is smarter.
✅ What I Did:
- Split the bundle using vite-plugin-pages or webpack's splitChunks
- Marked above-the-fold components as critical
- Loaded low-priority routes/modules using requestIdleCallback
💡 Example:

```javascript
// Defer a non-critical widget until the browser is idle
// (Safari lacks requestIdleCallback, so feature-detect first)
if ('requestIdleCallback' in window) {
  requestIdleCallback(() => import('./widgets/NewsletterPopup.js'));
}
```
🔥 Bonus:
Used PreloadWebpackPlugin to preload next likely routes for better perceived speed.
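A minimal sketch of the splitChunks setup described above. The cache-group names, the critical-components path, and the priorities are my own illustrative choices, not this project's exact config.

```javascript
// webpack.config.js (sketch) — export this object via module.exports
const config = {
  optimization: {
    splitChunks: {
      chunks: 'all', // split both sync and async chunks
      cacheGroups: {
        vendors: {
          test: /[\\/]node_modules[\\/]/,
          name: 'vendors',
          priority: -10, // generic vendor bucket
        },
        critical: {
          // hypothetical folder for above-the-fold components
          test: /[\\/]src[\\/]critical[\\/]/,
          name: 'critical',
          priority: 10, // wins ties against the vendor group
          enforce: true,
        },
      },
    },
  },
  // plugins: [new PreloadWebpackPlugin({ rel: 'preload' })] —
  // see the preload-webpack-plugin README for its exact options
};
```

Higher `priority` means a module matching both groups lands in the critical chunk, which keeps above-the-fold code in its own small, early-loaded file.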
🧠 Step 3: Prioritize Critical CSS with SSR + Streaming
One of the biggest blockers? CSS. Especially when using large UI frameworks like Tailwind or Material UI.
✅ What I Did:
- Used critical CSS extraction with critters (Next.js handles this well too)
- Streamed HTML using React 18's streaming SSR (renderToPipeableStream)
- Removed unused CSS with PurgeCSS
💡 React 18 Streaming SSR (Next.js-like setup):

```javascript
import { renderToPipeableStream } from 'react-dom/server';

// <App /> is your root component; `res` is the Node HTTP response
const { pipe } = renderToPipeableStream(<App />, {
  onShellReady() {
    res.setHeader('Content-Type', 'text/html');
    pipe(res); // start streaming the shell before all data resolves
  },
});
```

Result: HTML and CSS started rendering in <200ms.
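For the unused-CSS step, here's a PurgeCSS config sketch. The globs and safelist entries are illustrative; the key point is that runtime-generated class names must be safelisted or they get stripped.

```javascript
// purgecss.config.js (sketch) — export via module.exports
const purgecssConfig = {
  // Scan markup and components for class names that are actually used
  content: ['./src/**/*.{js,jsx,html}'],
  css: ['./dist/styles.css'],
  // Classes added at runtime are invisible to static analysis
  safelist: ['is-active', /^modal-/],
};
```

With Tailwind specifically, the built-in `content` scanning in tailwind.config.js covers most of this, so PurgeCSS mainly earns its keep with heavier frameworks like Material UI.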
🚀 Step 4: Prefetching with Prediction (AI or Manual)
Preloading assets is good. Predicting what to preload next? Even better.
✅ What I Did:
- Used Quicklink to auto-prefetch links in the viewport
- Built a simple user flow predictor with localStorage
💡 Quicklink Usage:

```javascript
import { listen } from 'quicklink';

// Prefetch in-viewport links during browser idle time
listen();
```

💥 Result:
TTI for subsequent navigations dropped from 1.2s to ~200ms on mobile.
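The localStorage flow predictor mentioned above, as a minimal sketch. The storage key, route names, and the `router.onNavigate` hook are all hypothetical; the counting logic is the real idea.

```javascript
// Count route transitions so we can guess where the user goes next.
function recordTransition(transitions, from, to) {
  const counts = transitions[from] || (transitions[from] = {});
  counts[to] = (counts[to] || 0) + 1;
  return transitions;
}

// Pure predictor: most frequent next route seen after `currentRoute`.
function predictNext(transitions, currentRoute) {
  let best = null;
  let bestCount = 0;
  for (const [route, count] of Object.entries(transitions[currentRoute] || {})) {
    if (count > bestCount) {
      best = route;
      bestCount = count;
    }
  }
  return best;
}

// Browser wiring (sketch): persist the history, prefetch the guess.
// const flows = JSON.parse(localStorage.getItem('flows') || '{}');
// router.onNavigate((from, to) => {       // hypothetical router hook
//   recordTransition(flows, from, to);
//   localStorage.setItem('flows', JSON.stringify(flows));
//   const next = predictNext(flows, to);
//   if (next) quicklink.prefetch(next);   // or inject <link rel="prefetch">
// });
```

Keeping the predictor as pure functions over a plain object makes it trivial to persist in localStorage and to unit-test without a browser.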
📦 Step 5: Server & Infra-Level Performance Wins
Sometimes the slow page isn’t the front end — it’s the backend or network.
✅ What I Did:
- Used Brotli compression (Content-Encoding: br) instead of gzip
- Switched from shared hosting to Edge Function/CDN hybrid (Cloudflare + Netlify Edge)
- Enabled HTTP/3 and early hints (103 Early Hints response)
💡 Example nginx config:

```nginx
# Requires the third-party ngx_brotli module
gzip off;          # consider keeping gzip on as a fallback for clients without br support
brotli on;
brotli_static on;  # serve pre-compressed .br files when present
```

✅ These infra changes alone improved TTFB by ~300ms.
📉 Final Result: Real Before & After
🚀 Yes, some screens now load in ~500ms on a cold cache
💰 Server costs dropped thanks to edge caching and reduced payload
🔥 SEO + Core Web Vitals score is consistently 95+
Measure. Prioritize. Act. Repeat.
You don’t need to rebuild your app — just rewire how it loads.
Thanks for reading!
Originally published on Medium, sharing here for the DEV community!