TimeSlipSearch: A Conversational Time Machine for Pop Culture
Elizabeth Stein


Publish Date: Feb 9

This is my submission for the DEV Challenge: Consumer-Facing Conversational Experiences.

What I Built

TimeSlipSearch is a conversational time machine that answers questions like:

“What was the #1 song the day I was born?”

…in under 100 milliseconds.

Type a date in plain English, like “Summer of ’69,” “Christmas 1985,” or “the day the Berlin Wall fell,” and instantly receive a complete cultural snapshot:

  • Billboard Hot 100 chart results
  • Movies in theaters
  • Gas prices and other economic context
  • Historical events from that exact moment in time

The Problem

Nostalgia is a $260B+ industry, yet exploring historical pop culture still requires jumping between Wikipedia, Billboard archives, IMDb, and economic databases.

The data exists; it’s just scattered and hard to access conversationally.

The Solution

A single unified search across 420,000+ indexed records, wrapped in an immersive VHS/CRT retro interface that makes time travel feel real.

Demo

🔗 Live: https://timeslipsearch.vercel.app

Try these queries:

  • July 20, 1969 — Moon landing day
  • Summer of ’69 — Natural language works
  • Compare 1989 vs 1979 — Side-by-side decades
  • Your birthday
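The “natural language works” part is handled by chrono-node in the real app. As a toy illustration of the idea, a hand-rolled parser for a few of the demo phrases might look like this (parseEnglishDate is a hypothetical name, not TimeSlipSearch’s actual code):

```typescript
// Toy sketch of plain-English date parsing; the app itself uses chrono-node.
function parseEnglishDate(input: string): Date | null {
  const months: Record<string, number> = {
    january: 0, february: 1, march: 2, april: 3, may: 4, june: 5,
    july: 6, august: 7, september: 8, october: 9, november: 10, december: 11,
  };

  // "July 20, 1969" style
  const full = input.match(/([a-z]+)\s+(\d{1,2}),?\s+(\d{4})/i);
  if (full) {
    const month = months[full[1].toLowerCase()];
    if (month !== undefined) return new Date(Date.UTC(+full[3], month, +full[2]));
  }

  // "Christmas 1985" style
  const holiday = input.match(/christmas\s+(\d{4})/i);
  if (holiday) return new Date(Date.UTC(+holiday[1], 11, 25));

  // "Summer of '69" style: map a two-digit year onto the 1900s
  const summer = input.match(/summer\s+of\s+'?(\d{2})/i);
  if (summer) return new Date(Date.UTC(1900 + +summer[1], 6, 1));

  return null;
}
```

A real parser has to handle far more shapes (relative dates, ranges, named events), which is exactly why delegating to a dedicated library makes sense here.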

How I Used Algolia Agent Studio

Indexed Data: 420K+ Records Across 4 Indices

Index             Records   What It Contains
timeslip_songs    350,000   Every Billboard Hot 100 entry, 1958–2020
timeslip_movies    50,000   Theatrical releases from TMDB
timeslip_prices       900   Gas, minimum wage, movie tickets (FRED)
timeslip_events    20,000   Historical events (Wikimedia)

Retrieval Strategy: Parallel Multi-Index Search

Every user query triggers one HTTP request that searches all four indices simultaneously:

import { algoliasearch } from "algoliasearch";

const client = algoliasearch(APP_ID, SEARCH_API_KEY); // search-only API key

// One round-trip, four indices: songs, movies, prices, events.
const response = await client.search({
  requests: [
    { indexName: "timeslip_songs",  filters: `date >= ${start} AND date <= ${end}`, hitsPerPage: 10 },
    { indexName: "timeslip_movies", filters: `date >= ${start} AND date <= ${end}`, hitsPerPage: 5 },
    { indexName: "timeslip_prices", filters: `date >= ${start} AND date <= ${end}`, hitsPerPage: 1 },
    { indexName: "timeslip_events", filters: `date >= ${start} AND date <= ${end}`, hitsPerPage: 5 }
  ]
});

This batched approach is critical: sequential queries would take ~4× longer and break the conversational feel.
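The `start` and `end` values in those filter strings come from expanding the parsed date into a day-wide timestamp range. A minimal sketch of such a helper (dayRange is a hypothetical name; the unit just has to match whatever the indexed `date` attribute uses):

```typescript
// Expand a target date into a [startOfDay, endOfDay] Unix-timestamp range
// suitable for Algolia numeric filters.
function dayRange(date: Date): { start: number; end: number } {
  const startMs = Date.UTC(date.getUTCFullYear(), date.getUTCMonth(), date.getUTCDate());
  const endMs = startMs + 24 * 60 * 60 * 1000 - 1; // last millisecond of the day
  // Stored as seconds here; Algolia compares plain numbers either way.
  return { start: Math.floor(startMs / 1000), end: Math.floor(endMs / 1000) };
}

// Usage: build the shared filter once, reuse it across all four requests.
const { start, end } = dayRange(new Date(Date.UTC(1969, 6, 20)));
const filters = `date >= ${start} AND date <= ${end}`;
```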

Prompt Engineering: Era-Aware Context Generation

Raw search results aren’t enough for conversation. I built a cultural context layer that enriches responses with era-specific narratives:

const YEAR_HIGHLIGHTS: Record<number, string> = {
  1969: "The Summer of Love peaked as humans walked on the moon.",
  1984: "MTV transformed music into a visual medium.",
  1989: "The Berlin Wall fell and hip-hop went mainstream."
  // 60+ curated year narratives
};
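With only 60+ curated years, some queries will miss the lookup table. One way to keep every result framed (a hypothetical fallback, not shown in the post) is to back the year map with decade-level blurbs:

```typescript
// Hypothetical decade-level fallback narratives.
const DECADE_HIGHLIGHTS: Record<number, string> = {
  1960: "Counterculture, the Space Race, and the British Invasion.",
  1980: "MTV, synthesizers, and blockbuster excess.",
};

// Prefer an exact-year narrative; otherwise fall back to the decade.
function highlightFor(
  year: number,
  years: Record<number, string>,
  decades: Record<number, string>
): string | undefined {
  return years[year] ?? decades[Math.floor(year / 10) * 10];
}
```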

The agent also generates contextual follow-up suggestions based on actual results:

  • Found George Michael? → “Explore more from George Michael”
  • Searched 1988? → “See nearby: 1989 — The year the Berlin Wall fell”
  • First time in the 80s? → “Discover more of the 80s”
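The suggestion rules above can be sketched as a small pure function over the search results (the names SongHit and suggestFollowUps are illustrative, not the app’s actual API):

```typescript
interface SongHit { artist: string; }

// Derive follow-up prompts from the top hit and the user's explored decades.
function suggestFollowUps(year: number, songs: SongHit[], seenDecades: Set<number>): string[] {
  const suggestions: string[] = [];

  const topArtist = songs[0]?.artist;
  if (topArtist) suggestions.push(`Explore more from ${topArtist}`);

  const decade = Math.floor(year / 10) * 10;
  if (!seenDecades.has(decade)) suggestions.push(`Discover more of the ${decade}s`);

  return suggestions;
}
```

Keeping this logic result-driven (rather than canned) is what makes the follow-ups feel like the agent actually read the time capsule it just produced.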

Memory System: Persistence Without Accounts

Following a retrieval + scale + memory approach, I implemented localStorage-based memory:

  • Search History — last 20 queries with one-click replay
  • Favorites — save meaningful dates with personal notes
  • Achievements — unlock badges for exploring different decades

This creates session continuity without requiring authentication; the agent “remembers” your journey through time.
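A minimal sketch of the history half of that memory, with the storage interface injected so it also runs outside the browser (the key name is hypothetical; the cap of 20 matches the post):

```typescript
// Anything with localStorage's getItem/setItem shape will do.
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

const HISTORY_KEY = "timeslip:history"; // hypothetical key name

// Record a query: newest first, deduplicated, capped at the last 20.
function recordSearch(store: KVStore, query: string): string[] {
  const history: string[] = JSON.parse(store.getItem(HISTORY_KEY) ?? "[]");
  const updated = [query, ...history.filter((q) => q !== query)].slice(0, 20);
  store.setItem(HISTORY_KEY, JSON.stringify(updated));
  return updated;
}
```

In the browser the injected store is just `window.localStorage`, which is what keeps the whole memory system dependency- and account-free.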


Why Fast Retrieval Matters

TimeSlipSearch lives or dies by speed. Here’s why Algolia was essential:

1) Conversational UX requires instant responses

Chat interfaces create expectations of immediacy. A 2-second delay feels like the agent is “thinking too hard.” Algolia’s sub-100ms retrieval keeps the conversation flowing naturally.

2) Four indices, one round-trip

Without batched multi-index search, I’d need 4 sequential API calls. At ~150ms each, that’s ~600ms of network latency alone, before any processing. Algolia collapses this to a single request.

3) The retro aesthetic is decoration, not necessity

The VHS tracking lines and CRT glow are purely stylistic. Results arrive so fast that the “loading” animation is optional; users see their time capsule before the tape even finishes rewinding.

4) Numeric range filtering at scale

Searching 350,000 Billboard records by Unix timestamp range could be expensive. Algolia’s numeric filters handle it effortlessly, enabling queries like “show me everything from June 1–7, 1988” without performance degradation.


Built With

  • Next.js 16
  • Algolia v5
  • TypeScript
  • Tailwind CSS
  • chrono-node for natural language date parsing

Data Sources

  • Billboard Hot 100
  • TMDB
  • FRED Economic Data
  • Wikimedia

Comments (1)

  • Justin Elliott, Feb 9, 2026

    This is a great example of how speed directly shapes UX. The sub-100ms target and batched multi-index search make a lot of sense for keeping the conversation fluid. I also like the decision to keep memory client-side instead of forcing authentication.
