ActiveRecord’s Dark Side: 3 Performance Traps
Alex Aslam




"Our API was dying under 100 RPS—all because of innocent-looking ActiveRecord calls."

ActiveRecord is like an automatic transmission in Rails—smooth, easy, and perfect for most trips. But when you hit the performance highway, you realize it’s been secretly redlining your database while you weren’t looking.

We learned this the hard way when our "optimized" endpoints crumbled under load. Here are the three biggest ActiveRecord traps—and how to escape them.


Trap #1: The N+1 Query Death Spiral

What Happens?

You write a clean query like:

users = User.limit(10)
users.each { |user| puts user.posts.count }

Innocently, Rails fires:

SELECT * FROM users LIMIT 10;
SELECT COUNT(*) FROM posts WHERE user_id = 1;
SELECT COUNT(*) FROM posts WHERE user_id = 2;
-- ...and so on (N+1 queries)

Why It’s Deadly

🚀 10 users → 11 queries
🚀 100 users → 101 queries
🚀 Your database melts

The Fix

# Preload with `includes` (2 queries)
users = User.includes(:posts).limit(10)
users.each { |user| puts user.posts.size } # Uses the preloaded posts, no extra query

# Or use `counter_cache` for counts
# (Add `posts_count` column to users)
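
If all you need is the count, counter_cache skips loading posts entirely. A minimal sketch, assuming you add a posts_count integer column to users:

# In a migration (hypothetical file name)
add_column :users, :posts_count, :integer, default: 0, null: false

# app/models/post.rb
class Post < ApplicationRecord
  belongs_to :user, counter_cache: true # keeps users.posts_count in sync
end

# Reads the cached column: no COUNT query per user
User.limit(10).each { |user| puts user.posts_count }

For rows that already exist, backfill once with User.reset_counters(user.id, :posts).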

Pro Tip:

  • The Bullet gem catches N+1s automatically (sample setup below).
  • Strict loading (Rails 6+) raises errors on lazy-loading:
  config.active_record.strict_loading_by_default = true
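
Bullet's development setup is only a few lines (roughly, adjust to taste):

# config/environments/development.rb
config.after_initialize do
  Bullet.enable        = true
  Bullet.bullet_logger = true # writes to log/bullet.log
  Bullet.rails_logger  = true # also warns in the Rails log
  Bullet.raise         = true # raise so N+1s fail loudly in dev/test
end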

Trap #2: Memory-Hungry Object Instantiation

What Happens?

You load 10,000 records:

Order.where(status: :completed).each do |order|
  process_order(order)
end

ActiveRecord secretly:

  1. Loads all rows into memory
  2. Instantiates full Ruby objects for each
  3. Garbage collector cries

Why It’s Deadly

📈 10K orders → ~500MB RAM
📈 Slows down iteration (GC overhead)

The Fix

# Use `find_each` (batches of 1000 by default)
Order.where(status: :completed).find_each do |order|
  process_order(order)
end

# For lighter loads, pluck ids first, then select only the columns you need
order_ids = Order.where(status: :completed).pluck(:id)
order_ids.each_slice(1000) do |ids|
  orders_data = Order.where(id: ids).select(:id, :total)
  orders_data.each { |data| process_order(data) }
end

Bonus:

  • in_batches for even more control.
  • ActiveRecord::Base.connection.execute for pure SQL when you don’t need objects (both sketched below).
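
Rough sketches of both, reusing the Order model from above (the archived column is hypothetical, and select_all is shown here because it returns rows as plain hashes):

# `in_batches` yields relations, so you can run one UPDATE per batch
Order.where(status: :completed).in_batches(of: 500) do |batch|
  batch.update_all(archived: true) # hypothetical column, 500 rows per query
end

# Pure SQL when you don't need ActiveRecord objects at all
# (assumes status is stored as a string)
rows = ActiveRecord::Base.connection.select_all(
  "SELECT id, total FROM orders WHERE status = 'completed'"
)
rows.each { |row| puts row["total"] }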

Trap #3: The Hidden Join Explosion

What Happens?

You join two tables:

User.joins(:orders).where(orders: { status: :shipped })

ActiveRecord generates:

SELECT users.* FROM users
INNER JOIN orders ON orders.user_id = users.id
WHERE orders.status = 'shipped'

Problem:

  • Returns duplicate User objects if they have multiple orders
  • Wastes memory (instantiated objects)
  • Confuses pagination (LIMIT 10 applies to the joined rows, so you may get fewer than 10 distinct users)
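
A quick console check makes the duplication obvious (assuming users have several shipped orders each):

users = User.joins(:orders).where(orders: { status: :shipped }).limit(10)
users.map(&:id)           # => [1, 1, 1, 2, 2, ...] one row per matching order
users.map(&:id).uniq.size # => fewer than 10 distinct users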

Why It’s Deadly

💥 1 user with 100 orders → 100 duplicate objects
💥 Kills performance on large datasets

The Fix

# Use `distinct`
User.joins(:orders).where(orders: { status: :shipped }).distinct

# Or pre-aggregate with `group`
User.joins(:orders)
    .where(orders: { status: :shipped })
    .group("users.id")
    .select("users.*, COUNT(orders.id) as order_count")

# For complex cases, use a CTE (raw SQL)
User.find_by_sql(<<~SQL)
  WITH shipped_orders AS (
    SELECT DISTINCT user_id FROM orders WHERE status = 'shipped'
  )
  SELECT users.* FROM users
  INNER JOIN shipped_orders ON shipped_orders.user_id = users.id
SQL
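
Another escape hatch avoids the join entirely: filter with a subquery, which ActiveRecord renders as WHERE users.id IN (SELECT user_id FROM orders ...).

# No join, no duplicate rows, pagination works as expected
User.where(id: Order.where(status: :shipped).select(:user_id))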

The Golden Rule of ActiveRecord

"ActiveRecord is optimized for developer happiness, not database happiness."

When to Use It

CRUD operations
Simple queries
Early-stage apps

When to Avoid It

Batch processing (use find_each or raw SQL)
Complex analytics (CTEs/window functions win)
High-throughput APIs (object instantiation kills speed)


3 Quick Wins Today

  1. Add includes to one N+1 query
  2. Replace .each with .find_each in a background job
  3. Try distinct on a join-heavy query

Got a horror story? Share your biggest ActiveRecord performance surprise below!
