1. Introduction
As your API grows in popularity, it's essential to protect it from overuse and abuse. One of the simplest and most effective strategies is rate limiting, which controls how many requests a user can make to your API in a given time window.
This tutorial will walk you through:
- What rate limiting is
- Why it's important
- How to implement it in an Express.js application
- Best practices and options for production systems
2. What is Rate Limiting?
Rate limiting sets a restriction on how many requests a client (usually identified by IP address or user token) can make in a given period.
For example:
- Max 100 requests per user per 15 minutes
- Only 10 login attempts per hour
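Under the hood, most rate limiters simply keep a counter per client and reset it when the time window rolls over. Here is a minimal fixed-window sketch in plain JavaScript to make the idea concrete (illustrative only; the names are made up, and the middleware we use below handles all of this for you):
// fixed-window-sketch.js (illustrative only)
const WINDOW_MS = 15 * 60 * 1000; // 15 minutes
const MAX_REQUESTS = 100;
const counters = new Map(); // key (e.g. an IP address) -> { count, windowStart }

function isAllowed(key) {
  const now = Date.now();
  const entry = counters.get(key);
  // Start a fresh window if none exists or the previous one has expired
  if (!entry || now - entry.windowStart >= WINDOW_MS) {
    counters.set(key, { count: 1, windowStart: now });
    return true;
  }
  // Within the current window, allow only up to MAX_REQUESTS
  entry.count += 1;
  return entry.count <= MAX_REQUESTS;
}

console.log(isAllowed('203.0.113.5')); // true for the first request in a fresh window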
🔒 Why Use Rate Limiting?
✅ Helps mitigate DoS attacks and traffic floods
✅ Slows down brute-force login attempts
✅ Protects backend resources from overload
✅ Ensures fair usage across users
3. Setting Up Express Rate Limiting
We'll use the open-source express-rate-limit middleware to set up basic limits.
3.1 Project Setup
mkdir express-rate-limit-demo && cd express-rate-limit-demo
npm init -y
npm install express express-rate-limit dotenv
3.2 Basic Server Example
// app.js
const express = require('express');
const rateLimit = require('express-rate-limit');
require('dotenv').config(); // loads PORT (and other settings) from a local .env file if present

const app = express();
const PORT = process.env.PORT || 3000;

// Apply to all requests
const limiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 100, // Limit each IP to 100 requests per windowMs
  message: 'Too many requests from this IP, please try again later.'
});

app.use(limiter);

app.get('/', (req, res) => {
  res.send('Hello, this is a rate-limited API.');
});

app.listen(PORT, () => console.log(`Server running on port ${PORT}`));
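To see the limiter kick in, fire a burst of requests at the server and watch the status codes flip from 200 to 429 once the limit is exceeded. A quick test script (assuming Node 18+ for the built-in fetch and the server above running on http://localhost:3000):
// test-limit.js (assumes the server above is running on http://localhost:3000)
(async () => {
  for (let i = 1; i <= 105; i++) {
    const res = await fetch('http://localhost:3000/');
    console.log(`Request ${i}: ${res.status}`); // 200 until the limit is hit, then 429
  }
})();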
4. Advanced Use Cases
4.1 Rate Limit Specific Routes Only
const loginLimiter = rateLimit({
  windowMs: 60 * 60 * 1000, // 1 hour
  max: 5,
  message: 'Too many login attempts. Try again after an hour.'
});

app.post('/login', loginLimiter, (req, res) => {
  res.send('Login endpoint.');
});
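For login routes you often only want failed attempts to count toward the limit, so a user who enters the right password isn't penalised. express-rate-limit has a skipSuccessfulRequests option for this (requests whose response status is below 400 are not counted); a small sketch:
const failedLoginLimiter = rateLimit({
  windowMs: 60 * 60 * 1000, // 1 hour
  max: 5,
  skipSuccessfulRequests: true, // only responses with status >= 400 count toward the limit
  message: 'Too many failed login attempts. Try again after an hour.'
});

app.post('/login', failedLoginLimiter, (req, res) => {
  // authenticate here; respond with 401 on failure so the attempt is counted
  res.send('Login endpoint.');
});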
4.2 Use a Custom Handler
const limiterWithHandler = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 10,
  handler: (req, res) => {
    res.status(429).json({
      status: 'fail',
      message: 'Rate limit exceeded. Please try again later.'
    });
  }
});
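By default, clients are identified by IP address. If your API authenticates clients with a token or API key, you can limit per client instead by supplying a keyGenerator function. A sketch (the X-Api-Key header name is just an example; use whatever identifies your clients):
const perClientLimiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 100,
  // Use the API key when present, otherwise fall back to the client IP
  keyGenerator: (req) => req.get('X-Api-Key') || req.ip
});

app.use('/api', perClientLimiter);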
5. Visualization: Rate Limiting Flow
The flow is straightforward: a request arrives → the middleware looks up the counter for that client (IP or key) in its store → if the count is under the limit, the counter is incremented and the request continues to your route handler → otherwise the middleware short-circuits and responds with 429 Too Many Requests.
6. Best Practices
| Practice | Reason |
|---|---|
| Limit sensitive endpoints | Login, signup, and search are attack targets |
| Adjust limits per user role | Admins may need higher thresholds |
| Use a distributed store (Redis) | For multi-server rate limiting |
| Return informative error messages | Helps users understand what's happening |
| Log rate-limited requests | Useful for security auditing and analytics |
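The "adjust limits per user role" row above maps directly onto the middleware's support for computing max per request. A sketch, assuming an earlier auth middleware has populated req.user with a role:
const roleAwareLimiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  // max can be a function, so different clients get different budgets
  max: (req) => (req.user && req.user.role === 'admin' ? 1000 : 100)
});

app.use('/api', roleAwareLimiter);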
7. Bonus: Rate Limiting with Redis (for production)
In production, especially with multiple servers or containers behind a load balancer, you need to share rate-limit state: by default express-rate-limit keeps its counters in each process's memory, so a client whose requests are spread across N instances could get roughly N times the intended limit. A shared store, usually Redis, solves this.
You can use the rate-limit-redis package together with a Redis client (the example below assumes ioredis, whose .call method matches the sendCommand wiring):
npm install rate-limit-redis ioredis
Then use it as the store:
const Redis = require('ioredis'); // assumes ioredis is installed (see above)
const RedisStore = require('rate-limit-redis'); // may be a named export ({ RedisStore }) depending on your version

const redisClient = new Redis(); // assumes a Redis server reachable on localhost:6379

const limiter = rateLimit({
  store: new RedisStore({
    // Forward raw Redis commands through the ioredis client
    sendCommand: (...args) => redisClient.call(...args)
  }),
  windowMs: 60 * 1000, // 1 minute
  max: 100
});
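With a shared store like this, every app instance reads and writes the same counters, so the limit holds no matter which server or container handles a particular request.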
8. Conclusion
Rate limiting is a simple but powerful tool for protecting your API. In just a few lines of code, you can dramatically reduce abuse, make your services more stable, and provide a better experience for your users.
You’ve now learned:
- What rate limiting is and why it matters
- How to implement it with express-rate-limit
- Best practices and how to scale with Redis