Camkode

Rate Limiting in Node.js APIs with Redis

Posted by Kosal

When building APIs in Node.js, ensuring reliability and protecting resources from abuse is critical. One of the most effective techniques to achieve this is rate limiting—restricting the number of requests a user can make to your API in a given time frame. In this article, we’ll explore how to implement rate limiting in a Node.js API using Redis for scalable and efficient tracking.

Why Rate Limiting?

Rate limiting is essential for:

  • Preventing DDoS attacks: Protect your server from being overwhelmed by too many requests.
  • Enforcing fair usage: Ensure that no single user consumes disproportionate resources.
  • Controlling costs: Avoid overusing third-party APIs or services that charge per request.
  • Improving performance: Keep API response times predictable and stable.

Technologies Used

  • Node.js (Express) – For creating the API.
  • Redis – For fast, in-memory tracking of request counts.
  • express-rate-limit – Middleware for simple rate limiting.
  • rate-limit-redis – Redis store adapter for express-rate-limit.

Installation

npm install express express-rate-limit rate-limit-redis redis

Make sure you have a Redis server running locally or remotely.

Basic Setup

const express = require('express');
const rateLimit = require('express-rate-limit');
// rate-limit-redis v4 uses a named export; some older releases exported the class directly
const { RedisStore } = require('rate-limit-redis');
const { createClient } = require('redis');

const app = express();
const port = 3000;

// Initialize Redis client
const redisClient = createClient(); // legacyMode is not needed when using sendCommand
redisClient.connect().catch(console.error);

// Apply rate limiting
const limiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 100, // Limit each IP to 100 requests per windowMs
  standardHeaders: true,
  legacyHeaders: false,
  store: new RedisStore({
    sendCommand: (...args) => redisClient.sendCommand(args),
  }),
});

// Apply limiter to all requests
app.use(limiter);

app.get('/', (req, res) => {
  res.send('Hello! You are accessing a rate-limited API.');
});

app.listen(port, () => {
  console.log(`Server running at http://localhost:${port}`);
});

How It Works

  1. Redis as a Store: Redis keeps a counter for each IP address with a TTL (Time to Live) matching the rate limit window.
  2. Fast Performance: Redis operates in memory, making it ideal for low-latency use cases like rate limiting.
  3. Shared Across Servers: Unlike in-memory limits, Redis allows you to rate limit across multiple Node.js instances.
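To make the counter-with-TTL idea concrete, here is a minimal sketch of the fixed-window algorithm in plain JavaScript. A `Map` stands in for Redis (in production, Redis's INCR and EXPIRE commands play this role, shared across all your server instances); the window size and limit below mirror the earlier configuration:

```javascript
// Sketch of a fixed-window rate limiter. Each key (e.g. an IP address)
// holds a request count plus the timestamp at which its window resets.
const WINDOW_MS = 15 * 60 * 1000; // 15-minute window
const MAX_REQUESTS = 100;         // allowed requests per window

const counters = new Map();

function isAllowed(key, now = Date.now()) {
  const entry = counters.get(key);

  // No entry yet, or the previous window has expired: start a new window.
  // (In Redis, EXPIRE handles this by deleting the key automatically.)
  if (!entry || now >= entry.resetAt) {
    counters.set(key, { count: 1, resetAt: now + WINDOW_MS });
    return true;
  }

  // Same idea as Redis INCR: bump the counter inside the current window.
  entry.count += 1;
  return entry.count <= MAX_REQUESTS;
}
```

Calling `isAllowed('203.0.113.7')` returns `true` for the first 100 calls within a window and `false` afterward, until the window resets.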

Things to Consider

  • User Authentication: For authenticated users, consider using user ID instead of IP address as the key.
  • Abuse Handling: Customize the handler function in express-rate-limit to return a meaningful message or log abuse attempts.
  • Distributed Environments: Redis is essential when deploying on multiple nodes (e.g., in Kubernetes or with load balancers).
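For the first point, express-rate-limit accepts a keyGenerator option that replaces the default IP-based key. A possible sketch, assuming your auth middleware attaches a `req.user` object (the exact shape depends on your setup):

```javascript
// Sketch: key authenticated users by user ID, falling back to IP for
// anonymous traffic. `req.user` is assumed to be populated by your
// authentication middleware; adjust the property names to match yours.
function rateLimitKey(req) {
  return req.user && req.user.id ? `user:${req.user.id}` : `ip:${req.ip}`;
}

// Wired into express-rate-limit via its keyGenerator option:
// const limiter = rateLimit({ windowMs, max, keyGenerator: rateLimitKey, store });
```

Keying by user ID means a user behind a shared NAT or proxy is not unfairly throttled by their neighbors' traffic.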

Example Custom Handler

const limiter = rateLimit({
  windowMs: 1 * 60 * 1000,
  max: 10,
  handler: (req, res) => {
    res.status(429).json({
      status: 'error',
      message: 'Too many requests. Please try again later.',
    });
  },
  store: new RedisStore({
    sendCommand: (...args) => redisClient.sendCommand(args),
  }),
});

Conclusion

Rate limiting is a crucial layer of defense in any public-facing API. By leveraging Redis with Node.js, you gain a scalable, high-performance solution that works seamlessly even in distributed environments. Whether you're protecting sensitive endpoints or simply encouraging fair usage, Redis-backed rate limiting is a best practice worth implementing.
