API Rate Abuse in Sails with JWT Tokens (Severity: HIGH)

API Rate Abuse in Sails with JWT Tokens — how this specific combination creates or exposes the vulnerability

Rate abuse in Sails when JWT tokens are used arises because authentication and authorization are enforced after the request passes the rate limiter, or because rate limiting is configured per-IP rather than per-token. Sails is a Node.js MVC framework that does not enforce authentication itself; it relies on policies or hooks. If you protect endpoints with JWT validation in a policy but apply rate limiting at the IP level (for example via a reverse proxy or a generic middleware), an attacker can obtain a valid JWT (through phishing, leakage, or a token-reuse flaw) and then exhaust the shared rate budget for that IP. This means the rate limit becomes a shared pool for all users behind the same NAT or for users whose tokens are leaked, enabling credential stuffing, brute force, or denial-of-service against specific accounts or the whole service.
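The shared-pool problem can be seen in a minimal sketch (the request data and the limit of 10 are invented for illustration): two users sit behind the same NAT IP, one of them noisy. Keying the counter by IP lets the noisy user exhaust the quiet user's budget, while keying by the JWT `sub` claim isolates each user's budget.

```javascript
// Two users behind the same NAT: userA sends 10 requests, userB sends 1.
const requests = [
  ...Array.from({ length: 10 }, () => ({ ip: '198.51.100.7', sub: 'userA' })),
  { ip: '198.51.100.7', sub: 'userB' },
];

// Count requests per key (IP address or JWT subject).
function countBy(reqs, keyFn) {
  const counts = new Map();
  for (const r of reqs) {
    const k = keyFn(r);
    counts.set(k, (counts.get(k) || 0) + 1);
  }
  return counts;
}

const limit = 10;
const byIp = countBy(requests, (r) => r.ip);
const bySub = countBy(requests, (r) => r.sub);

// IP keying: userB's single request lands in a bucket already holding 11,
// so the innocent user is throttled along with the noisy one.
console.log(byIp.get('198.51.100.7') > limit); // true -> both users blocked

// Sub keying: each user has an independent budget.
console.log(bySub.get('userB') <= limit); // true -> userB unaffected
```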

Another scenario is when JWTs embed user identifiers (sub, email, or roles) but the rate limiter does not parse and use those claims. The server may see many requests from the same IP with different valid tokens belonging to different users, yet still allow them because the limiter only tracks IP. This enables an attacker to cycle through stolen tokens to abuse targeted endpoints (for example, password reset, email confirmation, or payment actions) without tripping per-user protections. Additionally, if token issuance endpoints (login/register) are not separately rate-limited, an attacker can flood authentication routes to either harvest tokens via error messages or amplify traffic in a reflection scenario.

Because middleBrick scans the unauthenticated attack surface, it can detect whether rate limiting is applied before authentication, whether token-aware rate limits are missing, and whether token issuance paths lack protection. Findings will highlight missing per-token rate limiting, weak or missing burst controls, and inconsistent enforcement across routes that accept JWTs. These are important because even strong cryptographic tokens do not prevent resource exhaustion when limits are scoped by IP only.

JWT-Specific Remediation in Sails — concrete code fixes

To remediate rate abuse with JWT tokens in Sails, enforce rate limits per token or per user identifier after successful authentication, and ensure authentication-sensitive endpoints have their own limits. Below are concrete patterns you can apply.

1. Token-aware rate limiting in a policy

Create a policy that reads the JWT payload and applies a rate limit keyed by a user claim such as sub or email. Using a Redis-backed store is common for distributed rate limiting, but for a single-node example you can use an in-memory map with caution. The key point is to tie limits to the user identity inside the token, not just the IP.

// api/policies/rateLimitByUser.js
// Sails maps the policy name used in config/policies.js to this filename,
// and a policy file must export a single (req, res, next) function.
// The in-memory Map is single-node only; use a shared store (e.g. Redis)
// when running multiple instances.
const buckets = new Map();

module.exports = async function rateLimitByUser(req, res, next) {
  // Expect req.token to hold the verified JWT payload, set by an earlier hook
  const sub = req.token && req.token.sub; // or req.token.email
  if (!sub) {
    return res.status(401).json({ error: 'Missing or invalid token' });
  }

  const key = `ratelimit:${sub}`;
  const now = Date.now();
  const windowMs = 60 * 1000; // 1 minute
  const maxRequests = 30;

  const record = buckets.get(key);
  if (!record || now - record.start > windowMs) {
    // First request in a fresh window: reset the counter
    buckets.set(key, { count: 1, start: now });
    return next();
  }

  record.count += 1;
  if (record.count > maxRequests) {
    // 429 via res.status, since res.tooManyRequests is not a built-in response
    return res.status(429).json({ error: 'Rate limit exceeded' });
  }

  return next();
};
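The in-memory map above does not work across multiple Sails instances, since each node keeps its own counters. The same fixed-window logic can be isolated behind a small class whose store could be swapped for Redis (`INCR` the key, `EXPIRE` it when the count is 1) in production. The class and method names below are illustrative, not part of Sails:

```javascript
// A fixed-window limiter with a pluggable in-memory store. In production
// the store would be Redis: INCR the key, EXPIRE it on first increment.
class FixedWindowLimiter {
  constructor({ windowMs, maxRequests }) {
    this.windowMs = windowMs;
    this.maxRequests = maxRequests;
    this.store = new Map(); // key -> { count, start }
  }

  // Returns true if the request identified by `key` is within its budget.
  allow(key, now = Date.now()) {
    const record = this.store.get(key);
    if (!record || now - record.start > this.windowMs) {
      this.store.set(key, { count: 1, start: now }); // new window
      return true;
    }
    record.count += 1;
    return record.count <= this.maxRequests;
  }
}

// Usage: key by the verified JWT subject, not the IP.
const limiter = new FixedWindowLimiter({ windowMs: 60_000, maxRequests: 3 });
console.log(limiter.allow('ratelimit:user-42')); // true (1st)
console.log(limiter.allow('ratelimit:user-42')); // true (2nd)
console.log(limiter.allow('ratelimit:user-42')); // true (3rd)
console.log(limiter.allow('ratelimit:user-42')); // false (4th, over limit)
```

Fixed windows allow short bursts at window boundaries; a sliding window or token bucket smooths that out at the cost of slightly more bookkeeping per key.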

2. Apply the policy selectively

In config/policies.js, apply token-aware rate limiting to sensitive routes, and keep a stricter limit on authentication endpoints.

// config/policies.js
module.exports.policies = {
  '*': ['rateLimitByUser'], // default for most endpoints
  AuthController: {
    // double limit for auth: throttle by IP and by claimed identity
    login: ['rateLimitIp', 'rateLimitByUser'],
    register: ['rateLimitIp', 'rateLimitByUser']
  },
  UserController: {
    updateEmail: ['rateLimitByUser']
  }
};

3. JWT verification hook to populate req.token

Before the rate-limit-by-user policy runs, verify the JWT and attach the payload to req.token so the policy can read claims. This keeps concerns separated and works with any JWT library such as jsonwebtoken.

// api/hooks/jwt-auth/index.js
// A Sails hook exports a function of `sails` returning a hook definition;
// routes.before runs this middleware on every route, ahead of policies,
// so rateLimitByUser can read the verified claims from req.token.
const jwt = require('jsonwebtoken');

module.exports = function jwtAuthHook(sails) {
  return {
    routes: {
      before: {
        '/*': {
          skipAssets: true,
          fn: function (req, res, next) {
            const auth = req.headers.authorization;
            if (!auth || !auth.startsWith('Bearer ')) {
              return next(); // fail open here; policies decide if auth is required
            }
            const token = auth.slice(7);
            try {
              req.token = jwt.verify(token, process.env.JWT_SECRET); // claims for policies
            } catch (err) {
              return res.status(401).json({ error: 'Invalid token' });
            }
            return next();
          }
        }
      }
    }
  };
};

4. Separate limits for authentication routes

Authentication endpoints should have a lower, more conservative limit to prevent enumeration and token-spraying attacks. Combine IP-based and token-based limits where applicable (for login attempts using known user identifiers).

// config/policies.js (excerpt)
// Policy chains reference policy names (files in api/policies/), so the
// stricter auth limits live in dedicated policies rather than inline options.
module.exports.policies = {
  AuthController: {
    login: ['rateLimitIp', 'rateLimitLogin'],       // e.g. 5 requests / 60s
    register: ['rateLimitIp', 'rateLimitRegister']  // e.g. 3 requests / 60s
  }
};
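Sails policy chains reference policy names or functions rather than plain option objects, so per-route thresholds like the ones above are typically produced by a small factory that returns a configured policy function. The factory name, options, and key scheme below are illustrative, not part of Sails:

```javascript
// Build a policy function with its own window and limit, so login and
// register can each get a stricter budget than the general API.
function makeRateLimitPolicy({ max, windowMs, keyFn }) {
  const buckets = new Map(); // key -> { count, start }; per-policy state
  return function rateLimit(req, res, next) {
    const key = keyFn(req);
    const now = Date.now();
    const record = buckets.get(key);
    if (!record || now - record.start > windowMs) {
      buckets.set(key, { count: 1, start: now });
      return next();
    }
    record.count += 1;
    if (record.count > max) {
      return res.status(429).json({ error: 'Rate limit exceeded' });
    }
    return next();
  };
}

// e.g. api/policies/rateLimitLogin.js could export:
const rateLimitLogin = makeRateLimitPolicy({
  max: 5,
  windowMs: 60_000,
  keyFn: (req) => `login:${req.ip}`, // key login attempts by source IP
});
```

Because each call to the factory gets its own bucket map, login and register counters never interfere with each other or with the general per-user limit.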

By tying rate limiting to the JWT subject and protecting authentication paths independently, you reduce the risk of token sharing abuse and ensure that compromised credentials do not lead to unbounded resource consumption across shared IPs.

Frequently Asked Questions

How does middleBrick detect rate abuse risks when JWT tokens are used?
middleBrick checks whether rate limiting is applied before authentication and whether token-aware limits are present. It flags configurations where limits are IP-only and tokens are not used to scope requests, highlighting missing per-user rate controls.
Should I always enforce rate limits on token issuance endpoints?
Yes. Token issuance routes (login/register) should have their own rate limits to prevent enumeration, token-spraying, and traffic amplification. Combine IP-based and identity-based limits where feasible.