
API Rate Abuse in Hapi with Basic Auth

API Rate Abuse in Hapi with Basic Auth — how this combination creates or exposes the vulnerability

Rate abuse in Hapi when using HTTP Basic Authentication occurs because authentication and request throttling are implemented at different layers and may not be tightly coordinated. With Basic Auth, credentials are sent in the Authorization header on every request. If rate limiting is applied only after successful authentication or is scoped to authenticated identities, an unauthenticated or low-cost attacker can still flood the endpoint to enumerate valid usernames or exhaust server-side connection and compute resources.
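The mechanics are cheap to reproduce: a Basic Auth credential is just a base64-encoded user:pass pair, so an attacker can mint a fresh Authorization header for every attempt at essentially no cost. A minimal Node.js sketch (the candidate usernames are purely illustrative):

```javascript
// Build a Basic Auth header: "Basic " + base64("user:pass").
const buildBasicHeader = (user, pass) =>
  'Basic ' + Buffer.from(`${user}:${pass}`).toString('base64');

// Decode one back; only the first colon separates user from password.
const decodeBasicHeader = (header) => {
  const decoded = Buffer.from(header.slice(6), 'base64').toString('utf8');
  const i = decoded.indexOf(':');
  return { user: decoded.slice(0, i), pass: decoded.slice(i + 1) };
};

// An enumeration run is just a loop over candidate usernames.
const candidates = ['admin', 'root', 'deploy'];
const headers = candidates.map((u) => buildBasicHeader(u, 'guess'));
```

Nothing here requires valid credentials or any prior contact with the server, which is why throttling has to happen before the credential check, not after it.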

In a black-box scan, middleBrick checks whether rate limiting applies to unauthenticated paths and whether the same limits are enforced for authenticated and unauthenticated requests. Hapi servers that define routes with auth: false or rely on built-in route caching without explicit rate limiting may expose a larger unauthenticated attack surface. For example, an endpoint that validates credentials but does not enforce per-IP or per-user limits before processing can be targeted with rapid credential trials or simple GET floods, leading to denial of service or user enumeration.

Consider a Hapi route that accepts Basic Auth but does not scope rate limits by user or by IP:

const Hapi = require('@hapi/hapi');
const auth = require('basic-auth');

const server = Hapi.server({ port: 4000 });

server.route({
  method: 'GET',
  path: '/api/account',
  options: {
    auth: false, // bypasses hapi's auth framework; credentials are checked by hand below
    handler: (request, h) => {
      // basic-auth parses the Authorization header from request.headers
      const credentials = auth(request);
      if (!credentials || credentials.name !== 'admin' || credentials.pass !== 'secret') {
        return h.response('Unauthorized').code(401);
      }
      return { data: 'sensitive' };
    }
  }
});

const init = async () => {
  await server.start();
};

init();

In this example, there is no validation of request rate before authentication logic runs. An attacker can send many requests per second with arbitrary credentials, consuming CPU and connection resources. Even if the handler returns 401 quickly, the absence of pre-authentication rate limiting enables account enumeration if responses differ slightly between valid and invalid users.

middleBrick’s 12 security checks include Rate Limiting and Authentication. When scanning an API with Basic Auth, it verifies whether rate limits are enforced for unauthenticated requests and whether limits vary by identity or IP. Findings typically highlight missing pre-auth throttling and inconsistent limits across authenticated and unauthenticated paths, which can contribute to user enumeration or denial-of-service conditions.

To reduce risk, enforce rate limits before authentication logic and align limits across authenticated and unauthenticated paths. This minimizes the attack window for both resource exhaustion and username enumeration. Use server-wide or route-level throttling that applies as early as possible in the request lifecycle, and ensure that responses for rate-limited requests remain consistent to avoid leaking account validity.

Basic Auth-Specific Remediation in Hapi — concrete code fixes

Remediation focuses on two areas: applying rate limits before authentication checks and ensuring consistent, minimal information in responses. In Hapi, you can use the built-in rate limiting extensibility or an external policy to enforce request counts per IP or per credential keyed by username or a hashed identifier. Limits should be applied regardless of authentication outcome, and 401 responses should not reveal whether a username exists.

Below is a hardened route example that applies a shared rate limit to all incoming requests using a simple in-memory fixed-window counter stored in a shared map. The limit is evaluated before credentials are validated, and both rate-limited (429) and unauthorized (401) requests receive terse, uniform bodies that reveal nothing about whether a username exists.

const Hapi = require('@hapi/hapi');
const auth = require('basic-auth');

const server = Hapi.server({ port: 4000 });

// Simple in-memory rate limiter: max N requests per window per key
const createRateLimiter = (maxRequests = 60, windowMs = 60 * 1000) => {
  const limits = new Map(); // key -> { count, resetAt }
  return (key) => {
    const now = Date.now();
    const record = limits.get(key);
    if (!record) {
      limits.set(key, { count: 1, resetAt: now + windowMs });
      return { allowed: true, resetAt: now + windowMs };
    }
    if (now > record.resetAt) {
      record.count = 1;
      record.resetAt = now + windowMs;
      return { allowed: true, resetAt: record.resetAt };
    }
    if (record.count >= maxRequests) {
      return { allowed: false, resetAt: record.resetAt };
    }
    record.count += 1;
    return { allowed: true, resetAt: record.resetAt };
  };
};

const rateLimiter = createRateLimiter(60, 60 * 1000);

server.route({
  method: 'GET',
  path: '/api/account',
  options: {
    auth: false,
    handler: (request, h) => {
      const key = request.info.remoteAddress;
      const result = rateLimiter(key);
      if (!result.allowed) {
        return h.response('Too Many Requests').code(429);
      }
      const credentials = auth(request);
      // Use constant-time comparison in production to avoid timing leaks
      if (!credentials || credentials.name !== 'admin' || credentials.pass !== 'secret') {
        return h.response('Unauthorized').code(401);
      }
      return { data: 'sensitive' };
    }
  }
});

// Apply a global hook at the earliest lifecycle point (onRequest) to enforce
// site-wide limits before any auth logic runs. A separate limiter instance is
// used so requests to /api/account are not double-counted by the route handler.
const globalLimiter = createRateLimiter(120, 60 * 1000);
server.ext('onRequest', (request, h) => {
  const ipKey = request.info.remoteAddress;
  const res = globalLimiter(ipKey);
  if (!res.allowed) {
    // takeover() short-circuits the lifecycle and sends this response immediately
    return h.response('Too Many Requests').code(429).takeover();
  }
  return h.continue;
});

const init = async () => {
  await server.start();
};

init();

This approach ensures that every request is counted before authentication processing, reducing resource exhaustion and enumeration risks. For production, replace the in-memory map with a shared store such as Redis to coordinate limits across instances. Also prefer constant-time comparison for credentials and avoid returning detailed errors that differ between unauthorized and rate-limited states.

middleBrick’s GitHub Action can be added to CI/CD pipelines to fail builds if the API’s risk score drops below your configured threshold, helping you catch regressions in authentication and rate-limiting configurations before deployment.

Frequently Asked Questions

Does Basic Auth over HTTPS still need explicit rate limiting in Hapi?
Yes. HTTPS protects confidentiality and integrity in transit but does not prevent abuse such as flooding the server or user enumeration. You still need explicit rate limits applied before authentication checks regardless of transport encryption.
Can middleBrick detect missing rate limits for Basic Auth endpoints?
Yes. middleBrick’s Rate Limiting and Authentication checks identify whether pre-authentication rate limits are present and whether responses differ in ways that could enable enumeration. The scan runs in 5–15 seconds and returns prioritized findings with remediation guidance.