Severity: HIGH | Tags: api rate abuse, hapi, mongodb

API Rate Abuse in Hapi with MongoDB

API Rate Abuse in Hapi with MongoDB — how this specific combination creates or exposes the vulnerability

Rate abuse in a Hapi API backed by MongoDB typically occurs when an endpoint that queries or writes to MongoDB lacks effective request limiting. Without rate controls, an attacker can send a high volume of requests that each drive expensive database operations: repeated lookups on user-supplied identifiers, heavy aggregation pipelines, or bulk writes. For example, an endpoint like /users/{id} that performs a findOne may be called in a tight loop to enumerate valid user IDs or exhaust server-side resources, and write endpoints can be abused to create large numbers of documents, inflating storage and I/O load. Because Hapi does not enforce request-rate limits by default, this abuse happens entirely at the application layer, while MongoDB shows elevated CPU, memory, and connection usage as a result. These patterns are commonly flagged by BFLA/privilege-escalation and rate-limiting checks in security scans, which look for excessive, unthrottled database interaction. A sketch of such an unprotected route is shown below.
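For illustration, a minimal sketch of the kind of unprotected route described above (the connection string, database, and collection names are placeholders, not from any particular application):

const Hapi = require('@hapi/hapi');
const { MongoClient } = require('mongodb');

const client = new MongoClient('mongodb://localhost:27017');
const server = Hapi.server({ port: 4000 });

// Vulnerable pattern: no rate limiting, so this findOne can be driven at
// arbitrary request rates to enumerate ids or load the database.
server.route({
  method: 'GET',
  path: '/users/{id}',
  handler: async (request, h) => {
    const user = await client.db('test').collection('users')
      .findOne({ _id: request.params.id });
    return user || { message: 'Not found' };
  }
});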

MongoDB-Specific Remediation in Hapi — concrete code fixes

To mitigate rate abuse involving MongoDB in Hapi, apply rate limiting at the route level and optimize database interactions. Use a token-bucket or sliding-window strategy backed by a shared store so that limits are enforced consistently across instances; a minimal token-bucket sketch appears below, followed by concrete route-level examples. For MongoDB operations, keep queries efficient, rely on indexes, and avoid unbounded reads and writes.
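For reference, a minimal in-memory token-bucket sketch (the capacity and refill rate are illustrative); the numbered examples that follow use a simpler fixed window:

// Token bucket: tokens refill at a steady rate and each request spends one.
const buckets = new Map();

function takeToken(key, capacity = 30, refillPerSec = 0.5) {
  const now = Date.now();
  let bucket = buckets.get(key);
  if (!bucket) {
    bucket = { tokens: capacity, last: now };
    buckets.set(key, bucket);
  }
  // Refill in proportion to elapsed time, capped at capacity.
  bucket.tokens = Math.min(capacity, bucket.tokens + ((now - bucket.last) / 1000) * refillPerSec);
  bucket.last = now;
  if (bucket.tokens < 1) {
    return false; // caller should reject with 429 Too Many Requests
  }
  bucket.tokens -= 1;
  return true;
}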

1. Rate limiting with a route pre-handler in Hapi

The following example implements a simple fixed-window limiter in memory and applies it as a route pre method. For distributed systems, replace the in-memory Map with Redis or another shared store (a sketch follows the example), or use a community plugin such as hapi-rate-limit.

const Hapi = require('@hapi/hapi');
const Boom = require('@hapi/boom');
const { ObjectId } = require('mongodb');

// In-memory fixed-window counters keyed by client IP + path.
// Note: entries for expired windows are overwritten but never evicted;
// a production limiter should use a TTL-based store instead.
const rateLimits = new Map();

function rateLimit(request, h) {
  const key = request.info.remoteAddress + ':' + request.path;
  const now = Date.now();
  const windowMs = 60 * 1000; // 1 minute
  const max = 30; // max requests per window

  let entry = rateLimits.get(key);
  if (!entry || now - entry.start >= windowMs) {
    // Start a fresh window for this key.
    entry = { count: 0, start: now };
    rateLimits.set(key, entry);
  }
  entry.count += 1;

  if (entry.count > max) {
    throw Boom.tooManyRequests('Too many requests, please try again later.');
  }
  return h.continue;
}

const server = Hapi.server({ port: 4000 });

server.route({
  method: 'GET',
  path: '/users/{id}',
  options: {
    pre: [{ method: rateLimit, assign: 'rateLimit' }]
  },
  handler: async (request, h) => {
    const { id } = request.params;
    // Reject malformed ids early; assumes _id values are stored as ObjectIds.
    if (!ObjectId.isValid(id)) {
      throw Boom.badRequest('Invalid id.');
    }
    const client = require('./db'); // shared, already-connected MongoDB client
    const user = await client.db('test').collection('users')
      .findOne({ _id: new ObjectId(id) });
    return user || { message: 'Not found' };
  }
});

server.start()
  .then(() => console.log('Server running at', server.info.uri))
  .catch((err) => {
    console.error(err);
    process.exit(1);
  });
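When the API runs on more than one instance, the counters must live in a shared store, as noted above. A minimal sketch using the node redis v4 client with an INCR-plus-EXPIRE fixed window (the key prefix, window length, and limit are illustrative; Boom comes from the import above):

const { createClient } = require('redis');

const redis = createClient({ url: 'redis://localhost:6379' });
// Call await redis.connect() once during startup, before handling traffic.

async function sharedRateLimit(request, h) {
  const key = 'rl:' + request.info.remoteAddress + ':' + request.path;
  // INCR is atomic, so concurrent instances cannot undercount.
  const count = await redis.incr(key);
  if (count === 1) {
    await redis.expire(key, 60); // start the 60-second window on first hit
  }
  if (count > 30) {
    throw Boom.tooManyRequests('Too many requests, please try again later.');
  }
  return h.continue;
}

This function can be swapped in for the in-memory rateLimit in each route's pre array without other changes.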

2. Secure MongoDB usage in Hapi handlers

Build query filters only from validated, type-checked input, and use projections to limit the fields returned. Ensure indexes exist on queried fields so lookups cannot degrade into collection scans, which an attacker can exploit to consume resources.

const { MongoClient, ObjectId } = require('mongodb');

const uri = 'mongodb://localhost:27017';
// A bounded pool and server-selection timeout keep a request flood from
// opening unbounded connections or hanging indefinitely.
const client = new MongoClient(uri, { maxPoolSize: 10, serverSelectionTimeoutMS: 5000 });

async function getUserById(id) {
  await client.connect(); // idempotent in driver v4+; ideally run once at startup
  const db = client.db('test');
  // Query the automatically indexed _id field and project only needed fields.
  return db.collection('users').findOne(
    { _id: new ObjectId(id) },
    { projection: { name: 1, email: 1, _id: 1 } }
  );
}

// Example route using the safe function
server.route({
  method: 'GET',
  path: '/profile/{id}',
  handler: async (request, h) => {
    const { id } = request.params;
    if (!ObjectId.isValid(id)) {
      throw Boom.badRequest('Invalid id.');
    }
    try {
      const user = await getUserById(id);
      return user || { message: 'Not found' };
    } catch (err) {
      request.log('error', err);
      throw Boom.internal('Internal error');
    }
  }
});
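The _id field is indexed automatically; lookups on any other field need an explicit index. A one-time setup sketch (the email field and uniqueness constraint are illustrative):

// Run once at deployment or startup, not per request.
async function ensureIndexes() {
  const users = client.db('test').collection('users');
  // Keeps email lookups at index speed and enforces uniqueness.
  await users.createIndex({ email: 1 }, { unique: true });
}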

3. Combine with route-level throttling and query constraints

For write-heavy endpoints, validate payloads and enforce size and complexity limits. Use MongoDB's insertMany with ordered: false carefully (failed documents are skipped instead of aborting the whole batch, so inspect write errors), and apply application-level caps on array sizes to prevent resource exhaustion. A joi-based version of the payload check is sketched after the example.

server.route({
  method: 'POST',
  path: '/messages',
  options: {
    pre: [{ method: rateLimit, assign: 'rateLimit' }]
  },
  handler: async (request, h) => {
    const { messages } = request.payload; // expect an array of strings
    if (!Array.isArray(messages) || messages.length === 0 || messages.length > 10) {
      throw Boom.badRequest('Messages array is required and limited to 10 items.');
    }
    const client = require('./db'); // shared, already-connected MongoDB client
    // ordered: false inserts the valid documents even if some fail;
    // handle write errors per document rather than assuming all succeeded.
    const result = await client.db('test').collection('messages').insertMany(
      messages.map((text) => ({ text: String(text), createdAt: new Date() })),
      { ordered: false }
    );
    return { insertedIds: result.insertedIds };
  }
});
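Hapi can enforce the same caps declaratively through joi in the route's validate option, rejecting oversized payloads before the handler runs. A sketch of the same route with validation (the per-item length cap is illustrative):

const Joi = require('joi');

server.route({
  method: 'POST',
  path: '/messages',
  options: {
    pre: [{ method: rateLimit, assign: 'rateLimit' }],
    validate: {
      payload: Joi.object({
        // Cap both the item count and the size of each item up front.
        messages: Joi.array().items(Joi.string().max(1000)).min(1).max(10).required()
      })
    }
  },
  handler: async (request, h) => {
    // Same insert logic as in the previous example; the manual array
    // check is no longer needed because validation has already run.
    return h.response().code(204);
  }
});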

These patterns shrink the attack surface for rate abuse by combining request throttling with efficient, bounded MongoDB operations. They align with common security-scan findings that flag missing rate limits and inefficient database queries as risk factors.

Frequently Asked Questions

Can rate limiting alone fully prevent MongoDB abuse in Hapi?
Rate limiting reduces abuse risk, but it should be paired with efficient queries, indexing, and payload validation. Use a shared rate-limit store in distributed setups and monitor database metrics; a small monitoring sketch follows.
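For the monitoring side, a minimal sketch that polls MongoDB's serverStatus command through the Node driver (which fields to alert on is a deployment decision):

const { MongoClient } = require('mongodb');

async function logDbMetrics(client) {
  // serverStatus reports connection, memory, and operation-counter metrics.
  const status = await client.db('admin').command({ serverStatus: 1 });
  console.log({
    connections: status.connections, // current vs. available connections
    opcounters: status.opcounters    // cumulative query/insert/update/delete counts
  });
}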
How do I integrate these patterns with my existing Hapi routes?
Wrap route handlers with a pre method like the rateLimit example above, centralize MongoDB client creation in a single module (a sketch follows), and make sure every query uses indexed fields and projections.
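The route examples above require('./db'); a minimal sketch of that shared-client module (the environment variable, URI fallback, and pool options are placeholders):

// db.js — export one shared MongoClient for the whole process.
const { MongoClient } = require('mongodb');

const client = new MongoClient(process.env.MONGODB_URI || 'mongodb://localhost:27017', {
  maxPoolSize: 10,
  serverSelectionTimeoutMS: 5000
});

// Driver v4+ connects lazily on first operation; connecting at startup
// surfaces configuration errors early instead of on the first request.
client.connect().catch((err) => {
  console.error('MongoDB connection failed', err);
  process.exit(1);
});

module.exports = client;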