
Cache Poisoning in Sails with Bearer Tokens

Cache Poisoning in Sails with Bearer Tokens — how this specific combination creates or exposes the vulnerability

Cache poisoning in Sails occurs when an attacker causes the application or an intermediate cache to store malicious content and serve it to other users. When Bearer tokens are involved—typically passed in the Authorization header—the risk pattern changes because tokens are often treated as opaque values by caches. If response caching is misconfigured and does not take the Authorization header into account, a cache may store a response that includes sensitive data or is keyed to one user’s token, then incorrectly serve that response to another user presenting a different token.

Consider a Sails API that caches user profile responses with a naive cache key such as the request URL without including the Authorization header. An authenticated request with a valid Bearer token might return user-specific data and be cached. If the cache does not differentiate by token, a subsequent request without a valid token (or with a different token) could receive the cached, user-specific response, leading to unauthorized data exposure. This violates the principle that authenticated responses should not be shared across identities.
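The flaw described above can be sketched without any framework at all. The following is a minimal illustration, with a `Map`-backed cache and an in-memory handler standing in for the real route and data store: because the cache key is the URL alone, the second caller receives the first caller's private response.

```javascript
// Minimal sketch of the flaw: a cache keyed only by URL serves one
// user's profile to another user with a different token.
const cache = new Map();

function handleProfileRequest(url, token) {
  const key = url; // BUG: the Authorization value is ignored in the cache key
  if (cache.has(key)) return cache.get(key);
  const response = { profile: `private data for ${token}` }; // stand-in for a DB lookup
  cache.set(key, response);
  return response;
}

const alice = handleProfileRequest('/api/profile', 'token-alice');
const mallory = handleProfileRequest('/api/profile', 'token-mallory');
// mallory receives Alice's cached profile — the cross-identity leak
console.log(mallory.profile); // "private data for token-alice"
```

Including the token scope in `key` (as shown in the remediation section below) is what breaks this sharing.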

SSRF and external cache interactions can amplify these issues. If your Sails app fetches external resources and caches the results keyed by user-supplied URLs, without validating those URLs or separating entries by identity, an attacker can induce the server to cache poisoned content and later serve it to other users. Header-based routing to different upstreams, combined with token handling, can likewise serve the wrong upstream's response if the cache key ignores security-sensitive headers.
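Both defenses mentioned above (URL validation and identity separation) can be combined in the key-derivation step. This is a hedged sketch, not a prescribed API: `ALLOWED_HOSTS`, `externalCacheKey`, and the key format are illustrative assumptions.

```javascript
// Before caching a fetched external resource, validate the user-supplied URL
// against an allowlist of hosts and scope the cache key by the requesting
// identity, so one user's poisoned entry cannot be served to another.
const ALLOWED_HOSTS = new Set(['api.example.com', 'cdn.example.com']);

function externalCacheKey(rawUrl, userId) {
  let parsed;
  try {
    parsed = new URL(rawUrl);
  } catch {
    throw new Error('invalid URL');
  }
  if (parsed.protocol !== 'https:' || !ALLOWED_HOSTS.has(parsed.hostname)) {
    throw new Error('URL not allowed'); // blocks SSRF-style fetches outright
  }
  // The key includes both the normalized URL and the requesting identity
  return `ext:${userId}:${parsed.origin}${parsed.pathname}`;
}

console.log(externalCacheKey('https://api.example.com/data?x=1', 'u42'));
// "ext:u42:https://api.example.com/data"
```

Note that the query string is deliberately dropped from the key here; whether that is correct depends on whether query parameters change the fetched content in your application.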

To detect these patterns, scans should correlate OpenAPI security scheme definitions (where Bearer authentication is declared) with runtime behavior. middleBrick checks whether responses to requests carrying an Authorization header are cached improperly and flags missing Vary: Authorization headers or unsafe caching directives. Separating cache entries by token scope and declining to cache sensitive responses are the essential mitigations.
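The kind of check described above can be expressed as a simple predicate over a route's response headers. This is an illustrative sketch of the logic, not middleBrick's actual implementation; `flagsMissingVary` and its parameters are hypothetical names, and header names are assumed to be normalized to lowercase.

```javascript
// Flag authenticated responses that are shared-cacheable but do not
// vary on the Authorization header.
function flagsMissingVary(headers, requestHadAuth) {
  if (!requestHadAuth) return false; // only authenticated responses are in scope
  const cc = (headers['cache-control'] || '').toLowerCase();
  if (cc.includes('no-store') || cc.includes('private')) return false; // not shared-cacheable
  const vary = (headers['vary'] || '').toLowerCase();
  return !vary.split(',').map(v => v.trim()).includes('authorization');
}

console.log(flagsMissingVary({ 'cache-control': 'max-age=60' }, true));  // true  → flagged
console.log(flagsMissingVary({ 'cache-control': 'max-age=60', 'vary': 'Authorization' }, true)); // false
```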

Bearer Tokens-Specific Remediation in Sails — concrete code fixes

Remediation focuses on ensuring cache keys incorporate the Authorization header when it is used, and avoiding caching of sensitive authenticated responses. Below are concrete Sails/Express-style examples for handling Bearer tokens safely.

1) Ensure Vary: Authorization is set for authenticated responses so caches store separate entries per token scope:

module.exports.routes = {
  'GET /api/profile': async (req, res) => {
    const authHeader = req.headers.authorization;
    if (!authHeader || !authHeader.startsWith('Bearer ')) {
      return res.unauthorized();
    }
    const token = authHeader.slice(7);
    // Tell downstream caches to key this response by Authorization
    res.set('Vary', 'Authorization');
    // Validate the token and load the user; verifyToken is a placeholder
    // for your token-validation logic
    const user = await verifyToken(token);
    return res.ok({ id: user.id, name: user.name });
  }
};

To apply this globally instead of per route, use a policy that sets the Vary header before the action runs:

// api/policies/vary-on-auth.js — Sails policy
module.exports = async function (req, res, proceed) {
  const auth = req.headers.authorization;
  if (auth && auth.startsWith('Bearer ')) {
    res.set('Vary', 'Authorization');
  }
  return proceed();
};

2) Use a cache key that includes the token scope (e.g., user ID or a hash of the token) rather than the raw token, and avoid caching sensitive payloads:

const crypto = require('crypto');

function cacheKeyForUser(req) {
  const auth = req.headers.authorization;
  if (auth && auth.startsWith('Bearer ')) {
    const token = auth.slice(7);
    // Derive a stable, non-sensitive identifier from the token. Note that
    // hashing the raw token scopes cache entries per token; prefer the token's
    // subject (user ID) when you can decode it, so entries are shared across
    // a user's concurrent tokens.
    const userIdHash = crypto.createHash('sha256').update(token).digest('hex').slice(0, 16);
    return `user:${userIdHash}`;
  }
  return 'public:profile';
}

module.exports.routes = {
  'GET /api/data': async (req, res) => {
    // `Cache` and `fetchDataForUser` are placeholders for your cache client
    // (e.g., a Redis wrapper) and data-access logic
    const key = cacheKeyForUser(req);
    const cached = await Cache.get(key);
    if (cached) {
      return res.ok(JSON.parse(cached));
    }
    const data = await fetchDataForUser(req);
    // Only cache non-sensitive, user-specific data with appropriate TTL
    await Cache.set(key, JSON.stringify(data), { ttl: 60 });
    return res.ok(data);
  }
};

3) Disable caching for responses that contain sensitive Authorization-derived data by setting no-store directives:

module.exports.routes = {
  'GET /api/me': (req, res) => {
    const auth = req.headers.authorization;
    if (!auth || !auth.startsWith('Bearer ')) {
      return res.unauthorized();
    }
    // Sensitive endpoint: prevent any caching
    res.set('Cache-Control', 'no-store, no-cache, must-revalidate, private');
    res.set('Pragma', 'no-cache');
    res.set('Expires', '0');
    return res.ok({ me: 'user-data' });
  }
};

These patterns ensure that Bearer tokens influence cache behavior explicitly and that sensitive authenticated responses are not inadvertently stored or shared. middleBrick can validate these safeguards by checking for Vary: Authorization on authenticated routes and flagging responses where authentication influences caching without proper separation.

Frequently Asked Questions

What does middleBrick check related to caching and Bearer tokens?
middleBrick checks whether authenticated responses include Vary: Authorization and flags scenarios where cache keys ignore the Authorization header, which can lead to cache poisoning across users.
Does middleBrick test for cache poisoning in authenticated flows?
Yes, as part of its 12 security checks, middleBrick tests the unauthenticated attack surface and reviews caching behavior in relation to headers like Authorization to identify improper cache segregation.