
Cache Poisoning in Restify with Bearer Tokens

How This Specific Combination Creates or Exposes the Vulnerability

Cache poisoning occurs when an attacker tricks a caching layer into storing a response that is then served to other users. In Restify, when endpoints that carry per-user authorization data (such as Bearer Tokens in the Authorization header) are cached without proper key normalization, the cached response can be reused across different users or roles. This typically happens when the cache key does not include, or otherwise differentiate by, the Authorization header, causing one user's protected resource to be served to another user who should not have access.

For example, consider a Restify server that caches user profile responses. If the cache key is based only on the request path (e.g., /profile) and does not include the value of the Authorization header, requests from different users with different Bearer Tokens will map to the same cache entry. User A’s request with Authorization: Bearer token_A may be cached, and subsequently, User B’s request with Authorization: Bearer token_B receives the same cached response containing User A’s data. This violates access controls and can lead to unauthorized information disclosure, a common vector in BOLA/IDOR scenarios.
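To make this failure mode concrete, here is a minimal sketch in plain Node (no Restify, and a hypothetical in-memory Map standing in for the caching layer) of a cache keyed only on the request path. The `fetchProfile` backend is an assumption for illustration; the point is that the cache lookup ignores the token entirely:

```javascript
// Flawed setup: an in-memory cache keyed ONLY on the request path.
const cache = new Map();

// Hypothetical backend: the response depends on the Bearer token presented.
function fetchProfile(authorization) {
  return { user: authorization === 'Bearer token_A' ? 'alice' : 'bob' };
}

function handleRequest(path, authorization) {
  // The Authorization header plays no role in the cache key.
  if (cache.has(path)) return cache.get(path);
  const response = fetchProfile(authorization);
  cache.set(path, response);
  return response;
}

// User A's request populates the shared cache entry...
const first = handleRequest('/profile', 'Bearer token_A');
// ...and User B's request is served User A's data from that same entry.
const second = handleRequest('/profile', 'Bearer token_B');
console.log(first.user, second.user); // both 'alice' -- cross-user disclosure
```

The same shape appears in real deployments whenever a reverse proxy or CDN in front of Restify keys responses on the URL alone.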

Additionally, if the upstream service varies response content based on token scope or claims (e.g., roles embedded in the token), a single cached response might inadvertently expose administrative endpoints or sensitive fields to non-privileged users. Because the cache operates at the HTTP layer, it does not understand the semantic meaning of Bearer Tokens; it only sees the raw request. Unless the Authorization context is explicitly factored into the cache key, or authenticated responses are excluded from caching entirely, the risk of cache poisoning remains. This issue is especially serious when responses include sensitive data or when backends skip re-evaluating authorization on each request because a cache sits in front of them.

In practice, this vulnerability aligns with the OWASP API Top 10 category of Broken Object Level Authorization (BOLA) and can be identified by scanning tools like middleBrick, which tests unauthenticated attack surfaces and maps findings to compliance frameworks. A scan may reveal that an endpoint returns different status codes or response bodies based on the Authorization header but still uses a shared cache key, indicating a potential misconfiguration. Proper remediation requires ensuring that any caching mechanism respects authorization context and does not allow cross-user response reuse.

Bearer Token-Specific Remediation in Restify: Concrete Code Fixes

To mitigate cache poisoning when using Bearer Tokens in Restify, you must ensure that the cache key incorporates the Authorization header or that authenticated responses are never cached. The following approaches demonstrate how to configure Restify to handle Bearer Tokens safely.

Approach 1: Differentiate the cache key by Authorization header

If you control the caching layer (for example, an HTTP proxy or a custom cache middleware), ensure the cache key differentiates requests by Authorization context. In a Restify middleware, you can derive a per-user cache key by hashing the Authorization header; hashing keeps the raw token out of logs and cache storage:

const crypto = require('crypto');
const restify = require('restify');
const server = restify.createServer();

server.use((req, res, next) => {
  // Derive a cache key that differentiates requests by Authorization context
  const baseKey = req.url;
  if (req.headers.authorization) {
    // Hash the token so the raw credential never appears in logs or cache storage;
    // keep enough of the digest (16 hex chars here) to avoid cross-user collisions
    const tokenHash = crypto
      .createHash('sha256')
      .update(req.headers.authorization)
      .digest('hex');
    req.cacheKey = `${baseKey}:user=${tokenHash.substring(0, 16)}`;
  } else {
    req.cacheKey = baseKey;
  }
  return next();
});

// Example route that should not cache per-user data
server.get('/profile', (req, res, next) => {
  // Your logic here — do not rely on shared cache for user-specific data
  res.send({ user: 'profile-data' });
  return next();
});

server.listen(8080, () => {
  console.log('Server listening on port 8080');
});

This ensures that even if two requests share the same path, their cache keys differ when Bearer Tokens are present, preventing cross-user response reuse.
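Note that the middleware only computes `req.cacheKey`; a cache layer still has to consume it. As a sketch of that consuming side (the `getOrCompute` helper and the example keys are hypothetical, using a plain in-memory Map):

```javascript
// Hypothetical cache store consuming per-user cache keys of the form
// produced by the middleware above.
const store = new Map();

function getOrCompute(cacheKey, compute) {
  if (store.has(cacheKey)) return store.get(cacheKey);
  const value = compute();
  store.set(cacheKey, value);
  return value;
}

// Two requests to the same path with different token hashes hit different entries.
const a = getOrCompute('/profile:user=ab12cd34ef56ab78', () => ({ user: 'alice' }));
const b = getOrCompute('/profile:user=1234abcd5678efab', () => ({ user: 'bob' }));
console.log(a.user, b.user); // 'alice' 'bob' -- no cross-user reuse
```

If the cache layer is an external proxy rather than in-process code, the equivalent fix is configuring that proxy's cache key to include the (hashed) Authorization header.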

Approach 2: Disable caching for authenticated endpoints

For endpoints that return user-specific data, explicitly instruct caches not to store responses. You can set headers that downstream caches (like CDNs or reverse proxies) respect:

const restify = require('restify');
const server = restify.createServer();

server.use(restify.plugins.conditionalRequest());

server.get('/admin', (req, res, next) => {
  // Disable caching for authenticated requests
  if (req.headers.authorization) {
    res.setHeader('Cache-Control', 'no-store, no-cache, must-revalidate, proxy-revalidate');
    res.setHeader('Pragma', 'no-cache');
    res.setHeader('Expires', '0');
  }
  res.send({ admin: 'restricted-data' });
  return next();
});

server.listen(8080, () => {
  console.log('Admin endpoint configured to avoid caching with Bearer Tokens');
});

By setting these headers, you reduce the likelihood that a cached response will be served to another user who happens to present a different Bearer Token.
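The header decision above can also be factored into a small, testable helper so the same policy applies uniformly across routes. This is a sketch under assumptions: `cacheHeadersFor` is a hypothetical function, and the `max-age=60` for unauthenticated responses is an illustrative choice, not a recommendation from the Restify docs:

```javascript
// Hypothetical helper: choose cache headers based on whether the
// request carries an Authorization header.
function cacheHeadersFor(headers) {
  if (headers.authorization) {
    // Authenticated responses must never be stored by shared caches.
    return {
      'Cache-Control': 'no-store, no-cache, must-revalidate, proxy-revalidate',
      'Pragma': 'no-cache',
      'Expires': '0',
    };
  }
  // Unauthenticated, non-sensitive responses may be cached briefly (assumed policy).
  return { 'Cache-Control': 'public, max-age=60' };
}

const authed = cacheHeadersFor({ authorization: 'Bearer token_A' });
const anon = cacheHeadersFor({});
console.log(authed['Cache-Control']); // no-store, no-cache, must-revalidate, proxy-revalidate
console.log(anon['Cache-Control']);   // public, max-age=60
```

In a Restify handler you would apply the result with `res.setHeader(name, value)` for each entry, as in the `/admin` route above.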

Approach 3: Scope caching by token claims

If you use JWTs and your cache layer can parse them, include specific claims (such as sub or scope) in the cache key. This allows caching at the user level where appropriate, while still preventing cross-user pollution:

const restify = require('restify');
const jwt = require('jsonwebtoken');

const server = restify.createServer();

server.use((req, res, next) => {
  if (req.headers.authorization && req.headers.authorization.startsWith('Bearer ')) {
    const token = req.headers.authorization.substring(7);
    try {
      // Verify the signature; never build cache keys from unverified claims
      const decoded = jwt.verify(token, process.env.JWT_SECRET);
      // The scope claim may be a space-delimited string or an array, depending on the issuer
      const scope = Array.isArray(decoded.scope)
        ? decoded.scope.join(' ')
        : (decoded.scope || '');
      req.cacheKey = `${req.url}:user=${decoded.sub}:scope=${scope}`;
    } catch (err) {
      // Invalid token: fall back to the unauthenticated key; the auth layer should reject the request
      req.cacheKey = req.url;
    }
  } else {
    req.cacheKey = req.url;
  }
  return next();
});

This method is more granular and can be safe if tokens are short-lived and validation is performed securely. However, it still requires that the cache never serves a response generated under one Authorization context to another.
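Because the `scope` claim's shape varies by issuer (a space-delimited string per the OAuth conventions, or an array with some providers), the key-building step is worth isolating and testing on its own. A sketch with a hypothetical `claimsCacheKey` helper that accepts already-verified claims:

```javascript
// Hypothetical helper: build a cache key from already-verified JWT claims.
// Normalizes `scope` whether it arrives as a space-delimited string or an array.
function claimsCacheKey(url, claims) {
  const scope = Array.isArray(claims.scope)
    ? claims.scope.join(' ')
    : (claims.scope || '');
  return `${url}:user=${claims.sub}:scope=${scope}`;
}

const fromString = claimsCacheKey('/profile', { sub: 'user-1', scope: 'read write' });
const fromArray = claimsCacheKey('/profile', { sub: 'user-1', scope: ['read', 'write'] });
console.log(fromString); // /profile:user=user-1:scope=read write
console.log(fromArray);  // identical key for the equivalent array form
```

Normalizing both forms to one key prevents the same user from fragmenting into two cache entries depending on which identity provider issued the token.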

Regardless of the approach, always validate that your caching strategy does not inadvertently expose data across users. middleBrick can help identify missing header normalization in cache keys by running its unauthenticated scan and reviewing the findings related to data exposure and BOLA. For production systems, combine these code-level fixes with regular scans and compliance mapping to OWASP API Top 10 and relevant frameworks.

Frequently Asked Questions

How can I test if my Restify endpoint is vulnerable to cache poisoning with Bearer Tokens?
You can test by sending requests with different Bearer Tokens and observing whether responses are shared. Use a tool to inspect cache behavior or run a scan with middleBrick, which checks for missing authorization normalization in cache keys and maps findings to OWASP API Top 10.
Does middleBrick fix cache poisoning vulnerabilities in Restify?
middleBrick detects and reports cache poisoning risks, including issues with Bearer Tokens in Restify, providing remediation guidance. It does not automatically fix or patch the endpoint; developers must apply the recommended code changes.