Memory Leak in Express with Bearer Tokens
How this specific combination creates or exposes the vulnerability
A memory leak in an Express application that uses Bearer tokens typically arises when token metadata (such as decoded payloads, cached validation results, or per-request objects) is retained unintentionally in server-side structures like closures, global maps, or event listeners. Because Bearer tokens are often attached to each request (e.g., via Authorization: Bearer <token>), objects tied to the token or request can accumulate if references are held beyond the request lifecycle. For example, storing decoded token claims in a module-level cache keyed by token string without TTL or cleanup can cause unbounded growth in memory usage over time, especially under sustained traffic. This pattern becomes more likely when token parsing logic is repeated across middleware and route handlers, creating multiple references to similar data.
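A minimal sketch of the leaky pattern described above (the `decodeToken` helper is an illustrative stand-in, not a real library call): a module-level `Map` keyed by the raw token string gains one entry per unique token and is never emptied, so memory grows with traffic.

```javascript
// Anti-pattern sketch: module-level cache keyed by the raw token string.
// Every unique Bearer token adds an entry that is never evicted,
// so memory grows without bound under sustained traffic.
const tokenCache = new Map();

// Illustrative stand-in for real JWT decoding (e.g., jsonwebtoken's decode)
function decodeToken(token) {
  return { sub: `user-${token}`, scope: 'read' };
}

function leakyLookup(token) {
  if (!tokenCache.has(token)) {
    tokenCache.set(token, decodeToken(token)); // retained forever
  }
  return tokenCache.get(token);
}

// Simulate 10,000 requests, each carrying a fresh token:
for (let i = 0; i < 10000; i++) {
  leakyLookup(`token-${i}`);
}
console.log(tokenCache.size); // prints 10000
```

Under real load with rotating or per-session tokens, this cache never stabilizes, which is exactly the unbounded-growth behavior described above.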
In an unauthenticated scan, middleBrick tests the attack surface without credentials; however, when Bearer tokens are required for certain endpoints, improper session- or token-state handling can still be observed indirectly through increased memory pressure, slower response times, or eventual process instability. The leak may not directly expose token values, but it can degrade service availability and amplify the impact of token-related logic. Because middleBrick runs 12 security checks in parallel, including Authentication and Input Validation, it can surface related anomalies in how token handling interacts with request parsing and resource usage.
Specific real-world patterns that contribute to leaks include attaching parsed token data to req and retaining references in external stores (e.g., a JavaScript Map), failing to remove listeners or callbacks tied to token events, and neglecting to release buffers or large objects created during token verification. These issues align with broader classes such as Improper Resource Management and can intersect with findings from other checks like BFLA (Broken Function Level Authorization) or Data Exposure when token context is inadvertently retained.
Bearer Token-Specific Remediation in Express — concrete code fixes
Addressing memory leaks related to Bearer tokens in Express involves careful management of request-scoped data, avoiding long-lived references to token payloads, and ensuring cleanup in middleware and route handlers. Below are concrete, realistic examples.
Example 1: Avoid caching token payloads in a module-level Map
Instead of storing decoded tokens indefinitely, use short-lived caches with automatic eviction or avoid caching entirely.
// Unsafe: unbounded Map keyed by token can grow indefinitely
// const tokenCache = new Map();

// Safer: bounded cache with TTL-based expiry and size eviction
const cache = new Map();
const CACHE_TTL_MS = 60_000;  // entries expire after 60 seconds
const CACHE_MAX_SIZE = 1000;  // hard cap on entry count

function getCachedClaims(key) {
  const entry = cache.get(key);
  if (!entry) return null;
  if (Date.now() - entry.createdAt > CACHE_TTL_MS) {
    cache.delete(key); // expired: evict and report a miss
    return null;
  }
  return entry.claims;
}

function setCachedClaims(key, claims) {
  // Evict the oldest entry when the size threshold is reached
  // (Map preserves insertion order, so the first key is the oldest)
  if (cache.size >= CACHE_MAX_SIZE) {
    const firstKey = cache.keys().next().value;
    cache.delete(firstKey);
  }
  cache.set(key, { claims, createdAt: Date.now() });
}
// Express route using a Bearer token without leaking
app.get('/profile', (req, res) => {
  const auth = req.headers.authorization || '';
  const token = auth.startsWith('Bearer ') ? auth.slice(7) : null;
  if (!token) return res.status(401).json({ error: 'missing_token' });
  // Decode and use the token without persisting it globally
  const claims = decodeToken(token); // e.g., using jsonwebtoken
  // Use claims directly; do not attach to a long-lived cache unless necessary
  res.json({ user: claims.sub, scope: claims.scope });
});
Example 2: Clean up references in middleware and avoid extending request lifetime
Ensure that per-request objects are not retained after the response ends. Do not attach large buffers or token metadata to req beyond what is strictly needed, and prefer local variables in route handlers.
// Middleware that parses the Bearer token but avoids persistent attachment
function parseBearerToken(req, res, next) {
  const auth = req.headers.authorization || '';
  const token = auth.startsWith('Bearer ') ? auth.slice(7) : null;
  if (token) {
    // Parse and validate the token, but do not store the full decoded payload on req
    const isValid = validateTokenFormat(token); // lightweight checks
    req._tokenValid = isValid; // minimal flag, not full claims
  } else {
    req._tokenValid = false;
  }
  next();
}
app.use(parseBearerToken);
app.get('/data', (req, res) => {
  if (!req._tokenValid) return res.status(401).json({ error: 'invalid_token' });
  // Local variable usage; no long-lived references
  const auth = req.headers.authorization || '';
  const token = auth.startsWith('Bearer ') ? auth.slice(7) : '';
  const payload = decodeToken(token);
  // Process payload without retaining references on req or global structures
  res.json({ data: 'secure-data' });
  // payload is scoped to this handler and becomes unreachable after the
  // response, so it is garbage-collected; no explicit clearing is needed
});
Example 3: Remove event listeners or callbacks tied to token operations
If your application registers listeners based on token events, ensure they are removed when no longer needed to prevent accumulation.
// Example: avoid adding listeners per request
// tokenEvents.on(`token:${token}`, handler); // Do not do this

const { EventEmitter } = require('events');
const tokenEvents = new EventEmitter(); // single shared emitter

// Preferred: use a single shared handler and route by context
function handleTokenEvent(event, token) {
  // process event
}

// If dynamic listeners are necessary, ensure the previous one is removed
let currentListener = null;
function setupTokenListener(token) {
  if (currentListener) {
    tokenEvents.off('tokenEvent', currentListener);
  }
  currentListener = (event) => handleTokenEvent(event, token);
  tokenEvents.on('tokenEvent', currentListener);
}
General practices
- Prefer stateless validation: verify tokens on each request without caching decoded payloads unless necessary, and use bounded caches with TTL if caching is required.
- Limit what you attach to req; keep request-scoped metadata small and short-lived.
- Profile memory usage under load to detect growth patterns; tools like heap snapshots can help identify retained objects related to token handling.