Cache Poisoning in FeathersJS with API Keys
Cache Poisoning in FeathersJS with API Keys — how this specific combination creates or exposes the vulnerability
Cache poisoning in a FeathersJS service occurs when a response is cached under a key that lacks per-user or per-role scoping, so one user can be served another user's data. When API keys are used for identification but the caching layer's key does not incorporate the key's associated tenant or user context, a key intended to scope access is effectively bypassed for cached responses.
Consider a FeathersJS REST API that uses an API key in an Authorization: ApiKey <key> header to identify a tenant or client. If the caching strategy uses only the request path and query string as the cache key and does not include the resolved tenant or API key identifier, two different keys requesting /v1/messages can receive each other's cached responses. This mis-scoping effectively becomes an Insecure Direct Object Reference (IDOR) vector via cache storage, because the cache key lacks the authorization context provided by the API key.
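As a minimal illustration (hypothetical code, not taken from any real service), a cache keyed only by the request path serves whichever tenant's response was stored first:

```javascript
// Hypothetical in-memory model of a path-only response cache.
// `db` stands in for per-tenant data; all names are illustrative.
const cache = new Map();
const db = {
  'tenant-a': ['a-private-message'],
  'tenant-b': ['b-private-message']
};

function handleFind(tenantId, path) {
  // BUG: the key ignores the tenant resolved from the API key,
  // so the first tenant to warm the cache fills it for everyone.
  if (cache.has(path)) return cache.get(path);
  const result = db[tenantId];
  cache.set(path, result);
  return result;
}
```

If tenant A requests /v1/messages first, tenant B's subsequent request for the same path returns tenant A's cached data.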
Additionally, if the API key is accepted as a query parameter (e.g., ?api_key=xxx) and that parameter is included verbatim in the cache key, an attacker who can trick a victim into making a request with their key may indirectly poison the cache for shared keys that do not enforce strict key-to-tenant mapping. Even when API keys are passed in headers, if the application caches per path only and then serves that cached entry to any request sharing that path—regardless of the presenting key—the confidentiality controls enforced by the key mechanism are undermined.
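One defensive option for the query-parameter case, sketched below with a hypothetical helper name, is to reject API keys supplied via the query string outright so they can never leak into a query-derived cache key:

```javascript
// Hypothetical guard: refuse query-string API keys before any cache key
// is derived from the query. The parameter name `api_key` matches the
// example above; adjust to whatever names your API accepts.
function rejectQueryApiKeys(query = {}) {
  if ('api_key' in query || 'apiKey' in query) {
    throw new Error('API keys must be sent in the Authorization header');
  }
  return query;
}
```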
Real-world patterns that exacerbate this include:
- Using a global cache namespace without tenant or key partitioning.
- Caching responses that contain user-specific or role-specific data based on unvalidated query parameters.
- Relying on framework-level caching hooks that do not automatically incorporate authorization-derived context such as resolved API key metadata.
These issues map to OWASP API Top 10 controls around authentication and authorization scoping. middleBrick’s LLM/AI Security checks and 12 parallel scans (Authentication, BOLA/IDOR, Property Authorization, Input Validation, Rate Limiting, Data Exposure, Encryption, SSRF, Inventory Management, Unsafe Consumption, and BFLA/Privilege Escalation) can surface cache poisoning risks by correlating runtime behavior with OpenAPI/Swagger definitions, including $ref resolution, to detect missing authorization context in caching-related paths.
API Key-Specific Remediation in FeathersJS — concrete code fixes
Remediation focuses on ensuring the cache key incorporates the API key's resolved tenant or subject, and that API key validation is performed before any cache lookup or write. Below are concrete, syntactically correct FeathersJS snippets that demonstrate secure handling.
1. Validate and resolve the API key before caching
Use a hook that resolves the API key to a tenant or user ID and attaches it to the context. This resolved value should be part of any cache key.
// src/hooks/resolve-api-key.js
const { NotAuthenticated } = require('@feathersjs/errors');

module.exports = function () {
  return async context => {
    // Clients send "Authorization: ApiKey <key>"; header names arrive lowercased
    const headers = context.params.headers || {};
    const match = (headers.authorization || '').match(/^ApiKey\s+(\S+)$/i);
    if (!match) {
      throw new NotAuthenticated('Missing or malformed API key');
    }
    const apiKey = match[1];
    // Replace with your key lookup; ensure each key is bound to a tenantId
    const keyRecord = await context.app.service('api-keys').get(apiKey);
    if (!keyRecord || keyRecord.revoked) {
      throw new NotAuthenticated('Invalid API key');
    }
    // Attach tenant-aware identifier to context for cache key construction
    context.params.cacheContext = {
      tenantId: keyRecord.tenantId,
      subject: keyRecord.subject || 'service'
    };
    return context;
  };
};
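Independent of the hook wiring, the Authorization: ApiKey <key> scheme from the earlier example can be parsed with a small standalone helper (a hypothetical function, shown here so the parsing can be exercised outside Feathers):

```javascript
// Hypothetical helper: extract the key from an "Authorization: ApiKey <key>"
// header. HTTP frameworks typically lowercase header names.
function parseApiKeyHeader(headers = {}) {
  const match = (headers.authorization || '').match(/^ApiKey\s+(\S+)$/i);
  return match ? match[1] : null;
}
```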
2. Build a cache key that includes tenant and subject
In your service or transport adapter, include resolved tenant and subject in the cache key to prevent cross-tenant leakage.
// src/hooks/cache-key.js
module.exports = function () {
  return async context => {
    const { cacheContext } = context.params;
    if (!cacheContext) {
      // resolve-api-key must run first; skip caching rather than build an unscoped key
      return context;
    }
    const path = context.path || '';
    const query = context.params.query || {};
    // Deterministic query normalization to avoid cache fragmentation by ordering
    const sortedQuery = Object.keys(query)
      .sort()
      .map(k => `${encodeURIComponent(k)}=${encodeURIComponent(query[k])}`)
      .join('&');
    context.params.cacheKey = `cache:${cacheContext.tenantId}:${cacheContext.subject}:${path}:${sortedQuery}`;
    return context;
  };
};
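The normalization step can be checked in isolation; this hypothetical extract mirrors the hook's sorting logic to show that parameter order no longer fragments the cache:

```javascript
// Hypothetical extract of the query-normalization logic in the hook above.
function normalizeQuery(query = {}) {
  return Object.keys(query)
    .sort()
    .map(k => `${encodeURIComponent(k)}=${encodeURIComponent(query[k])}`)
    .join('&');
}
```

Both `normalizeQuery({ b: 1, a: 2 })` and `normalizeQuery({ a: 2, b: 1 })` produce `a=2&b=1`, so requests that differ only in parameter order hit the same cache entry.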
3. Use the cache key in your caching adapter
Ensure reads/writes to the cache use the tenant-aware key. This example uses a generic getFromCache / setInCache abstraction; adapt to your provider (Redis, Memcached, etc.).
// src/hooks/apply-cache.js
module.exports = function () {
  return async context => {
    // The hook method lives on context.method, not context.params
    if (context.method !== 'find' && context.method !== 'get') {
      return context;
    }
    const { cacheKey } = context.params;
    if (!cacheKey) {
      return context;
    }
    const cached = await getFromCache(cacheKey);
    if (cached) {
      // Setting context.result in a before hook skips the database call
      context.result = cached;
      context.params.isCached = true;
    }
    return context;
  };

  async function getFromCache(key) {
    // Implement with your cache client; example placeholder
    // return redis.get(key);
    return null;
  }
};
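For local development or tests, the getFromCache / setInCache placeholders can be backed by a simple in-memory store with a TTL (a sketch assuming a single process; use Redis or Memcached for anything shared across instances):

```javascript
// Hypothetical single-process stand-in for a shared cache with TTL.
const store = new Map();

async function setInCache(key, value, ttlMs = 300000) {
  store.set(key, { value, expires: Date.now() + ttlMs });
}

async function getFromCache(key) {
  const entry = store.get(key);
  if (!entry) return null;
  if (Date.now() > entry.expires) {
    // Lazily evict expired entries on read
    store.delete(key);
    return null;
  }
  return entry.value;
}
```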
4. Example service registration with hooks
Wire the hooks into a Feathers service so they execute in order before and after the provider.
// src/services/messages/messages.service.js
// createService is your database adapter factory (e.g. feathers-memory, feathers-knex)
const createService = require('feathers-memory');
const resolveApiKey = require('../../hooks/resolve-api-key');
const buildCacheKey = require('../../hooks/cache-key');
const applyCache = require('../../hooks/apply-cache');
const cacheAfter = require('../../hooks/cache-after');

module.exports = function (app) {
  const options = {
    name: 'messages',
    paginate: { default: 10, max: 25 }
  };
  // Initialize service with options
  app.use('/messages', createService(options));
  const messagesService = app.service('messages');
  // Before hooks: resolve the key, build the tenant-aware cache key, then check the cache
  messagesService.hooks({
    before: {
      all: [resolveApiKey(), buildCacheKey(), applyCache()]
    },
    after: {
      all: [cacheAfter()]
    }
  });
};
5. Cache after hook to write responses into cache
Ensure successful responses are cached using the tenant-aware key, and avoid caching error responses that may expose information.
// src/hooks/cache-after.js
module.exports = function () {
  return async context => {
    // Skip writes when the result was served from cache or is empty
    if (context.params.isCached || context.result == null) {
      return context;
    }
    // Only cache safe read methods; after hooks run only on success,
    // so error responses are never written to the cache
    if (context.method !== 'find' && context.method !== 'get') {
      return context;
    }
    const { cacheKey } = context.params;
    if (!cacheKey) {
      return context;
    }
    await setInCache(cacheKey, context.result);
    return context;
  };

  async function setInCache(key, value) {
    // Implement with your cache client; example placeholder
    // await redis.set(key, JSON.stringify(value), 'EX', 300);
  }
};
These examples ensure API key context is incorporated into cache decisions, reducing the risk of cross-user cache poisoning. middleBrick’s GitHub Action can be added to CI/CD pipelines to fail builds if risk scores drop, while the CLI and Web Dashboard help track findings and remediation over time.