Severity: HIGH

JWT Misconfiguration in FeathersJS with DynamoDB

JWT Misconfiguration in FeathersJS with DynamoDB — how this specific combination creates or exposes the vulnerability

FeathersJS commonly implements JWT-based authentication via @feathersjs/authentication (and, in older versions, @feathersjs/authentication-jwt). When Amazon DynamoDB serves as the user record store, misconfigurations typically arise in how identity is resolved and how tokens are validated, leading to authentication bypass or insecure direct object reference (IDOR) issues that an unauthenticated scan can surface.

One common pattern is storing user records in DynamoDB under a composite key such as PK = USER#&lt;id&gt; and SK = METADATA. If the FeathersJS JWT configuration does not correctly map the JWT subject (sub) claim to the DynamoDB key, an attacker can supply an arbitrary identifier and the service may return another user's record due to a weak identity resolution function. For example, a naive get implementation might do:

const { user_id } = req.params; // attacker-controlled route parameter, NOT the verified JWT 'sub'
const params = {
  TableName: process.env.USERS_TABLE,
  Key: {
    PK: `USER#${user_id}`,
    SK: 'METADATA'
  }
};
// AWS SDK v2 DocumentClient: keys use native JavaScript types, not { S: ... } wrappers
const result = await dynamodb.get(params).promise();

If user_id is directly interpolated without verifying it matches the authenticated subject or enforcing additional authorization checks, this becomes a BOLA (Broken Object Level Authorization) vector. An unauthenticated scan by middleBrick can detect that the endpoint does not validate the token’s sub against the requested resource identifier, enabling horizontal IDOR across user records stored in DynamoDB.
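The BOLA gap described above is closed by binding the requested identifier to the verified token subject before any DynamoDB read happens. A minimal sketch of that check — the assertOwnResource helper name is illustrative, not a FeathersJS API:

```javascript
// Minimal ownership check: a caller may only address the record keyed by the
// subject in their own verified JWT. In a Feathers service this logic would
// live in a before hook, comparing the route id against the JWT subject.
function assertOwnResource(requestedId, tokenSub) {
  if (typeof requestedId !== 'string' || typeof tokenSub !== 'string') {
    throw new TypeError('Both identifiers must be strings');
  }
  if (requestedId !== tokenSub) {
    // Horizontal IDOR attempt: requested id differs from the token subject
    throw new Error('Forbidden: id does not match authenticated subject');
  }
  return requestedId; // now safe to embed in a key such as USER#<id>
}
```

The important property is that the DynamoDB key is derived only from the value this check returns, never directly from request parameters.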

Another misconfiguration involves token validation settings. If the accepted signing algorithms are not explicitly pinned, some JWT libraries will accept tokens declaring the 'none' algorithm, i.e. unsigned tokens. In DynamoDB, user metadata might include fields such as role or permissions. If the application trusts the JWT payload without re-checking DynamoDB for the latest role or consent flags, privilege escalation can occur: a token issued with role: user could be altered (when 'none' is accepted) to role: admin, and the app may skip reloading permissions from DynamoDB, granting elevated access.
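The 'none'-algorithm attack can be rejected early by checking the token header against a pinned allow-list before any further processing. A Node-stdlib-only sketch (the function name and allow-list are assumptions for illustration; in production you would also pass the same algorithm list to your JWT library's verify call, which performs the actual signature check):

```javascript
// Reject any JWT whose header declares an algorithm outside the allow-list.
// This complements full signature verification; it does not replace it.
const ALLOWED_ALGS = new Set(['HS256']); // pin exactly what the server signs with

function rejectUnexpectedAlg(token) {
  const headerPart = token.split('.')[0];
  // 'base64url' decoding requires Node 16+
  const header = JSON.parse(Buffer.from(headerPart, 'base64url').toString('utf8'));
  if (!ALLOWED_ALGS.has(header.alg)) {
    throw new Error(`Disallowed JWT algorithm: ${header.alg}`);
  }
  return header.alg;
}
```

A forged token with {"alg":"none"} in its header fails this check immediately, regardless of what claims its payload carries.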

Additionally, if the FeathersJS JWT service does not set proper aud (audience) or iss (issuer) validation, tokens issued for other services or environments may be accepted. middleBrick’s LLM/AI Security checks can detect missing audience validation, which when paired with DynamoDB user data, may expose account information across systems. Input validation checks will also highlight that untrusted claims (such as email or custom:scope) are used to directly construct DynamoDB keys or queries without canonicalization, enabling injection or malformed key errors that lead to information leakage.

The combination of DynamoDB’s key-based model, permissive JWT validation, and missing re-authorization on each request creates a scenario where unauthenticated or low-privilege actors can enumerate or manipulate user identifiers. middleBrick’s BOLA/IDOR and Authentication checks are designed to surface these gaps by probing endpoints without credentials and comparing token claims to returned DynamoDB records, ensuring that identity resolution is tightly bound to verified token subjects and least-privilege access patterns.

DynamoDB-Specific Remediation in FeathersJS — concrete code fixes

Remediation centers on strict identity mapping, explicit token validation, and re-authorization on every data access. Below are concrete, DynamoDB-integrated code examples for a FeathersJS service that mitigate JWT misconfiguration risks.

1. Enforce subject-to-key mapping and use DynamoDB GetItem with the authenticated subject only. Do not trust request parameters for object ownership.

const { NotFound } = require('@feathersjs/errors');
const { DynamoDBDocumentClient, GetCommand } = require('@aws-sdk/lib-dynamodb');
const { dynamodbClient } = require('./dynamodb-client'); // configured low-level client

const ddb = DynamoDBDocumentClient.from(dynamodbClient);

class UserProfileService {
  async get(id, params) {
    // Ignore the caller-supplied id: the record is resolved from the
    // authenticated subject established by the JWT strategy.
    const subject = params.user.sub;
    const query = {
      TableName: process.env.USERS_TABLE,
      Key: {
        // DocumentClient takes native types, not { S: ... } attribute wrappers
        PK: `USER#${subject}`,
        SK: 'METADATA'
      }
    };
    const { Item } = await ddb.send(new GetCommand(query));
    if (!Item) {
      throw new NotFound('Profile not found');
    }
    return Item;
  }
}

module.exports = function () {
  const app = this;
  app.use('/profile', new UserProfileService());
};

2. Explicitly set JWT validation options, including the signing algorithm, issuer, and audience. This prevents acceptance of 'none'-algorithm tokens and of tokens issued for other services or environments.

const { AuthenticationService, JWTStrategy } = require('@feathersjs/authentication');

// Feathers v4+ reads its configuration from app.get('authentication')
app.set('authentication', {
  secret: process.env.JWT_SECRET,
  entity: 'user',
  authStrategies: ['jwt'],
  jwtOptions: {
    algorithm: 'HS256', // pin the algorithm; never accept 'none'
    issuer: 'https://api.example.com',
    audience: 'example-api',
    expiresIn: '1d'
  }
});

const authentication = new AuthenticationService(app);
authentication.register('jwt', new JWTStrategy());
app.use('/authentication', authentication);

3. Re-authorize on each request by validating token claims against DynamoDB metadata. Do not rely on cached permissions in the token for sensitive operations.

async function ensureRoleMatches(tokenSub, requiredRole) {
  const query = {
    TableName: process.env.USERS_TABLE,
    Key: {
      PK: `USER#${tokenSub}`,
      SK: 'METADATA'
    }
  };
  const { Item } = await ddb.send(new GetCommand(query));
  // The authoritative role comes from DynamoDB, never from the token payload
  return Boolean(Item && Item.role === requiredRole);
}

const { Forbidden } = require('@feathersjs/errors');

app.service('admin-action').hooks({
  before: {
    all: [async context => {
      const sub = context.params.user.sub; // subject resolved by the JWT strategy
      const hasPermission = await ensureRoleMatches(sub, 'admin');
      if (!hasPermission) {
        throw new Forbidden('Insufficient role for this operation');
      }
      return context;
    }]
  }
});

4. Canonicalize and validate all identity inputs before constructing DynamoDB keys. Avoid string concatenation with raw user input; use consistent casing and encoding.

const sanitizeSubject = (sub) => {
  if (!sub || typeof sub !== 'string') throw new Error('Invalid subject');
  const canonical = sub.trim().toLowerCase();
  // Restrict to an expected identifier alphabet before key construction
  if (!/^[a-z0-9_-]{1,64}$/.test(canonical)) throw new Error('Invalid subject');
  return canonical;
};

const subject = sanitizeSubject(payload.sub);
const safeKey = `USER#${subject}`;
// Use safeKey for DynamoDB operations only after the JWT signature has been verified

By combining strict JWT validation, subject-bound DynamoDB access patterns, and per-request authorization checks, the risk of JWT misconfiguration in a FeathersJS + DynamoDB stack is substantially reduced. middleBrick’s scans can be used iteratively to confirm that endpoints properly enforce authentication and that DynamoDB key construction does not expose identity bypass or IDOR.

Related CWEs (authentication category):

CWE ID  | Name                                                        | Severity
CWE-287 | Improper Authentication                                     | CRITICAL
CWE-306 | Missing Authentication for Critical Function                | CRITICAL
CWE-307 | Improper Restriction of Excessive Authentication Attempts   | HIGH
CWE-308 | Use of Single-factor Authentication                         | MEDIUM
CWE-309 | Use of Password System for Primary Authentication           | MEDIUM
CWE-347 | Improper Verification of Cryptographic Signature            | HIGH
CWE-384 | Session Fixation                                            | HIGH
CWE-521 | Weak Password Requirements                                  | MEDIUM
CWE-613 | Insufficient Session Expiration                             | MEDIUM
CWE-640 | Weak Password Recovery Mechanism for Forgotten Password     | HIGH

Frequently Asked Questions

Can middleBrick detect JWT misconfiguration when DynamoDB is used as the user store?
Yes. middleBrick runs unauthenticated checks that inspect token validation settings and identity resolution patterns. It flags missing issuer/audience enforcement and BOLA/IDOR risks that can arise when JWT subjects are directly mapped to DynamoDB keys without proper authorization checks.
Does fixing JWT validation alone fully secure the DynamoDB integration?
No. JWT validation must be paired with least-privilege IAM policies for DynamoDB, strict subject-to-key mapping, and per-request re-authorization. middleBrick’s scans highlight missing authorization checks and over-privileged token claims that could still lead to IDOR even when algorithm and issuer are correctly set.
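The least-privilege point can be made concrete with DynamoDB fine-grained access control: an IAM policy condition on dynamodb:LeadingKeys restricts each caller to items whose partition key matches their own identity. A sketch assuming Amazon Cognito identity federation and a placeholder table name UsersTable; note that the partition key scheme the application writes (for example the USER# prefix used above) must align with the value the condition compares against:

```json
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": ["dynamodb:GetItem", "dynamodb:Query"],
    "Resource": "arn:aws:dynamodb:*:*:table/UsersTable",
    "Condition": {
      "ForAllValues:StringEquals": {
        "dynamodb:LeadingKeys": ["${cognito-identity.amazonaws.com:sub}"]
      }
    }
  }]
}
```

With such a policy in place, even a service bug that interpolates an attacker-supplied identifier into a key is stopped at the IAM layer, because DynamoDB refuses reads outside the caller's own partition.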