Dangling DNS in Express with DynamoDB
Dangling DNS in Express with DynamoDB — how this specific combination creates or exposes the vulnerability
A dangling DNS configuration in an Express application that uses Amazon DynamoDB can expose both application logic and infrastructure risks. In this setup, client-supplied values—such as table names, partition key values, or conditional expression attribute names—are used to construct DynamoDB API calls. If these values are not strictly validated and sanitized, an attacker can supply a hostname or CNAME that resolves to an unintended internal or external endpoint, effectively creating a dangling DNS reference that bypasses intended routing and access controls.
Consider an Express route that accepts a tableSuffix to build DynamoDB table names for a multi-tenant design:
// Risky: table name derived from user input
const AWS = require('aws-sdk');

app.get('/records/:tableSuffix/:id', async (req, res) => {
  const { tableSuffix, id } = req.params;
  const docClient = new AWS.DynamoDB.DocumentClient();
  const params = {
    TableName: `app-data-${tableSuffix}`,
    Key: { id }
  };
  const data = await docClient.get(params).promise();
  res.json(data.Item);
});
If an attacker provides a tableSuffix that resolves via DNS to a service they control (for example, by registering a domain that resolves to a CNAME pointing into an internal AWS endpoint or an external host), the application may follow an unintended resolution path. This can lead to data being sent to or fetched from an unauthorized endpoint, effectively creating a dangling DNS vector that bypasses intended table boundaries. The same risk applies when using input to construct conditional expression attribute names in UpdateItem or BatchGetItem requests.
DynamoDB itself does not perform DNS resolution, so the vulnerability lies in how the Express layer interprets and uses user input to form resource identifiers and endpoint references. Without strict allowlists and canonicalization, the application can be tricked into directing requests to unexpected targets. This becomes especially problematic when combined with overly permissive IAM roles attached to the DynamoDB client, as unintended writes or reads may follow the dangling reference.
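Because the risk hinges on the client following an endpoint it was never meant to reach, one defensive measure is to pin the DynamoDB client configuration once at startup and never derive any part of it from request data. The sketch below assumes the AWS SDK v2 used elsewhere in this article; the region and endpoint values are placeholders for your own deployment.

```javascript
const AWS = require('aws-sdk');

// Pin region and endpoint at startup. Nothing request-derived can alter
// where this client sends traffic, so a hostile value in req.params can
// at worst name a table, never redirect the connection itself.
const docClient = new AWS.DynamoDB.DocumentClient({
  region: 'us-east-1',
  endpoint: 'https://dynamodb.us-east-1.amazonaws.com'
});
```

Constructing the client once, outside any route handler, also makes it easy to audit: there is exactly one place in the codebase where the endpoint is defined.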
To detect this pattern, middleBrick scans for user-controlled inputs that influence DynamoDB table names, key schema fields, and expression attribute names, then evaluates whether the constructed resource identifiers could resolve outside the intended scope. The LLM/AI Security checks specifically probe for prompt injection and data exfiltration paths that could be chained through misconfigured endpoints, while the BFLA/Privilege Escalation and Property Authorization checks validate whether access controls properly restrict tenant boundaries.
DynamoDB-Specific Remediation in Express — concrete code fixes
Remediation centers on strict input validation, canonicalization, and avoiding dynamic table or key construction from untrusted data. Use allowlists for table suffixes, enforce strict type checks, and parameterize queries instead of concatenating strings. Below are concrete, safe patterns for Express with DynamoDB.
1. Use an allowlist for tenant identifiers
Map known, valid suffixes to canonical table names. Do not trust client input to construct table names directly.
const ALLOWED_TENANTS = {
  tenantA: 'app-data-tenantA',
  tenantB: 'app-data-tenantB'
};

app.get('/records/:tenant/:id', async (req, res) => {
  const { tenant, id } = req.params;
  const tableName = ALLOWED_TENANTS[tenant];
  if (!tableName) {
    return res.status(400).json({ error: 'invalid tenant' });
  }
  const docClient = new AWS.DynamoDB.DocumentClient();
  const params = {
    TableName: tableName,
    Key: { id }
  };
  const data = await docClient.get(params).promise();
  res.json(data.Item || {});
});
2. Validate and canonicalize key values
Ensure partition and sort keys conform to expected patterns (e.g., UUIDs, numeric IDs) before using them in DynamoDB requests.
app.get('/records/:table/:id', async (req, res) => {
  const { table, id } = req.params;
  // strict pattern for an ID: 36-character UUID
  if (!/^[0-9a-f]{8}-[0-9a-f]{4}-[1-5][0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i.test(id)) {
    return res.status(400).json({ error: 'invalid id format' });
  }
  const allowedTables = new Set(['app-data-prod', 'app-data-staging']);
  if (!allowedTables.has(table)) {
    return res.status(400).json({ error: 'invalid table' });
  }
  const docClient = new AWS.DynamoDB.DocumentClient();
  const params = {
    TableName: table,
    Key: { id }
  };
  const data = await docClient.get(params).promise();
  res.json(data.Item || {});
});
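When several routes share the same checks, it helps to centralize them so no handler can forget one. The sketch below factors the UUID and table-allowlist checks into a pure function plus an Express-style guard; validateRecordParams and recordParamsGuard are illustrative names, not part of the original routes.

```javascript
// Shared validation rules, mirroring the checks used in the route above.
const UUID_RE = /^[0-9a-f]{8}-[0-9a-f]{4}-[1-5][0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i;
const ALLOWED_TABLES = new Set(['app-data-prod', 'app-data-staging']);

// Pure function: returns a list of validation errors (empty means valid).
function validateRecordParams(params) {
  const errors = [];
  if (!ALLOWED_TABLES.has(params.table)) errors.push('invalid table');
  if (!UUID_RE.test(params.id || '')) errors.push('invalid id format');
  return errors;
}

// Express middleware wrapper: rejects the request before any
// DynamoDB parameters are ever constructed from user input.
function recordParamsGuard(req, res, next) {
  const errors = validateRecordParams(req.params);
  if (errors.length > 0) {
    return res.status(400).json({ errors });
  }
  next();
}
```

A route then mounts the guard ahead of its handler, e.g. `app.get('/records/:table/:id', recordParamsGuard, handler)`, keeping the validation logic in one auditable place.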
3. Avoid dynamic expression attribute names from user input
Do not allow clients to supply attribute names used in UpdateItem ExpressionAttributeNames. Use a controlled mapping instead.
app.patch('/records/:table/:id', async (req, res) => {
  const { table, id } = req.params;
  const updates = (req.body && req.body.updates) || {}; // e.g. { status: 'active' }
  const allowedTables = new Set(['app-data-prod', 'app-data-staging']);
  if (!allowedTables.has(table)) {
    return res.status(400).json({ error: 'invalid table' });
  }
  // Only these logical fields may be updated; anything else is dropped
  const ALLOWED_FIELDS = new Set(['status', 'updatedAt']);
  const updateExpressions = [];
  const expressionAttrNames = {};
  const expressionAttrValues = {};
  for (const [key, value] of Object.entries(updates)) {
    if (!ALLOWED_FIELDS.has(key)) {
      continue; // skip unsupported fields
    }
    const nameToken = `#${key}`;
    updateExpressions.push(`${nameToken} = :${key}`);
    expressionAttrNames[nameToken] = key;
    expressionAttrValues[`:${key}`] = value;
  }
  if (updateExpressions.length === 0) {
    return res.status(400).json({ error: 'no valid updates' });
  }
  const docClient = new AWS.DynamoDB.DocumentClient();
  const params = {
    TableName: table,
    Key: { id },
    UpdateExpression: 'SET ' + updateExpressions.join(', '),
    ExpressionAttributeNames: expressionAttrNames,
    ExpressionAttributeValues: expressionAttrValues,
    ReturnValues: 'UPDATED_NEW'
  };
  const data = await docClient.update(params).promise();
  res.json(data.Attributes);
});
4. Enforce least privilege for the DynamoDB client
Ensure the IAM role associated with the application grants only the permissions required for the intended operations on the specific tables. Avoid wildcard actions and resource ARNs that allow cross-tenant access.
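As a sketch of what least privilege looks like in practice, the policy fragment below grants only the read and update actions the routes above actually perform, scoped to the two named tables. The account ID and region are placeholders; substitute your own values.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["dynamodb:GetItem", "dynamodb:UpdateItem"],
      "Resource": [
        "arn:aws:dynamodb:us-east-1:123456789012:table/app-data-prod",
        "arn:aws:dynamodb:us-east-1:123456789012:table/app-data-staging"
      ]
    }
  ]
}
```

With resources enumerated explicitly, even a request that slips past application-level validation cannot reach a table outside the tenant boundary.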
5. Use middleBrick to validate your configuration
Run middleBrick scans against your Express endpoints to surface dangling DNS risks, improper authorization, and input validation issues. The CLI provides quick feedback:
middlebrick scan https://api.example.com/records/tenantA/123e4567-e89b-12d3-a456-426614174000
With the Pro plan, you can enable continuous monitoring and CI/CD integration to catch regressions before deployment. The MCP Server allows you to run scans directly from your IDE while developing these patterns.