Heap Overflow in Adonisjs
How Heap Overflow Manifests in Adonisjs
Heap overflow vulnerabilities in Adonisjs applications typically arise from improper handling of user-controlled data in memory-intensive operations. In a managed runtime like Node.js, these bugs surface as heap exhaustion (out-of-memory crashes and denial of service) rather than classic buffer overflows, and in Node.js frameworks like Adonisjs they manifest through several specific patterns that developers must understand to defend against them.
The most common heap overflow scenario in Adonisjs involves file upload processing. When handling multipart form data, Adonisjs's bodyparser middleware parses the incoming files. If an attacker sends an exceptionally large file, or crafts a multipart request with numerous parts, the memory allocated during processing can exceed available heap space, crashing the Node.js process.
// Vulnerable Adonisjs route definition
const fs = require('fs');

Route.post('upload', async ({ request }) => {
  // No size limit passed to request.file()
  const file = request.file('avatar');
  // Read the entire upload into memory without any size checks
  const contents = await fs.promises.readFile(file.tmpPath, 'utf8');
  return { success: true, size: contents.length };
});

Another manifestation occurs in JSON body parsing. Adonisjs uses the bodyparser middleware by default, which can be exploited through deeply nested JSON objects or arrays with millions of elements. An attacker can craft a request whose JSON structure demands excessive memory allocation during parsing:
// Malicious request that could trigger heap overflow
{
  "data": [ /* 1 million nested arrays */ ]
}

Adonisjs applications are also vulnerable when processing database query results without pagination. An unbounded query returning millions of rows can exhaust heap memory while the result set is buffered. This is particularly dangerous in API endpoints that don't implement proper limits:
// Vulnerable query without pagination
Route.get('users', async ({ response }) => {
  const users = await Database.query().from('users');
  return response.json(users);
});

Third-party package vulnerabilities can also introduce heap overflow risks. Adonisjs's plugin ecosystem means applications may inherit vulnerabilities from dependencies. For example, a package that processes XML or CSV data without size limits could be exploited to trigger heap overflow through specially crafted input.
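One defensive pattern is to cap input size before handing data to any third-party parser. The sketch below illustrates the idea; `parseCsvSafely`, the 5MB cap, and the trivial line-splitting "parser" are illustrative, not part of any real package:

```javascript
// Sketch: reject oversized input before it reaches a third-party parser.
// MAX_INPUT_BYTES and parseCsvSafely are illustrative names, not a real API.
const MAX_INPUT_BYTES = 5 * 1024 * 1024; // 5MB cap

function parseCsvSafely(raw, parse) {
  const size = Buffer.byteLength(raw, 'utf8');
  if (size > MAX_INPUT_BYTES) {
    throw new Error(`Input too large: ${size} bytes`);
  }
  return parse(raw);
}

// Usage with a trivial line-splitting "parser" standing in for a library
const rows = parseCsvSafely('a,b\n1,2', (raw) =>
  raw.split('\n').map((line) => line.split(','))
);
console.log(rows); // [ [ 'a', 'b' ], [ '1', '2' ] ]
```

The same wrapper works for XML or any other text format: the size check runs before the expensive parse, so a hostile payload is rejected while memory usage is still bounded.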
Adonisjs-Specific Detection
Detecting heap overflow vulnerabilities in Adonisjs requires a combination of static analysis, runtime monitoring, and specialized scanning tools. The framework's architecture provides specific entry points where heap overflow risks concentrate, making targeted detection strategies effective.
Static code analysis should focus on route handlers that process file uploads, JSON bodies, and database queries. Look for patterns where input size isn't validated before processing. Tools like ESLint with custom rules can flag suspicious patterns:
// Custom ESLint rule for Adonisjs heap overflow detection
module.exports = {
  meta: { type: 'problem' },
  create(context) {
    return {
      CallExpression(node) {
        const name = node.callee.property && node.callee.property.name;
        if (['file', 'json', 'text'].includes(name)) {
          // Heuristic: a call with no arguments passes no options object,
          // and therefore no size limit
          if (node.arguments.length === 0) {
            context.report({
              node,
              message: 'Potential heap overflow: no size validation',
            });
          }
        }
      },
    };
  },
};

Runtime monitoring is crucial for detecting heap overflow attempts in production. Node.js exposes memory usage metrics through process.memoryUsage(), which can be sampled around each request:
// Memory monitoring middleware for Adonisjs
const monitorMemory = async (ctx, next) => {
  const initialMemory = process.memoryUsage().heapUsed;
  await next();
  const finalMemory = process.memoryUsage().heapUsed;
  const memoryDelta = finalMemory - initialMemory;
  // Alert if heap usage grew by more than 100MB during the request
  if (memoryDelta > 100 * 1024 * 1024) {
    console.warn(`Memory spike detected: ${memoryDelta} bytes`);
    // Log request details (URL, method, body size) for analysis
  }
};
// Register as global middleware in start/kernel.js so it wraps every request

Automated security scanning with tools like middleBrick provides comprehensive heap overflow detection by testing the runtime behavior of your Adonisjs API endpoints. The scanner sends crafted payloads to identify memory exhaustion vulnerabilities:
# Using middleBrick CLI to scan an Adonisjs API
npm install -g middlebrick
# Scan your Adonisjs API endpoint
middlebrick scan https://api.yourservice.com/upload
# Output shows heap overflow risk assessment
{
  "risk_score": 65,
  "heap_overflow_risk": "HIGH",
  "findings": [
    {
      "severity": "critical",
      "category": "Input Validation",
      "description": "File upload endpoint lacks size limits"
    }
  ]
}

middleBrick's black-box approach is particularly effective because it exercises runtime behavior without requiring source code access. The scanner sends progressively larger payloads to surface heap overflow vulnerabilities and reports actionable findings with specific remediation guidance.
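middleBrick's internals aren't public, but the progressive-payload idea itself is simple to sketch. The target URL below is a placeholder, and the global `fetch` and `AbortSignal.timeout` assume Node 18+:

```javascript
// Sketch of progressive payload probing (not middleBrick's actual code).
// TARGET_URL is a placeholder; requires Node 18+ for global fetch.
const TARGET_URL = 'https://api.example.com/upload';

// Doubling payload sizes from 1KB up to a ceiling
function payloadSizes(maxBytes) {
  const sizes = [];
  for (let size = 1024; size <= maxBytes; size *= 2) {
    sizes.push(size);
  }
  return sizes;
}

async function probe(sizeBytes) {
  const body = JSON.stringify({ data: 'x'.repeat(sizeBytes) });
  try {
    const res = await fetch(TARGET_URL, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body,
      signal: AbortSignal.timeout(10_000), // slow responses are suspicious
    });
    return { sizeBytes, status: res.status };
  } catch (err) {
    // Timeouts and connection resets often indicate memory exhaustion
    return { sizeBytes, error: err.name };
  }
}

async function scan() {
  for (const size of payloadSizes(16 * 1024 * 1024)) {
    const result = await probe(size);
    console.log(result);
    if (result.error || result.status >= 500) break;
  }
}
```

The probe stops at the first timeout or 5xx response: the payload size at that point gives a rough lower bound on how much memory a single request can force the server to allocate.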
Adonisjs-Specific Remediation
Remediating heap overflow vulnerabilities in Adonisjs requires a multi-layered approach that combines input validation, memory management, and defensive coding practices. The framework's middleware system and configuration options provide several native mechanisms for protection.
Start by configuring bodyparser limits in config/bodyparser.js, where Adonisjs lets you set maximum sizes for each content type (option names vary slightly between Adonisjs versions, so check the config file your version ships with):
// config/bodyparser.js
module.exports = {
  json: {
    limit: '1mb',  // Limit JSON body size
    strict: true,  // Accept only top-level JSON objects and arrays
  },
  form: {
    limit: '1mb',  // Limit urlencoded form bodies
  },
  multipart: {
    autoProcess: true,
    limit: '10mb', // Limit the total multipart payload
  },
};

For file uploads, additionally enforce an explicit per-file size limit before processing:
// Secure file upload handler
Route.post('upload', async ({ request, response }) => {
  // Enforce the size limit at parse time
  const file = request.file('avatar', { size: '10mb' });
  if (!file) {
    return response.badRequest({ error: 'No file uploaded' });
  }
  if (!file.isValid) {
    return response.badRequest({ error: 'File too large, max 10MB' });
  }
  // Move the file to disk instead of buffering its contents in memory
  await file.move('./tmp/uploads');
  return response.json({ success: true, size: file.size });
});

Implement pagination for database queries to prevent heap overflow from large result sets:
// Safe database query with pagination
Route.get('users', async ({ request, response }) => {
  // Query-string values arrive as strings, so coerce before comparing
  const page = Number(request.input('page', 1));
  const limit = Number(request.input('limit', 50));
  if (!Number.isInteger(page) || !Number.isInteger(limit) ||
      page < 1 || limit < 1 || limit > 100) {
    return response.badRequest({ error: 'Invalid pagination parameters' });
  }
  const users = await Database.query().from('users').paginate(page, limit);
  return response.json(users);
});

Use streaming when processing large files instead of loading everything into memory at once:
// Stream-based file processing
const { createReadStream } = require('fs');

Route.post('import', async ({ request, response }) => {
  const file = request.file('data');
  if (!file) {
    return response.badRequest({ error: 'No file uploaded' });
  }
  const MAX_BYTES = 10 * 1024 * 1024; // 10MB processing cap
  let totalBytes = 0;
  const chunks = [];
  // Stream from the temp file so the payload is read in small chunks
  const stream = createReadStream(file.tmpPath);
  return new Promise((resolve) => {
    stream
      .on('data', (chunk) => {
        // Track bytes, not chunk count, since chunk sizes vary
        totalBytes += chunk.length;
        if (totalBytes > MAX_BYTES) {
          // Abort before the in-memory buffer grows unbounded
          stream.destroy();
          return resolve(response.badRequest({
            error: 'File too large to process'
          }));
        }
        chunks.push(chunk);
      })
      .on('end', () => {
        const data = Buffer.concat(chunks).toString('utf8');
        // Process data safely
        resolve(response.json({ success: true }));
      })
      .on('error', () => {
        resolve(response.internalServerError({
          error: 'File processing failed'
        }));
      });
  });
});

Finally, guard memory-intensive operations with a simple memory check, a lightweight variant of the circuit-breaker pattern:
// Memory-aware circuit breaker
const withMemoryGuard = async (fn, maxMemoryIncrease = 50 * 1024 * 1024) => {
  const initialMemory = process.memoryUsage().heapUsed;
  try {
    const result = await fn();
    const finalMemory = process.memoryUsage().heapUsed;
    if (finalMemory - initialMemory > maxMemoryIncrease) {
      throw new Error('Memory limit exceeded');
    }
    return result;
  } catch (error) {
    // Log and re-throw so the caller can fail the request
    console.error('Memory guard triggered:', error.message);
    throw error;
  }
};

// Use with memory-intensive operations
Route.post('process', async ({ request }) => {
  return await withMemoryGuard(async () => {
    const result = await processLargeDataset(); // your own expensive operation
    return { success: true, result };
  });
});