PII Leakage in Firestore
How PII Leakage Manifests in Firestore
PII leakage in Firestore occurs through several Firestore-specific patterns that developers often overlook. The most common scenario involves improper security rules combined with overly permissive data structures.
Consider this vulnerable Firestore structure:
```
// Vulnerable data model
users (collection)
  {userId} (document)
    name: "John Doe"
    email: "[email protected]"
    ssn: "123-45-6789"
    paymentInfo (subcollection)
      cardNumber: "4111 1111 1111 1111"
      cvv: "123"
```
The security rules might look reasonable but contain critical gaps:
```
// Vulnerable Firestore security rules
match /databases/{database}/documents {
  match /users/{userId} {
    allow read: if request.auth.uid == userId;
    allow write: if request.auth.uid == userId;
  }
  match /users/{userId}/paymentInfo {
    allow read: if request.auth.uid == userId;
    allow write: if request.auth.uid == userId;
  }
}
```
The problem? These rules look owner-scoped but contain critical gaps; for one, the `paymentInfo` match lacks a document wildcard (`/{paymentId}`), so it never actually matches the subcollection's documents. An attacker can:
- Iterate through user IDs to discover which accounts exist
- Read document metadata (size, timestamps) without accessing PII
- Trigger Cloud Functions that log PII to accessible locations
- Exploit Firestore's `listDocuments()` API (Admin SDK) to enumerate collections
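The enumeration vectors above can be sketched as a probe against Firestore's REST endpoint. The project ID, token, and candidate IDs below are placeholders, and `classifyProbe`/`probeUserIds` are our illustrative names; the point is the status classification — well-written rules should make every probe come back "denied":

```javascript
// Classify an HTTP status from a single-document read probe. A uniform
// "denied" response is what well-written rules should produce.
function classifyProbe(status) {
  if (status === 200) return 'readable';          // document data exposed
  if (status === 404) return 'absent-confirmed';  // existence oracle
  if (status === 403) return 'denied';            // rules blocked the read
  return 'other';
}

// Hypothetical probe loop against the Firestore REST API
// (requires Node 18+ for the global fetch).
async function probeUserIds(projectId, idToken, candidateIds) {
  const results = [];
  for (const id of candidateIds) {
    const url = `https://firestore.googleapis.com/v1/projects/${projectId}` +
                `/databases/(default)/documents/users/${id}`;
    const res = await fetch(url, {
      headers: { Authorization: `Bearer ${idToken}` }
    });
    results.push({ id, verdict: classifyProbe(res.status) });
  }
  return results;
}
```

If some IDs come back `readable` or `absent-confirmed` while authenticated as an unrelated user, the rules are leaking existence information.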
Another Firestore-specific vector: Cloud Functions triggered by Firestore writes. If a function logs sensitive data without proper sanitization:
```javascript
// Vulnerable Cloud Function
exports.logUserData = functions.firestore
  .document('users/{userId}')
  .onCreate((snap, context) => {
    const data = snap.data();
    console.log(`New user: ${JSON.stringify(data)}`); // Logs PII to accessible logs
    return null;
  });
```
Firestore's client-side SDKs introduce further risk when developers attach `onSnapshot()` listeners backed by permissive security rules: every readable field is streamed to the client and cached by offline persistence, potentially exposing data during offline sync operations.
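A minimal sketch of that listener pattern — the `firestore` argument is a v8-style namespaced SDK instance supplied by the caller, and `watchUsers` is our illustrative name:

```javascript
// Every field the security rules let through is delivered to this callback
// and cached by the SDK's offline persistence. Any filtering done here
// happens only AFTER the data has already reached the client.
function watchUsers(firestore, onUser) {
  return firestore.collection('users').onSnapshot((snapshot) => {
    snapshot.forEach((docSnap) => {
      // docSnap.data() contains all readable fields, including any PII
      onUser(docSnap.id, docSnap.data());
    });
  });
}
```

This is why client-side filtering is never a substitute for restrictive rules: by the time the callback runs, the PII is already on the device.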
Firestore-Specific Detection
Detecting PII leakage in Firestore requires examining both security rules and data access patterns. Here's how to identify vulnerabilities:
Security Rule Analysis
Firestore rules are written in the security rules language (not JSON) and normally live in `firestore.rules` in your project source; you can also view them in the Firebase console under Firestore > Rules. Analyze the file for permissive patterns with a simple text scan:

```python
import re

def analyze_firestore_rules(rules_file):
    """Flag allow statements whose condition never checks request.auth."""
    with open(rules_file) as f:
        source = f.read()
    issues = []
    # Matches statements like: allow read, write: if <condition>;
    for stmt in re.finditer(r'allow\s+([\w,\s]+?):\s*if\s+([^;]+);', source):
        operations = stmt.group(1).strip()
        condition = stmt.group(2).strip()
        if 'request.auth' not in condition and condition != 'false':
            issues.append(f'Permissive rule: allow {operations}: if {condition}')
    return issues
```
Manual Enumeration Testing
Test for enumeration vulnerabilities:
```javascript
// Test for user enumeration: run this while authenticated as a DIFFERENT user
async function testUserEnumeration(firestore, otherUserId) {
  try {
    const doc = await firestore.collection('users').doc(otherUserId).get();
    // Any successful read -- whether or not the document exists -- means
    // the rules let you probe other users' documents
    if (doc.exists) {
      console.log("Vulnerable: another user's document is readable");
    } else {
      console.log('Vulnerable: read allowed, so user existence can be enumerated');
    }
  } catch (error) {
    console.log('Safe: access denied by security rules');
  }
}
```
middleBrick Scanning
middleBrick specifically detects Firestore PII leakage through:
- Authentication bypass attempts on Firestore endpoints
- Property authorization violations in nested documents
- Input validation failures that allow injection into queries
- Data exposure through overly broad security rules
Scan your Firestore-backed API:
```shell
npm install -g middlebrick
middlebrick scan https://your-firestore-api.example.com
```
The scanner tests 12 security categories including Authentication and Data Exposure, specifically looking for Firestore-specific patterns such as `listDocuments()` abuse and Cloud Function logging vulnerabilities.
Firestore-Specific Remediation
Fixing PII leakage in Firestore requires defense-in-depth using multiple Firebase features:
Enhanced Security Rules
```
// Secure Firestore security rules
match /databases/{database}/documents {
  match /users/{userId} {
    // Only the authenticated owner can read their own document
    allow get, write: if request.auth != null && request.auth.uid == userId;
    // Limit list (query) access the same way to prevent enumeration
    allow list: if request.auth != null && request.auth.uid == userId;
  }
  // Note the {paymentId} wildcard: rules match document paths, so a match
  // without it would not cover the subcollection's documents
  match /users/{userId}/paymentInfo/{paymentId} {
    allow read, write: if request.auth != null && request.auth.uid == userId;
  }
  // Explicit default deny for any path not matched above
  match /{document=**} {
    allow read, write: if false;
  }
}
```
Data Minimization
Store only the PII you actually need, and encrypt sensitive fields at the application level before writing them. Firestore encrypts data at rest automatically, but that does not protect against overly broad read access:
```javascript
import * as crypto from 'crypto';

// Encrypt PII fields before storing (application-level field encryption).
// `firestore` is an initialized Admin SDK instance.
async function encryptUserData(userId, userData) {
  const docRef = firestore.collection('users').doc(userId);
  const encrypted = {
    name: encryptField(userData.name),
    email: encryptField(userData.email),
    ssn: encryptField(userData.ssn),
    paymentInfo: encryptPaymentInfo(userData.paymentInfo)
  };
  await docRef.set(encrypted);
}

function encryptField(plainText) {
  const algorithm = 'aes-256-cbc';
  // ENCRYPTION_KEY must be 64 hex characters (32 bytes for AES-256)
  const key = Buffer.from(process.env.ENCRYPTION_KEY, 'hex');
  const iv = crypto.randomBytes(16);
  const cipher = crypto.createCipheriv(algorithm, key, iv);
  const encrypted = Buffer.concat([cipher.update(plainText, 'utf8'), cipher.final()]);
  return iv.toString('hex') + ':' + encrypted.toString('hex');
}
```
Cloud Function Security
Sanitize logs and implement proper access controls:
```javascript
// Secure Cloud Function
exports.logUserData = functions.firestore
  .document('users/{userId}')
  .onCreate(async (snap, context) => {
    const userId = context.params.userId;
    const data = snap.data();
    // Sanitize PII before logging
    const sanitizedLog = {
      userId: userId,
      name: maskString(data.name),
      email: maskString(data.email)
    };
    console.log(`New user: ${JSON.stringify(sanitizedLog)}`);
    // Store an audit trail without raw PII
    await firestore.collection('auditLogs').add({
      userId: userId,
      action: 'user_created',
      timestamp: admin.firestore.FieldValue.serverTimestamp(),
      sensitiveData: null // Never store raw PII in audit logs
    });
    return null;
  });

function maskString(value) {
  // Keep first and last characters, mask the middle: "John Doe" -> "J******e"
  return value
    ? value[0] + '*'.repeat(Math.max(0, value.length - 2)) + value.slice(-1)
    : value;
}
```
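A few quick checks of the masking helper's behavior, restating `maskString` so the snippet runs standalone:

```javascript
function maskString(value) {
  return value
    ? value[0] + '*'.repeat(Math.max(0, value.length - 2)) + value.slice(-1)
    : value;
}

// "John Doe" masks to "J******e"; two-character values pass through
// unchanged, and a single character comes back doubled ("a" -> "aa") --
// an edge case worth handling before production use.
```

Because very short values survive masking essentially intact, consider masking short strings entirely (e.g. returning `'***'`) in production.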
Client-Side Safeguards
Implement proper data handling in your frontend:
```javascript
// Secure Firestore client usage (modular v9+ SDK)
import { getDoc, doc } from 'firebase/firestore';

async function getUserData(userId) {
  const userDoc = doc(firestore, 'users', userId);
  try {
    const userSnap = await getDoc(userDoc);
    if (!userSnap.exists()) {
      return null;
    }
    const data = userSnap.data();
    // Filter sensitive fields before passing data around the app.
    // This is defense-in-depth only: security rules, not client code,
    // are what actually keep paymentInfo and ssn off the wire.
    return {
      id: userId,
      name: data.name,
      email: data.email
    };
  } catch (error) {
    console.error('Error fetching user data:', error);
    return null;
  }
}
```
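The field filtering above can be generalized into a small allowlist helper (the names `SAFE_USER_FIELDS` and `pickSafeFields` are illustrative, not from any SDK):

```javascript
// Allowlist-based projection: anything not explicitly listed is dropped,
// so newly added sensitive fields are excluded by default.
const SAFE_USER_FIELDS = ['id', 'name', 'email'];

function pickSafeFields(data, allowed = SAFE_USER_FIELDS) {
  return Object.fromEntries(
    Object.entries(data).filter(([key]) => allowed.includes(key))
  );
}
```

An allowlist fails closed: a new `dateOfBirth` field added to the document later is stripped automatically, whereas a denylist would leak it until someone remembers to update the list.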
Related CWEs (Data Exposure category)
| CWE ID | Name | Severity |
|---|---|---|
| CWE-200 | Exposure of Sensitive Information | HIGH |
| CWE-209 | Error Information Disclosure | MEDIUM |
| CWE-213 | Exposure of Sensitive Information Due to Incompatible Policies | HIGH |
| CWE-215 | Insertion of Sensitive Information Into Debugging Code | MEDIUM |
| CWE-312 | Cleartext Storage of Sensitive Information | HIGH |
| CWE-359 | Exposure of Private Personal Information (PII) | HIGH |
| CWE-522 | Insufficiently Protected Credentials | CRITICAL |
| CWE-532 | Insertion of Sensitive Information into Log File | MEDIUM |
| CWE-538 | Insertion of Sensitive Information into Externally-Accessible File | HIGH |
| CWE-540 | Inclusion of Sensitive Information in Source Code | HIGH |