Prototype Pollution in Actix with DynamoDB
Prototype Pollution in Actix with DynamoDB — how this specific combination creates or exposes the vulnerability
Prototype pollution in the strict JavaScript sense relies on a prototype chain that Rust does not have, so in an Actix web service that interacts with Amazon DynamoDB the term describes an analogous flaw: user-controlled input is merged into objects used to construct DynamoDB attribute values or condition expressions. In Rust, this often manifests through serde deserialization of JSON payloads into generic serde_json::Value shapes rather than strongly typed structs. If the application merges user-supplied keys into a base object (e.g., using libraries that perform recursive merge or object extension), an attacker can inject keys like __proto__ or constructor — inert in Rust itself, but dangerous to any JavaScript consumer of the stored item — as well as ordinary privileged keys (e.g., isAdmin) that the application or custom middleware trusts when interpreting data before serialization.
When the polluted object is passed to DynamoDB operations such as PutItem or UpdateItem, or used to build a ConditionExpression, the injected keys can change the semantics of the request. For example, a merged key might override expected attribute values, bypass expected type checks, or influence how conditional writes are evaluated. DynamoDB itself does not execute JavaScript, so the exploitation path runs through the application layer: the polluted object leads to unexpected item states, privilege escalation via attribute-based access control (ABAC) logic, or injection into expression parameter values that the application interprets before sending the request to DynamoDB.
Actix-specific risk arises when handlers use mutable state or shared data structures that are updated from incoming JSON before being used to build DynamoDB requests. Consider a handler that starts with a base serde_json::json! object and merges web::Json input into it. If an attacker sends { "_id": "user123", "__proto__": { "isAdmin": true } }, the merge copies the attacker's keys verbatim into the resulting object. When that object is used in a DynamoDB UpdateItem expression like SET #data = :val, the application serializes the polluted object and feeds unexpected values into expression attribute values, potentially changing data types, persisting privileged flags, or bypassing application-level guards.
Additionally, if the Actix service uses a DynamoDB ConditionExpression to enforce invariants (e.g., preventing updates when a flag is set), injected attributes can alter condition outcomes. An injected attribute referenced in the condition may change the logical evaluation, leading to unauthorized updates. Because DynamoDB stores data as name–type–value triples, the pollution affects how the application writes, reads, and compares items — not the database engine itself.
Real-world patterns include using serde_json::Map for dynamic schemas, or ORM-like layers that build PutItemInput or UpdateItemInput from merged sources. The combination of a dynamic-language-style merge strategy in Rust (using libraries that allow arbitrary key overwriting) and the schemaless flexibility of DynamoDB items and expressions creates a clear path for injected keys to affect integrity and authorization checks.
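The risky merge pattern described above can be reduced to a minimal, dependency-free sketch. This is not the author's code: types are simplified to flat string maps standing in for a JSON object, and the function name is hypothetical.

```rust
use std::collections::BTreeMap;

// Naive merge: every user-supplied key overwrites or extends the base object.
// Nothing filters special or privileged keys, so "isAdmin" (or "__proto__",
// which a downstream JavaScript consumer might interpret) lands in the result.
fn naive_merge(
    base: &BTreeMap<String, String>,
    user: &BTreeMap<String, String>,
) -> BTreeMap<String, String> {
    let mut merged = base.clone();
    for (k, v) in user {
        merged.insert(k.clone(), v.clone()); // no allow-list check
    }
    merged
}
```

A request body of { "_id": "user123", "isAdmin": "true" } would flow straight through such a merge into the item map later handed to PutItem.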
DynamoDB-Specific Remediation in Actix — concrete code fixes
To remediate prototype-pollution-style key injection in an Actix service interacting with DynamoDB, validate and sanitize all incoming JSON before it influences DynamoDB request construction. Avoid recursive merges of user data into base objects; instead, use strongly typed structs with explicit field mapping, and reject unexpected keys during deserialization (e.g., with #[serde(deny_unknown_fields)]). When dynamic updates are required, prefer DynamoDB's native update expressions with explicit expression attribute names and condition checks, and ensure expression attribute values are derived from validated inputs only.
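The explicit-field-mapping advice can also be expressed as a small allow-list check applied to incoming keys before request construction. A std-only sketch (the function name is hypothetical, not an API from any library):

```rust
// Reject any key outside a fixed allow-list before it can influence a
// DynamoDB request. Returns a description of the offending key on failure.
fn check_allowed_keys<'a>(
    keys: impl IntoIterator<Item = &'a str>,
    allowed: &[&str],
) -> Result<(), String> {
    for key in keys {
        if !allowed.contains(&key) {
            return Err(format!("disallowed key: {}", key));
        }
    }
    Ok(())
}
```

Running this check on the raw JSON object's keys, before any deserialization or merging, rejects payloads carrying __proto__, constructor, or privileged attributes outright.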
Below are concrete Actix handler examples with DynamoDB code that demonstrate secure patterns.
Example 1: Strong typing with serde and explicit DynamoDB input
use actix_web::{post, web, HttpResponse};
use aws_sdk_dynamodb::{types::AttributeValue, Client};
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
#[derive(Deserialize, Serialize, Debug)]
#[serde(deny_unknown_fields)] // reject unexpected keys such as "__proto__"
struct CreateUser {
user_id: String,
email: String,
// Do not include untyped fields that could carry injected keys
}
#[post("/users")]
async fn create_user(
payload: web::Json<CreateUser>,
ddb: web::Data<Client>,
) -> HttpResponse {
let payload = payload.into_inner();
// Build the item with explicit field mapping; no generic merge
let mut item = HashMap::new();
item.insert("user_id".to_string(), AttributeValue::S(payload.user_id));
item.insert("email".to_string(), AttributeValue::S(payload.email));
ddb.put_item()
.table_name("Users")
.set_item(Some(item))
.send()
.await
.map(|_| HttpResponse::Ok().finish())
.unwrap_or_else(|_| HttpResponse::InternalServerError().finish())
}
Example 2: Safe dynamic updates using expression attribute names/values
use actix_web::{web, HttpResponse};
use aws_sdk_dynamodb::{types::AttributeValue, Client};
use std::collections::HashMap;
// Build a SET expression plus aligned name/value maps. Producing all three
// in one place keeps placeholders, attribute names, and values in sync
// even when the caller filters the input dynamically.
fn build_update_expression(
updates: &serde_json::Map<String, serde_json::Value>,
) -> Result<(String, HashMap<String, String>, HashMap<String, AttributeValue>), &'static str> {
let mut expression = String::from("SET ");
let mut names = HashMap::new();
let mut values = HashMap::new();
for (i, (k, v)) in updates.iter().enumerate() {
if i > 0 {
expression.push_str(", ");
}
let attr_name = format!("#field{}", i);
let val_name = format!(":val{}", i);
expression.push_str(&format!("{} = {}", attr_name, val_name));
names.insert(attr_name, k.clone());
// Explicitly validate types; never pass raw user JSON into expression values
let s = v.as_str().ok_or("value must be a string")?;
values.insert(val_name, AttributeValue::S(s.to_string()));
}
Ok((expression, names, values))
}
async fn update_user_preferences(
ddb: web::Data<Client>,
path: web::Path<String>,
body: web::Json<serde_json::Map<String, serde_json::Value>>,
) -> HttpResponse {
let user_id = path.into_inner();
// Allow-list keys to prevent injection of special or privileged properties
let allowed = ["email", "theme", "notifications"];
let filtered: serde_json::Map<String, serde_json::Value> = body
.iter()
.filter(|(k, _)| allowed.contains(&k.as_str()))
.map(|(k, v)| (k.clone(), v.clone()))
.collect();
if filtered.is_empty() {
return HttpResponse::BadRequest().finish();
}
let (expr, names, values) = match build_update_expression(&filtered) {
Ok(parts) => parts,
Err(_) => return HttpResponse::BadRequest().finish(),
};
ddb.update_item()
.table_name("Users")
.key("user_id", AttributeValue::S(user_id))
.update_expression(expr)
.set_expression_attribute_names(Some(names))
.set_expression_attribute_values(Some(values))
.condition_expression("attribute_exists(user_id)")
.send()
.await
.map(|_| HttpResponse::Ok().finish())
.unwrap_or_else(|_| HttpResponse::InternalServerError().finish())
}
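The placeholder scheme in Example 2 can be isolated into a dependency-free sketch (simplified to string pairs; the function name is hypothetical) to show how the expression, attribute names, and values stay aligned so user keys never appear verbatim in the update expression:

```rust
// Each filtered (key, value) pair yields a "#fieldN = :valN" clause plus
// matching entries in the name and value maps. The user-supplied key only
// ever appears as a mapped attribute name, never inside the expression text.
fn build_set_expression(
    updates: &[(String, String)],
) -> (String, Vec<(String, String)>, Vec<(String, String)>) {
    let mut expr = String::from("SET ");
    let mut names = Vec::new();
    let mut values = Vec::new();
    for (i, (k, v)) in updates.iter().enumerate() {
        if i > 0 {
            expr.push_str(", ");
        }
        expr.push_str(&format!("#field{i} = :val{i}"));
        names.push((format!("#field{i}"), k.clone()));
        values.push((format!(":val{i}"), v.clone()));
    }
    (expr, names, values)
}
```

Because the name map is generated in the same loop as the expression, a request supplying only "theme" cannot silently bind #field0 to the wrong attribute, which is the risk of hardcoding the name map separately.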
Example 3: Avoiding merges; using patch-like semantics with validation
use actix_web::{web, HttpResponse};
use aws_sdk_dynamodb::{types::AttributeValue, Client};
use std::collections::HashMap;
fn apply_validated_patch(
base: &mut serde_json::Map<String, serde_json::Value>,
patch: &serde_json::Map<String, serde_json::Value>,
) -> Result<(), &'static str> {
for (key, value) in patch {
if !["email", "locale", "timezone"].contains(&key.as_str()) {
return Err("disallowed key in patch");
}
if value.is_string() {
base.insert(key.clone(), value.clone());
} else {
return Err("value must be a string");
}
}
Ok(())
}
async fn patch_user(
ddb: web::Data<Client>,
path: web::Path<String>,
patch: web::Json<serde_json::Value>,
) -> HttpResponse {
let user_id = path.into_inner();
let mut base = serde_json::Map::new();
base.insert("user_id".to_string(), serde_json::Value::String(user_id));
// as_object() returns an Option; non-object payloads are rejected
let Some(patch_map) = patch.as_object() else {
return HttpResponse::BadRequest().finish();
};
if apply_validated_patch(&mut base, patch_map).is_err() {
return HttpResponse::BadRequest().finish();
}
// Every value in `base` is a string by construction; map explicitly
let item: HashMap<String, AttributeValue> = base
.iter()
.filter_map(|(k, v)| v.as_str().map(|s| (k.clone(), AttributeValue::S(s.to_string()))))
.collect();
match ddb.put_item().table_name("Users").set_item(Some(item)).send().await {
Ok(_) => HttpResponse::Accepted().finish(),
Err(_) => HttpResponse::InternalServerError().finish(),
}
}
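Stripped of the AWS SDK and serde types, the patch-validation idea in Example 3 amounts to the following std-only sketch (flat string maps; names hypothetical):

```rust
use std::collections::BTreeMap;

// Patch-style update without a generic merge: only allow-listed fields are
// copied into the base map; any other key aborts the patch with an error.
fn apply_patch(
    base: &mut BTreeMap<String, String>,
    patch: &BTreeMap<String, String>,
    allowed: &[&str],
) -> Result<(), &'static str> {
    for (key, value) in patch {
        if !allowed.contains(&key.as_str()) {
            return Err("disallowed key in patch");
        }
        base.insert(key.clone(), value.clone());
    }
    Ok(())
}
```

Contrast this with the naive merge shown earlier in the document: the only structural difference is the allow-list check, yet it is what keeps __proto__ and privileged flags out of the item.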
Key remediation principles:
- Do not use generic merge functions on user-controlled JSON; prefer deserialization into known structs.
- When dynamic updates are necessary, use DynamoDB expression attribute names/values and explicitly validate allowed fields and types.
- Treat incoming data as untrusted; sanitize before incorporating into condition expressions or item keys.
- Leverage strongly typed builders for DynamoDB inputs (e.g., PutItemInput) rather than raw serde_json::Value merging.