Regex DoS in Axum with API Keys
Regex DoS in Axum with API Keys — how this specific combination creates or exposes the vulnerability
A regular-expression denial of service (ReDoS) occurs when a regular expression accepts input that causes catastrophic backtracking, consuming excessive CPU for crafted payloads. Note that Rust's default `regex` crate guarantees linear-time matching, so the classic backtracking blow-up applies chiefly when a backtracking engine such as `fancy-regex` or PCRE bindings is in use; even with the default crate, permissive patterns on unbounded input still cost avoidable CPU per request. In Axum, combining custom extractor logic for API keys with permissive or complex regex patterns can expose this class of problem. For example, an extractor that validates key segments with a pattern like `(a+)+` can, on a backtracking engine, be driven into exponential behavior by inputs such as `aaaa...x`, where the final character prevents a full match and forces the engine to explore an exponential number of ways to split the run of `a`s.
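To make the blow-up concrete: a run of n `a`s can be split into the non-empty groups that `(a+)+` permits in 2^(n-1) ways, and on a failing input a backtracking engine may try every split. A minimal sketch of that count (the function name is illustrative, not from any library):

```rust
// Counts the candidate splits a backtracking engine can explore when
// matching a run of n 'a's against `(a+)+`: each of the n-1 gaps between
// characters is either a group boundary or not, giving 2^(n-1) splits.
fn backtracking_paths(n: u32) -> u64 {
    assert!(n >= 1 && n <= 64);
    1u64 << (n - 1)
}

fn main() {
    for n in [10u32, 20, 30, 40] {
        println!("n = {n:2}: up to {} candidate splits", backtracking_paths(n));
    }
}
```

At n = 40 the engine may face half a trillion splits, which is why a 41-byte header can pin a CPU core.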
When API keys are validated via regex in Axum extractors, the risk is twofold: malformed keys from unauthenticated requests can trigger expensive matching on the async runtime's worker threads, starving other requests, and keys embedded in path or query parameters may be matched with greedy, unbounded quantifiers that lack length constraints. Consider an extractor defined as:
```rust
use axum::extract::FromRequestParts;
use http::{request::Parts, StatusCode};
use regex::Regex;

#[derive(Debug)]
struct ApiKey(String);

#[async_trait::async_trait]
impl<S> FromRequestParts<S> for ApiKey
where
    S: Send + Sync,
{
    type Rejection = (StatusCode, String);

    async fn from_request_parts(parts: &mut Parts, _state: &S) -> Result<Self, Self::Rejection> {
        let header_value = parts
            .headers
            .get("X-API-KEY")
            .and_then(|v| v.to_str().ok())
            .unwrap_or("");
        // Risky on two counts: the regex is recompiled on every request, and the
        // repeated bounded group invites catastrophic backtracking on backtracking
        // engines (e.g. fancy-regex). Rust's default `regex` crate matches in
        // linear time, but the per-request compilation is avoidable work an
        // attacker can amplify with a flood of unauthenticated requests.
        let pattern = Regex::new(r"^(?:[a-zA-Z0-9_-]{20,64})+$").unwrap();
        if pattern.is_match(header_value) {
            Ok(ApiKey(header_value.to_string()))
        } else {
            Err((StatusCode::UNAUTHORIZED, "Invalid API key".to_string()))
        }
    }
}
```
If the pattern is made more permissive (for example, to allow mixed separators) without bounding repetition, an attacker can send keys that force a backtracking regex engine into exponential behavior, and even a linear-time engine into needlessly large per-request work. This manifests as high CPU on the handling route, effectively a denial of service for legitimate traffic. middleBrick scans include checks that flag unsafe regex constructs and excessive quantifier overlap, mapping findings to the OWASP API Security Top 10 and noting potential impact under Vulnerable and Outdated Components.
Additionally, if API keys are used to parameterize routes or queries without normalization, ambiguous routing patterns in Axum can compound ReDoS by increasing the number of regex evaluations per request. For instance, combining path parameters like /{key_id} with key-format validation in the same extractor may lead to multiple overlapping regex checks. middleBrick’s runtime analysis cross-references spec definitions and observed requests to highlight such combinations, helping teams prioritize fixes that reduce backtracking surface.
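Because any engine's cost scales with input size, the cheapest way to shrink the backtracking surface described above is to gate length and character class before any regex runs. A minimal sketch, assuming a 64-byte maximum and the key alphabet used in this article (the function name `precheck_key` is hypothetical):

```rust
// Maximum accepted key length; anything longer is rejected before matching.
const MAX_KEY_LEN: usize = 64;

// Cheap O(n) gate with no regex involved: bounds the worst-case work any
// downstream pattern (backtracking or not) can be made to do.
fn precheck_key(candidate: &str) -> Option<&str> {
    if candidate.is_empty() || candidate.len() > MAX_KEY_LEN {
        return None;
    }
    if candidate
        .bytes()
        .all(|b| b.is_ascii_alphanumeric() || b == b'_' || b == b'-')
    {
        Some(candidate)
    } else {
        None
    }
}

fn main() {
    assert!(precheck_key("valid_key-12345678901").is_some());
    assert!(precheck_key(&"a".repeat(200)).is_none());
    assert!(precheck_key("bad key!").is_none());
}
```

Only candidates that pass this gate need to reach the full format regex, which keeps attacker-controlled cost bounded regardless of the pattern.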
API Key-Specific Remediation in Axum — concrete code fixes
To mitigate ReDoS in Axum when handling API keys, prefer bounded, non-overlapping patterns and avoid nested quantifiers. Use explicit length limits, and validate key length and structure cheaply before applying any regex. Atomic groups can help on engines that support them (such as `fancy-regex`); the default `regex` crate never backtracks and so does not need them. Below are concrete, safe patterns and extractor implementations.
Safe regex patterns for API keys:
- Use non-overlapping, bounded quantifiers: `[a-zA-Z0-9_-]{20,64}` instead of nested repeats like `([a-zA-Z0-9_-]{5,})+`.
- Avoid ambiguous alternation that can cause exponential exploration; prefer explicit character classes and fixed-length segments.
- Pre-compile the regex once (for example in a `LazyLock` or shared application state) and use `regex::RegexBuilder` to set a `size_limit` on the compiled pattern.
Example: bounded extractor with safe regex:
```rust
use axum::extract::FromRequestParts;
use http::{request::Parts, StatusCode};
use regex::Regex;
use std::sync::Arc;

#[derive(Debug, Clone)]
struct SafeApiKeyValidator {
    pattern: Regex,
}

impl SafeApiKeyValidator {
    fn new() -> Self {
        // Bounded pattern: no nested quantifiers, no overlapping alternatives.
        let pattern = Regex::new(r"^[a-zA-Z0-9_-]{20,64}$").expect("valid regex");
        Self { pattern }
    }

    fn validate(&self, key: &str) -> bool {
        self.pattern.is_match(key)
    }
}

#[derive(Debug, Clone)]
struct ApiKey(String);

#[async_trait::async_trait]
impl<S> FromRequestParts<S> for ApiKey
where
    S: Send + Sync,
{
    type Rejection = (StatusCode, String);

    async fn from_request_parts(parts: &mut Parts, _state: &S) -> Result<Self, Self::Rejection> {
        // The validator is compiled once at startup and shared via an Extension
        // layer, so no regex compilation happens on the request path.
        let validator = parts
            .extensions
            .get::<Arc<SafeApiKeyValidator>>()
            .ok_or_else(|| {
                (
                    StatusCode::INTERNAL_SERVER_ERROR,
                    "validator not configured".to_string(),
                )
            })?;
        let header_value = parts
            .headers
            .get("X-API-KEY")
            .and_then(|v| v.to_str().ok())
            .unwrap_or("");
        if validator.validate(header_value) {
            Ok(ApiKey(header_value.to_string()))
        } else {
            Err((StatusCode::UNAUTHORIZED, "Invalid API key".to_string()))
        }
    }
}

// In your router setup (note: `.layer` only applies to routes added before it):
// let validator = Arc::new(SafeApiKeyValidator::new());
// let app = Router::new()
//     .route("/items", get(handler))
//     .layer(Extension(validator));
```
Additional hardening steps:
- Reject oversized or malformed keys with a cheap length and character-class check before any regex runs, so attacker-controlled input cannot reach the pattern matcher at all.
- Rate-limit validation attempts per source IP to reduce abuse potential.
- Use constant-time comparison when checking key equality after regex validation to avoid timing side channels.
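The constant-time comparison in the last point can be sketched without extra dependencies by OR-folding byte differences, though a vetted crate such as `subtle` is preferable in production; `constant_time_eq` here is a hand-rolled illustration:

```rust
// Compares two byte strings in time that depends only on their lengths, not on
// where they first differ, by OR-folding the XOR of every byte pair.
// The early return on length mismatch leaks only the length, which is not
// secret here since the key format (20-64 chars) is public.
fn constant_time_eq(a: &[u8], b: &[u8]) -> bool {
    if a.len() != b.len() {
        return false;
    }
    let mut diff = 0u8;
    for (x, y) in a.iter().zip(b.iter()) {
        diff |= x ^ y;
    }
    diff == 0
}

fn main() {
    assert!(constant_time_eq(b"key-123", b"key-123"));
    assert!(!constant_time_eq(b"key-123", b"key-124"));
    assert!(!constant_time_eq(b"key-123", b"key-12"));
}
```

A naive `==` on byte slices can short-circuit at the first mismatch, which is the timing signal this construction removes.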
middleBrick’s Pro plan supports continuous monitoring and CI/CD integration (e.g., GitHub Action) to enforce security thresholds and fail builds on risky patterns, helping teams catch ReDoS and related issues before deployment. Its findings map to compliance frameworks such as PCI-DSS and SOC2, providing remediation guidance rather than attempting to fix or block traffic directly.
Related CWEs: Input Validation
| CWE ID | Name | Severity |
|---|---|---|
| CWE-20 | Improper Input Validation | HIGH |
| CWE-22 | Path Traversal | HIGH |
| CWE-74 | Injection | CRITICAL |
| CWE-77 | Command Injection | CRITICAL |
| CWE-78 | OS Command Injection | CRITICAL |
| CWE-79 | Cross-site Scripting (XSS) | HIGH |
| CWE-89 | SQL Injection | CRITICAL |
| CWE-90 | LDAP Injection | HIGH |
| CWE-91 | XML Injection | HIGH |
| CWE-94 | Code Injection | CRITICAL |
Frequently Asked Questions
Why are nested quantifiers in regex patterns a risk for API key validation in Axum?

Nested quantifiers (for example, `(a+)+`) can cause catastrophic backtracking when a backtracking regex engine explores many paths to match an input. In Axum, if API key validation uses such patterns with a backtracking engine, crafted keys can consume excessive CPU, leading to a regular-expression denial of service. Prefer bounded, non-overlapping patterns with explicit lengths to avoid this.

How can I test my Axum API key regex for ReDoS before deploying?

Run middleBrick's scanner (`middlebrick scan <url>`) to include runtime detection of unsafe regex constructs as part of your CI/CD pipeline; the Pro plan enables automated scans on a schedule and fails builds if risk thresholds are exceeded.