
Unicode Normalization in Axum with API Keys

Unicode Normalization in Axum with API Keys — how this specific combination creates or exposes the vulnerability

Unicode normalization inconsistencies can create security risks in Axum applications that use API keys for authentication. When an API key is accepted as user-controlled input and processed without normalization, canonically equivalent strings that look identical can map to different byte representations. This can lead to bypasses in which an attacker supplies a canonically equivalent key that passes one code path (for example, a lookup that normalizes) while differing in binary form from the stored key that another code path compares byte-for-byte.

In Axum, route extractors and middleware commonly read headers or path parameters to obtain API keys. If normalization is applied inconsistently between storage and comparison — for example, storing keys in NFC form but not normalizing incoming values before comparison — attackers can exploit canonical variants to gain access. For instance, the composed character é (U+00E9) and the decomposed sequence e (U+0065) followed by a combining acute accent (U+0301) are visually identical but distinct code point sequences. Without normalization, an API key containing such characters may be rejected when it should match, or matched inconsistently across code paths that normalize differently.

Normalization issues also interact with logging and observability. If the raw, unnormalized key is logged or echoed in error messages, it can expose subtle behavioral differences to an attacker probing authentication paths. In a security scan using middleBrick, such inconsistencies appear as findings in the Input Validation and Authentication checks, highlighting that endpoints with inconsistent normalization can permit unexpected access when equivalent keys are used.

Because middleBrick scans the unauthenticated attack surface and runs 12 security checks in parallel — including Authentication and Input Validation — it can detect these normalization-related discrepancies without credentials. Findings include guidance to normalize both stored and runtime values to a consistent Unicode form, typically NFC, to eliminate equivalence-based bypasses.

API Key-Specific Remediation in Axum — concrete code fixes

To secure API key handling in Axum, normalize all incoming key values before comparison or storage, using a well-tested Unicode normalization library to ensure a consistent binary representation. Below is a complete Axum example (written against axum 0.7, with the unicode-normalization and subtle crates) that demonstrates NFC normalization and constant-time API key validation.

use axum::{
    async_trait,
    extract::FromRequestParts,
    http::{request::Parts, StatusCode},
    routing::get,
    Router,
};
use std::sync::LazyLock;
use subtle::ConstantTimeEq;
use unicode_normalization::UnicodeNormalization;

/// The stored API key, normalized to NFC once at first use.
/// (LazyLock requires Rust 1.80+; once_cell::sync::Lazy works on older toolchains.)
static STORED_API_KEY: LazyLock<String> = LazyLock::new(|| "café".nfc().collect());

/// An API key extracted from a request header.
struct ApiKey(String);

#[async_trait]
impl<S> FromRequestParts<S> for ApiKey
where
    S: Send + Sync,
{
    type Rejection = (StatusCode, &'static str);

    async fn from_request_parts(parts: &mut Parts, _state: &S) -> Result<Self, Self::Rejection> {
        let header_value = parts
            .headers
            .get("X-API-Key")
            .ok_or((StatusCode::UNAUTHORIZED, "Missing API key"))?;
        let raw = header_value
            .to_str()
            .map_err(|_| (StatusCode::UNAUTHORIZED, "Invalid header"))?;

        // Normalize the incoming key to NFC before comparison.
        let normalized: String = raw.nfc().collect();

        // Constant-time comparison to reduce timing side channels.
        if STORED_API_KEY.as_bytes().ct_eq(normalized.as_bytes()).into() {
            Ok(ApiKey(normalized))
        } else {
            Err((StatusCode::UNAUTHORIZED, "Invalid API key"))
        }
    }
}

async fn handler(ApiKey(key): ApiKey) -> String {
    format!("Authenticated with key: {key}")
}

#[tokio::main]
async fn main() {
    let app = Router::new().route("/secure", get(handler));
    let listener = tokio::net::TcpListener::bind("0.0.0.0:3000").await.unwrap();
    axum::serve(listener, app).await.unwrap();
}

This approach ensures that both the stored key and the runtime value are in the same Unicode normalization form, mitigating bypass via canonical variants. It also uses constant-time comparison to reduce timing side channels. middleBrick can validate that such normalization is applied by testing endpoints with equivalent but differently encoded keys as part of its Input Validation and Authentication checks.

Additional remediation steps include avoiding normalization-sensitive operations on raw key bytes, ensuring consistent normalization in logging (log the normalized form), and documenting expected encoding for API consumers. In the Pro plan, continuous monitoring can alert if new endpoints introduce inconsistent handling, and the GitHub Action can fail builds when scans detect normalization-related findings.

Frequently Asked Questions

Why does Unicode normalization matter for API key security in Axum?
Normalization matters because visually identical strings can have different binary representations. If stored and runtime keys are not normalized to the same form (e.g., NFC), attackers can bypass authentication using canonical variants, leading to unauthorized access.
Can middleBrick detect Unicode normalization issues in Axum API key handling?
Yes. middleBrick runs parallel checks including Authentication and Input Validation and can identify inconsistencies that enable bypass via equivalent but differently encoded keys. Findings include remediation guidance to normalize consistently.