
Zip Slip on AWS

How Zip Slip Manifests in AWS

Zip Slip vulnerabilities in AWS environments typically occur when applications process user-supplied archive files without validating the paths of extracted members. A malicious archive can use path-traversal member names to write files outside the intended extraction directory, potentially overwriting critical system or configuration files, or planting code that later executes in a privileged context.

In AWS-specific scenarios, Zip Slip often appears in Lambda functions that process uploaded archives, ECS tasks that unpack deployment packages, or EC2 instances handling file uploads. The vulnerability is particularly dangerous in containerized AWS environments, where attackers may be able to overwrite entrypoint scripts or configuration files that execute during container startup.

Consider a typical Lambda function that processes uploaded zip files:

import zipfile
import os
import boto3

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    bucket = event['bucket']
    key = event['key']
    
    # Download zip to /tmp (Lambda writable space)
    local_zip = '/tmp/uploaded.zip'
    s3.download_file(bucket, key, local_zip)
    
    # Extract without validation - ZIP SLIP VULNERABLE!
    with zipfile.ZipFile(local_zip, 'r') as zip_ref:
        zip_ref.extractall('/tmp/unpacked')
    
    return {'status': 'success'}

This code is risky because it trusts the member paths stored inside the archive. An attacker could craft a zip containing entries named ../../etc/passwd or ../../var/runtime/bootstrap in an attempt to write files outside the intended directory. (Recent CPython releases strip absolute paths and ".." components in zipfile's extract methods, but tarfile without extraction filters and many third-party extraction libraries do not, so explicit path validation remains essential defense in depth.)
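To appreciate how little effort the attack takes, consider how such an archive is produced. The sketch below (build_malicious_zip is a hypothetical illustration, not an actual exploit tool) shows that zipfile imposes no restrictions on member names at write time:

```python
import io
import zipfile

def build_malicious_zip():
    """Build an archive whose member name climbs out of the extraction dir."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, 'w') as zf:
        # zipfile happily *writes* arbitrary member names; the danger lies
        # entirely in extraction code that trusts those names.
        zf.writestr('../../var/runtime/bootstrap', b'#!/bin/sh\necho pwned\n')
    return buf.getvalue()

payload = build_malicious_zip()
with zipfile.ZipFile(io.BytesIO(payload)) as zf:
    print(zf.namelist())  # ['../../var/runtime/bootstrap']
```

Any extractor that joins these member names to its target directory without validation will write the file two levels above it.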

In ECS environments, Zip Slip might manifest during deployment processes where container images are unpacked or when applications process configuration archives. For example, a Node.js application running in ECS that extracts user-provided tarballs could be exploited to overwrite the application's entrypoint script.
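Although the ECS example above is Node.js, the same hazard is easy to reproduce with Python's tarfile, which trusted member names by default until extraction filters arrived (PEP 706, added in Python 3.12 and backported to security releases of 3.8+). A hedged sketch of filter-aware extraction (extract_tarball is an illustrative helper), with a hand-rolled fallback for older interpreters:

```python
import tarfile

def extract_tarball(tar_path, target_dir):
    """Extract a tarball while refusing members that escape target_dir."""
    with tarfile.open(tar_path) as tf:
        try:
            # The 'data' filter (PEP 706) rejects absolute names, '..'
            # components, and links pointing outside the destination.
            tf.extractall(target_dir, filter='data')
        except TypeError:
            # Interpreter predates extraction filters: validate by hand.
            for member in tf.getmembers():
                parts = member.name.split('/')
                if member.name.startswith('/') or '..' in parts:
                    raise ValueError(f"Unsafe member path: {member.name}")
            tf.extractall(target_dir)
```

The manual fallback is deliberately conservative; it does not handle every link trick the 'data' filter covers, which is one more reason to run a current interpreter.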

AWS-specific attack vectors include:

  • Lambda execution environment manipulation by writing attacker-controlled files into writable paths (note that outside /tmp the Lambda filesystem is read-only, which limits but does not eliminate this vector)
  • ECS task manipulation by modifying entrypoint scripts or environment files
  • EC2 instance compromise through overwriting system binaries or configuration files
  • CodeBuild/CodePipeline injection during artifact processing
  • AWS SAM/CloudFormation template injection through malicious archive contents

AWS-Specific Detection

Detecting Zip Slip vulnerabilities in AWS environments requires both static analysis of code and dynamic scanning of running services. For Lambda functions, middleBrick's black-box scanning can identify vulnerable code patterns by analyzing the function's execution surface and testing for path traversal vulnerabilities.

middleBrick specifically tests for Zip Slip by:

  • Scanning Lambda functions for archive processing code that uses unsafe extraction methods
  • Testing API endpoints that accept file uploads for path traversal vulnerabilities
  • Analyzing OpenAPI specifications for endpoints that handle archive files
  • Checking for LLM/AI endpoints that might process archive contents containing malicious prompts
  • Running active exploitation attempts using controlled malicious archives to verify vulnerability presence

For manual detection, search your AWS code for these patterns:

# Risky patterns to search for:
zipfile.ZipFile.extractall()  # no explicit path validation
tarfile.TarFile.extractall()  # vulnerable without an extraction filter
zipfile.ZipFile.extract()     # individual extraction without validation
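The patterns above lend themselves to a quick automated sweep. The following is a rough heuristic sketch (find_risky_extractions is an illustrative helper, not part of middleBrick's analysis), so expect false positives that need human triage:

```python
import pathlib
import re

# Heuristic: flag every extract()/extractall() call site for manual review.
RISKY_CALL = re.compile(r"\.extract(all)?\s*\(")

def find_risky_extractions(root):
    """Return (file, line number, source line) for each match under root."""
    hits = []
    for path in sorted(pathlib.Path(root).rglob('*.py')):
        text = path.read_text(errors='replace')
        for lineno, line in enumerate(text.splitlines(), 1):
            if RISKY_CALL.search(line):
                hits.append((str(path), lineno, line.strip()))
    return hits
```

Each hit is a place to confirm that member paths are validated before extraction, not proof of a vulnerability by itself.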

middleBrick's scanning methodology includes:

Check Type         AWS-Specific Target        Detection Method
Static Analysis    Lambda function code       Pattern matching for unsafe extraction methods
API Scanning       API Gateway endpoints      File upload testing with malicious archives
Runtime Analysis   ECS task definitions       Configuration file validation
LLM Security       Amazon Bedrock endpoints   Prompt injection through archive contents

For comprehensive AWS security, integrate middleBrick's CLI into your deployment pipeline:

# Scan Lambda functions before deployment
middlebrick scan --target arn:aws:lambda:us-east-1:123456789012:function:my-function

# Scan API Gateway endpoints
middlebrick scan --target https://my-api-gateway.execute-api.us-east-1.amazonaws.com/prod

# Continuous monitoring with GitHub Action
- name: AWS API Security Scan
  uses: middlebrick/middlebrick-action@v1
  with:
    target: https://my-ecs-service.region.amazonaws.com
    fail-on-severity: high

AWS-Specific Remediation

Remediating Zip Slip vulnerabilities in AWS environments requires validating member paths and using secure extraction methods. Python's standard library (zipfile, tarfile, pathlib) provides everything needed to handle archive files safely.

Safe Lambda implementation:

import zipfile
import os
import boto3
import pathlib

def safe_extract(zip_path, target_dir):
    """Securely extract zip file with path validation"""
    target_dir = pathlib.Path(target_dir).resolve()
    
    with zipfile.ZipFile(zip_path, 'r') as zip_ref:
        for member in zip_ref.namelist():
            # Resolve path and check if it's within target directory
            member_path = pathlib.Path(member)
            if member_path.is_absolute() or '..' in member_path.parts:
                raise ValueError(f"Path traversal attempt detected: {member}")
            
            # Construct full path
            full_path = (target_dir / member_path).resolve()
            
            # Verify the resolved path stays inside the target directory
            # (Path.is_relative_to requires Python 3.9+)
            if not full_path.is_relative_to(target_dir):
                raise ValueError(f"Path traversal attempt detected: {full_path}")
            
            # Extract file
            zip_ref.extract(member, target_dir)

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    bucket = event['bucket']
    key = event['key']
    
    local_zip = '/tmp/uploaded.zip'
    s3.download_file(bucket, key, local_zip)
    
    try:
        safe_extract(local_zip, '/tmp/unpacked')
        return {'status': 'success'}
    except ValueError as e:
        return {'status': 'error', 'message': str(e)}
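A quick way to convince yourself the validation works is to feed it both a benign and a traversal archive. The snippet below re-declares the core of safe_extract (condensed from the Lambda example so it runs standalone) plus a small make_zip helper:

```python
import io
import os
import pathlib
import tempfile
import zipfile

def safe_extract(zip_path, target_dir):
    # Same validation logic as the Lambda example above, condensed.
    target_dir = pathlib.Path(target_dir).resolve()
    with zipfile.ZipFile(zip_path, 'r') as zip_ref:
        for member in zip_ref.namelist():
            member_path = pathlib.Path(member)
            if member_path.is_absolute() or '..' in member_path.parts:
                raise ValueError(f"Path traversal attempt detected: {member}")
            full_path = (target_dir / member_path).resolve()
            if not full_path.is_relative_to(target_dir):
                raise ValueError(f"Path traversal attempt detected: {full_path}")
            zip_ref.extract(member, target_dir)

def make_zip(names):
    """Build an in-memory archive with the given member names."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, 'w') as zf:
        for name in names:
            zf.writestr(name, b'data')
    buf.seek(0)
    return buf

with tempfile.TemporaryDirectory() as tmp:
    safe_extract(make_zip(['ok.txt']), tmp)           # benign member extracts
    try:
        safe_extract(make_zip(['../evil.txt']), tmp)  # traversal is rejected
    except ValueError as exc:
        print('blocked:', exc)  # blocked: Path traversal attempt detected: ../evil.txt
```

Wiring a check like this into a unit test keeps the validation from regressing when the extraction code is later refactored.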

For ECS and EC2 environments, consider these additional security measures:

  • Use AWS Secrets Manager or Parameter Store for configuration instead of file-based configs
  • Implement IAM roles with least privilege to limit impact of potential exploitation
  • Use read-only filesystem mounts where possible for archive processing directories
  • Implement content validation before processing archives (file type checking, size limits)
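The last bullet, content validation before processing, can be sketched as a pre-flight check. The limits below are illustrative assumptions to tune per workload, not prescriptions:

```python
import zipfile

# Illustrative limits; tune for your workload.
MAX_MEMBERS = 1_000
MAX_UNCOMPRESSED_BYTES = 100 * 1024 * 1024  # 100 MB
MAX_COMPRESSION_RATIO = 100  # guards against zip bombs

def preflight_check(zip_file):
    """Reject archives that are overstuffed, oversized, or bomb-like."""
    with zipfile.ZipFile(zip_file) as zf:
        infos = zf.infolist()
        if len(infos) > MAX_MEMBERS:
            raise ValueError(f"Too many members: {len(infos)}")
        total = sum(i.file_size for i in infos)
        if total > MAX_UNCOMPRESSED_BYTES:
            raise ValueError(f"Uncompressed size too large: {total}")
        compressed = sum(i.compress_size for i in infos) or 1
        if total / compressed > MAX_COMPRESSION_RATIO:
            raise ValueError("Suspicious compression ratio (possible zip bomb)")
```

Note that this reads only the central directory metadata, so it is cheap to run before any bytes are extracted; it complements rather than replaces per-member path validation.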

middleBrick's remediation guidance includes:

Risk Level   Remediation Priority   Implementation Complexity
Critical     Immediate              High - requires code changes
High         Within 24-48 hours     Medium - add validation layers
Medium       This week              Low - implement monitoring

For automated remediation workflows, use middleBrick's GitHub Action with security gates:

name: AWS Security Scan
on: [push, pull_request]

jobs:
  security-scan:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v4
    
    - name: AWS Lambda Security Scan
      uses: middlebrick/middlebrick-action@v1
      with:
        target: arn:aws:lambda:us-east-1:123456789012:function:my-function
        fail-on-severity: high
        
    - name: AWS API Gateway Scan
      uses: middlebrick/middlebrick-action@v1
      with:
        target: https://my-api-gateway.execute-api.us-east-1.amazonaws.com/prod
        fail-on-severity: medium

Frequently Asked Questions

Can Zip Slip vulnerabilities in AWS Lambda functions be exploited to access other Lambda functions?
No, Lambda functions run in isolated execution environments, and those isolation boundaries prevent cross-function exploitation. However, Zip Slip can still be used to modify files in the current function's writable storage (/tmp), which persists across warm invocations, potentially allowing persistence or the poisoning of data the function later trusts within that specific execution environment.
Does middleBrick's LLM security scanning detect Zip Slip in AI/ML models deployed on AWS?
Yes, middleBrick's LLM security module specifically tests for archive-based attacks that could inject malicious prompts or training data into AI/ML models. It checks for system prompt leakage and prompt injection vulnerabilities that could be exploited through archive processing in Amazon Bedrock or SageMaker endpoints.