How to Fix Python Docker Container Crash on Startup: ModuleNotFoundError and Entry Point Issues



Step 1: Understanding the Error


When your Dockerized Python application crashes immediately after starting, you'll typically see one of these error patterns in your container logs:

$ docker run my-python-app
Traceback (most recent call last):
  File "/app/main.py", line 2, in <module>
    import pandas as pd
ModuleNotFoundError: No module named 'pandas'


Or perhaps this startup failure:

$ docker run my-python-app
python: can't open file '/app/main.py': [Errno 2] No such file or directory


Sometimes the container exits silently with just an exit code:

$ docker ps -a
CONTAINER ID   IMAGE           COMMAND                  CREATED          STATUS                      PORTS     NAMES
a1b2c3d4e5f6   my-python-app   "python main.py"         5 seconds ago    Exited (1) 4 seconds ago              quirky_newton


These startup crashes happen when the container can't find your Python modules or scripts, or when its entry point is misconfigured. The container starts, attempts to run your Python application, hits the error, and immediately exits.
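

Before editing anything, it helps to capture the exit code and the last log lines from the stopped container; the quirky_newton name below is the auto-generated one from the docker ps listing above:

# Show the last output the container produced before exiting
$ docker logs quirky_newton

# Read the recorded exit code (matches the "Exited (1)" status above)
$ docker inspect --format '{{.State.ExitCode}}' quirky_newton
1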


Step 2: Identifying the Cause


Docker container startup crashes in Python applications stem from several common misconfigurations. The build context might be correct, but the runtime environment fails due to missing dependencies, incorrect file paths, or improperly configured working directories.


Here's a problematic Dockerfile that reproduces the crash:

# Broken Dockerfile - This will crash on startup
FROM python:3.11-slim

# Missing WORKDIR directive causes path issues
COPY requirements.txt .
RUN pip install -r requirements.txt

# Files copied to root directory instead of /app
COPY . .

# Entry point references non-existent path
CMD ["python", "/app/main.py"]


The corresponding requirements.txt:

pandas==2.1.0
requests==2.31.0
numpy==1.24.3


And a sample main.py that will fail:

# main.py - This will crash due to import errors
import sys
import pandas as pd  # Module not found if pip install failed
from src.utils import process_data  # Path error if structure is wrong

def main():
    print(f"Python version: {sys.version}")
    data = pd.DataFrame({'test': [1, 2, 3]})
    process_data(data)
    
if __name__ == "__main__":
    main()


Running this configuration produces the startup crash:

$ docker build -t broken-app .
$ docker run broken-app
python: can't open file '/app/main.py': [Errno 2] No such file or directory


The container crashes because the CMD instruction looks for main.py in /app, but we never created that directory or set it as WORKDIR. Even with the path fixed, module import errors can still appear if the dependencies never made it into the image's Python environment during the build.
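

You can confirm the diagnosis without rebuilding by overriding the command and inspecting the image's filesystem (the error shown is the standard GNU ls message):

# /app was never created, so the CMD path cannot resolve
$ docker run --rm broken-app ls /app
ls: cannot access '/app': No such file or directory

# The bare COPY . . placed main.py in the image root instead
$ docker run --rm broken-app ls /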


Step 3: Implementing the Solution


The fix requires a proper directory structure, explicit dependency management, and basic error handling. Start by creating a robust Dockerfile with an explicit working directory and a cache-friendly layer order:

# Fixed Dockerfile - Production-ready configuration
FROM python:3.11-slim

# Set working directory first - critical for path resolution
WORKDIR /app

# Copy and install dependencies separately for better caching
COPY requirements.txt .
# --no-cache-dir keeps pip's download cache out of the image layer
RUN pip install --no-cache-dir -r requirements.txt

# Copy application code after dependencies
COPY . .

# Run main.py from the working directory set above, with unbuffered output
CMD ["python", "-u", "main.py"]


The -u flag ensures Python runs in unbuffered mode, making logs appear immediately. This helps with debugging container crashes.
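

If you prefer not to rely on the command-line flag, the same behavior can be baked into the image with the environment variable used in the Compose example later in this guide:

# Equivalent to passing -u on every python invocation
ENV PYTHONUNBUFFERED=1
CMD ["python", "main.py"]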


Update your project structure to match Docker's expectations:

project/
├── Dockerfile
├── requirements.txt
├── main.py
├── src/
│   ├── __init__.py
│   └── utils.py
└── .dockerignore
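

With this layout, from src.utils import process_data resolves because Python adds the directory containing main.py to sys.path at startup. A quick sanity check from the project root before building:

# Confirms the src package is discoverable (no third-party imports needed)
$ python -c "import src; print('src package found')"
src package found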


Create a .dockerignore file to prevent unnecessary files from bloating your image:

# .dockerignore - Exclude non-essential files
__pycache__
*.pyc
*.pyo
*.pyd
.Python
env/
venv/
.venv/
pip-log.txt
pip-delete-this-directory.txt
.tox/
.coverage
.coverage.*
.cache
.pytest_cache/
nosetests.xml
coverage.xml
*.cover
*.log
.git
.gitignore
.mypy_cache
.pytest_cache
.hypothesis


Implement error handling in your main.py to catch import issues:

# main.py - Robust entry point with error handling
import sys
import os
import traceback

def check_environment():
    """Verify runtime environment before main execution"""
    print(f"Python version: {sys.version}")
    print(f"Working directory: {os.getcwd()}")
    print(f"Directory contents: {os.listdir('.')}")
    
    # Check if src module is accessible
    if not os.path.exists('src'):
        print("WARNING: src directory not found")
        return False
    return True

try:
    # Import dependencies with error catching
    import pandas as pd
    import requests
    import numpy as np
    from src.utils import process_data
    
except ImportError as e:
    print(f"Import Error: {e}")
    print("Checking installed packages:")
    os.system("pip list")
    sys.exit(1)

def main():
    """Main application logic"""
    if not check_environment():
        print("Environment check failed")
        sys.exit(1)
    
    try:
        # Your application logic here
        data = pd.DataFrame({'values': [1, 2, 3, 4, 5]})
        print(f"DataFrame created: {data.shape}")
        
        result = process_data(data)
        print(f"Processing complete: {result}")
        
    except Exception as e:
        print(f"Runtime error: {e}")
        traceback.print_exc()
        sys.exit(1)

if __name__ == "__main__":
    print("Starting application...")
    main()
    print("Application completed successfully")


Create the supporting utils module:

# src/utils.py - Helper functions with error handling
import pandas as pd

def process_data(df):
    """Process DataFrame with validation"""
    if not isinstance(df, pd.DataFrame):
        raise ValueError("Input must be a pandas DataFrame")
    
    # Exercise pandas on the frame; report failures instead of raising
    try:
        df.describe()
        return f"Processed {len(df)} rows successfully"
    except Exception as e:
        return f"Processing failed: {e}"


Don't forget the __init__.py file that makes src a proper Python package:

# src/__init__.py - Makes src directory a Python package
# This file can be empty but must exist


Step 4: Working Code Example


Build and run the fixed container with comprehensive error checking:

# Build with detailed output
$ docker build -t python-app-fixed . --no-cache
[+] Building 15.2s (10/10) FINISHED
 => [internal] load build definition from Dockerfile
 => [internal] load .dockerignore
 => [internal] load metadata for docker.io/library/python:3.11-slim
 => [1/5] FROM docker.io/library/python:3.11-slim
 => [2/5] WORKDIR /app
 => [3/5] COPY requirements.txt .
 => [4/5] RUN pip install --no-cache-dir -r requirements.txt
 => [5/5] COPY . .
 => exporting to image
 => => naming to docker.io/library/python-app-fixed

# Run the container in the foreground
$ docker run --name test-app python-app-fixed
Starting application...
Python version: 3.11.5 (main, Aug 25 2023, 13:19:50) [GCC 12.2.0]
Working directory: /app
Directory contents: ['Dockerfile', 'main.py', 'requirements.txt', 'src', '.dockerignore']
DataFrame created: (5, 1)
Processing complete: Processed 5 rows successfully
Application completed successfully


For debugging persistent crashes, use these diagnostic commands:

# Check container logs after crash
$ docker logs test-app

# Run container with shell to inspect file system
$ docker run -it python-app-fixed /bin/bash
root@container:/app# ls -la
root@container:/app# python -c "import pandas; print(pandas.__version__)"
2.1.0

# Verify installed packages
$ docker run python-app-fixed pip list
Package         Version
--------------- -------
numpy           1.24.3
pandas          2.1.0
pip             23.2.1
python-dateutil 2.8.2
pytz            2023.3
requests        2.31.0
setuptools      68.2.0
six             1.16.0
urllib3         2.0.4
wheel           0.41.2


Step 5: Additional Tips & Related Errors


Multi-stage builds prevent dependency conflicts and reduce image size:

# Multi-stage Dockerfile for production
FROM python:3.11-slim as builder

WORKDIR /app
COPY requirements.txt .
RUN pip install --user --no-cache-dir -r requirements.txt

FROM python:3.11-slim

WORKDIR /app
# Copy only installed packages from builder
COPY --from=builder /root/.local /root/.local
COPY . .

# Ensure pip packages are in PATH
ENV PATH=/root/.local/bin:$PATH

CMD ["python", "-u", "main.py"]


Handle different Python versions and architecture-specific packages:

# Dockerfile with architecture handling
FROM --platform=linux/amd64 python:3.11-slim

WORKDIR /app

# Install system dependencies for packages like numpy/pandas
RUN apt-get update && apt-get install -y \
    gcc \
    g++ \
    && rm -rf /var/lib/apt/lists/*

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .
CMD ["python", "-u", "main.py"]


Common related errors and their fixes:

# Error: exec format error
$ docker run python-app
exec /usr/local/bin/python: exec format error

# Fix: Rebuild for the correct architecture
$ docker build --platform linux/amd64 -t python-app .

# Error: Permission denied
$ docker run python-app
python: can't open file '/app/main.py': [Errno 13] Permission denied

# Fix: The interpreter needs read access, so make the file readable and rebuild
$ chmod 644 main.py
$ docker build -t python-app .


Environment variable configuration for different deployment scenarios:

# main.py with environment configuration
import os

# Read configuration from environment
DEBUG = os.getenv('DEBUG', 'false').lower() == 'true'
DB_HOST = os.getenv('DB_HOST', 'localhost')
PORT = int(os.getenv('PORT', '8000'))

if DEBUG:
    print(f"Debug mode enabled")
    print(f"Environment variables: {dict(os.environ)}")


Run with environment variables:

$ docker run -e DEBUG=true -e DB_HOST=postgres -e PORT=5000 python-app-fixed


Docker Compose configuration for complex applications:

# docker-compose.yml
version: '3.8'
services:
  app:
    build: .
    environment:
      - DEBUG=true
      - PYTHONUNBUFFERED=1
    volumes:
      - ./src:/app/src  # Mount source for development
    command: python -u main.py


The PYTHONUNBUFFERED environment variable ensures all Python output appears immediately in Docker logs, making debugging much easier. Volume mounting allows live code changes during development without rebuilding the container.
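

To start the composed service with a fresh build, the standard invocation is:

# Rebuild the image and start the app service defined above
$ docker compose up --build

# Follow the service logs from another terminal
$ docker compose logs -f app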

