GitHub Actions for CI/CD¶
GitHub Actions provides cloud-based CI/CD automation directly integrated with your repository. Workflows are defined in YAML files and run when code is pushed, when pull requests are opened or updated, or on a schedule.
This guide covers practical patterns for implementing CI/CD pipelines using GitHub Actions for Python/Django applications.
Philosophy
GitHub Actions workflows should be simple, fast, and reliable. Every workflow run should be deterministic—the same code should produce the same result every time. Failures should be obvious and easy to debug.
Workflow Fundamentals¶
Workflow Structure¶
A GitHub Actions workflow consists of triggers, jobs, and steps:
name: Django CI/CD Pipeline
on:
push:
branches: [ master, develop ]
pull_request:
branches: [ master ]
workflow_dispatch: # Manual trigger
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: '3.13'
- name: Install dependencies
run: pip install -r requirements.txt
- name: Run tests
run: pytest
Key Components:
Triggers (on):
- push: Runs when code is pushed to specified branches
- pull_request: Runs when PRs are opened or updated
- workflow_dispatch: Enables manual workflow execution
- schedule: Runs on cron schedule (e.g., nightly builds)
Jobs:
- Run in parallel by default
- Can depend on other jobs with needs
- Execute on specified runner (ubuntu-latest, macos-latest, self-hosted)
- Share data using artifacts
Steps:
- Execute sequentially within a job
- Use actions from marketplace (uses) or shell commands (run)
- Share environment via environment variables
- Can be conditionally executed
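To illustrate how these pieces fit together, the sketch below combines a scheduled trigger, a needs dependency, and artifact sharing between jobs. The workflow name, cron expression, and script path are hypothetical:
name: Nightly report
on:
  schedule:
    - cron: '0 3 * * *'    # run every night at 03:00 UTC
  workflow_dispatch:        # allow manual runs as well
jobs:
  report:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      # Hypothetical script that writes a report file
      - run: python scripts/build_report.py > report.txt
      - uses: actions/upload-artifact@v3
        with:
          name: nightly-report
          path: report.txt
  notify:
    runs-on: ubuntu-latest
    needs: report           # runs only after report succeeds
    steps:
      - uses: actions/download-artifact@v3
        with:
          name: nightly-report
      - run: cat report.txt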
Workflow Organization¶
Single Workflow vs Multiple Workflows:
For small projects, a single workflow handles all CI/CD:
# .github/workflows/main.yml
jobs:
test: # Run tests
build: # Build containers
deploy: # Deploy to AWS
For larger projects, separate workflows for different concerns:
# .github/workflows/ci.yml - Run on every push
jobs:
lint:
test:
# .github/workflows/deploy.yml - Run on main branch only
jobs:
build:
deploy-staging:
deploy-production:
Workflow File Location:
- Must be in .github/workflows/ directory
- Files must have .yml or .yaml extension
- Names should be descriptive: ci.yml, deploy.yml, security.yml
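A typical repository layout for the multi-workflow approach (the comments describe one reasonable split of responsibilities):
.github/
  workflows/
    ci.yml         # lint + test on every push and pull request
    deploy.yml     # build + deploy, restricted to master
    security.yml   # scheduled dependency/security scans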
Test Job Patterns¶
Basic Test Job¶
The test job validates code correctness before deployment:
jobs:
test:
runs-on: ubuntu-latest
env:
DJANGO_SETTINGS_MODULE: myproject.settings.ci
services:
postgres:
image: postgres:15
env:
POSTGRES_PASSWORD: testpassword
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432
steps:
- uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: '3.13'
cache: 'pip'
- name: Install system dependencies
run: |
sudo apt-get update
sudo apt-get install -y \
libpq-dev \
python3-dev \
build-essential
- name: Install Python dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements/requirements-dev.txt
- name: Run migrations
run: python manage.py migrate
- name: Run tests
run: |
pytest \
--cov=myproject \
--cov-report=xml \
--cov-report=term-missing \
-v
- name: Upload coverage
uses: codecov/codecov-action@v3
with:
files: ./coverage.xml
fail_ci_if_error: true
Key Patterns:
Service Containers:
- Run databases and services during tests
- Use health checks to ensure availability
- Automatically cleaned up after the job
Dependency Caching:
- cache: 'pip' caches pip packages automatically
- Reduces installation time from minutes to seconds
- Cache invalidated when requirements change
Test Environment:
- Set DJANGO_SETTINGS_MODULE to test configuration
- Use environment variables for secrets (passed via env)
- Services available at localhost:<port>
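Assuming the CI settings module reads its database location from an environment variable (for example via django-environ — a project-setup assumption, not something this guide mandates), the test step can point it at the Postgres service container:
- name: Run tests
  env:
    DJANGO_SETTINGS_MODULE: myproject.settings.ci
    # Matches the postgres service above: default user/db, password from POSTGRES_PASSWORD
    DATABASE_URL: postgres://postgres:testpassword@localhost:5432/postgres
  run: pytest -v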
Matrix Testing¶
Test across multiple Python versions or dependencies:
jobs:
test:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ['3.11', '3.12', '3.13']
django-version: ['4.2', '5.0', '5.1']
exclude:
# Python 3.13 is only supported by Django 5.1.3+
- python-version: '3.13'
django-version: '4.2'
- python-version: '3.13'
django-version: '5.0'
fail-fast: false # Continue testing other combinations on failure
name: Test Python ${{ matrix.python-version }}, Django ${{ matrix.django-version }}
steps:
- uses: actions/checkout@v3
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
- name: Install Django ${{ matrix.django-version }}
run: |
pip install django~=${{ matrix.django-version }}
pip install -r requirements/requirements-dev.txt
- name: Run tests
run: pytest -v
Matrix Strategy:
- Tests every combination (3 Python × 3 Django = 9 jobs, minus excluded pairs)
- Use exclude to skip invalid combinations
- fail-fast: false ensures all combinations tested
- Each combination runs as separate job
When to Use Matrix Testing:
- Libraries that support multiple framework versions
- Applications that must run on different Python versions
- Testing platform-specific behavior (Linux, macOS, Windows)
When to Avoid:
- Simple applications with a single Python/Django version
- Matrix testing significantly increases CI time and resource usage
- Most Django apps only support the latest Python + Django versions
Linting and Code Quality¶
Run linting before tests to fail fast:
jobs:
lint:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: '3.13'
- name: Install pre-commit
run: pip install pre-commit
- name: Run pre-commit hooks
run: pre-commit run --all-files
Pre-commit Integration:
- Runs the same checks as local development
- Ensures code style consistency
- Includes ruff, black, mypy, etc.
- Fast feedback (typically 1-2 minutes)
Alternative: Separate Linting Steps:
- name: Run ruff
run: ruff check .
- name: Run black
run: black --check .
- name: Run mypy
run: mypy myproject/
- name: Check imports
run: isort --check-only .
Build Job Patterns¶
Docker Build and Push¶
Build container images and push to Amazon ECR:
jobs:
build:
runs-on: ubuntu-latest
needs: test # Only build if tests pass
permissions:
id-token: write # Required for OIDC
contents: read
steps:
- uses: actions/checkout@v3
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v4
with:
role-to-assume: arn:aws:iam::123456789012:role/GitHubActionsRole
aws-region: us-east-1
- name: Login to Amazon ECR
id: login-ecr
uses: aws-actions/amazon-ecr-login@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Build and push
uses: docker/build-push-action@v5
with:
context: .
file: ./docker/Dockerfile.app
push: true
tags: |
${{ steps.login-ecr.outputs.registry }}/myapp:${{ github.sha }}
${{ steps.login-ecr.outputs.registry }}/myapp:latest
cache-from: type=gha
cache-to: type=gha,mode=max
build-args: |
PYTHON_VERSION=3.13
Key Patterns:
OIDC Authentication (Recommended):
- No long-lived AWS credentials stored in GitHub
- Temporary credentials issued per workflow run
- Requires an IAM role configured for GitHub OIDC
- More secure than access key/secret key
Image Tagging:
- Tag with commit SHA for traceability: myapp:abc123
- Tag with latest for convenience: myapp:latest
- Optionally tag with branch: myapp:main
- Never use mutable tags for production deployments
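One way to generate SHA and branch tags consistently is docker/metadata-action, which the full pipeline example later in this guide also uses; a minimal sketch:
- name: Generate image tags
  id: meta
  uses: docker/metadata-action@v5
  with:
    images: ${{ steps.login-ecr.outputs.registry }}/myapp
    tags: |
      type=sha
      type=ref,event=branch
- name: Build and push
  uses: docker/build-push-action@v5
  with:
    context: .
    push: true
    tags: ${{ steps.meta.outputs.tags }}
Here type=sha yields a commit-pinned tag (sha-<short-sha> by default) and type=ref,event=branch adds the branch name, covering the traceability and branch-tag conventions above.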
Build Caching:
- cache-from: type=gha uses GitHub Actions cache
- cache-to: type=gha,mode=max writes cache for next run
- Dramatically speeds up builds (10+ minutes → 2 minutes)
- Cache automatically cleaned up by GitHub
Multi-Stage Build Example¶
Workflow for multi-stage Dockerfile:
- name: Build and push production image
uses: docker/build-push-action@v5
with:
context: .
file: ./docker/Dockerfile.app
target: production # Build only production stage
push: true
tags: ${{ steps.login-ecr.outputs.registry }}/myapp:${{ github.sha }}
cache-from: |
type=gha,scope=build-production
type=registry,ref=${{ steps.login-ecr.outputs.registry }}/myapp:latest
cache-to: type=gha,mode=max,scope=build-production
- name: Build test image (not pushed)
uses: docker/build-push-action@v5
with:
context: .
file: ./docker/Dockerfile.app
target: testing # Build only testing stage
push: false
load: true # Load into Docker for local use
tags: myapp-test:latest
cache-from: type=gha,scope=build-testing
cache-to: type=gha,mode=max,scope=build-testing
Multi-Stage Benefits:
- Production image excludes dev dependencies
- Testing stage includes pytest and coverage tools
- Shared base layer cached across stages
- Different optimization strategies per stage
AWS Credential Management with OIDC¶
OIDC Setup¶
OpenID Connect (OIDC) allows GitHub Actions to authenticate with AWS without storing credentials.
AWS IAM Configuration:
1. Create an OIDC Identity Provider:
- Provider URL: https://token.actions.githubusercontent.com
- Audience: sts.amazonaws.com
2. Create an IAM Role with the following trust policy:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"Federated": "arn:aws:iam::123456789012:oidc-provider/token.actions.githubusercontent.com"
},
"Action": "sts:AssumeRoleWithWebIdentity",
"Condition": {
"StringEquals": {
"token.actions.githubusercontent.com:aud": "sts.amazonaws.com"
},
"StringLike": {
"token.actions.githubusercontent.com:sub": "repo:yourorg/yourrepo:*"
}
}
}
]
}
3. Attach a Permissions Policy:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"ecr:GetAuthorizationToken",
"ecr:BatchCheckLayerAvailability",
"ecr:GetDownloadUrlForLayer",
"ecr:PutImage",
"ecr:InitiateLayerUpload",
"ecr:UploadLayerPart",
"ecr:CompleteLayerUpload"
],
"Resource": "*"
},
{
"Effect": "Allow",
"Action": [
"ecs:UpdateService",
"ecs:DescribeServices",
"ecs:DescribeTaskDefinition",
"ecs:RegisterTaskDefinition"
],
"Resource": "*"
},
{
"Effect": "Allow",
"Action": [
"codedeploy:CreateDeployment",
"codedeploy:GetDeployment",
"codedeploy:GetDeploymentConfig"
],
"Resource": "*"
}
]
}
GitHub Actions Configuration:
permissions:
id-token: write # Required for OIDC
contents: read # Required to checkout code
steps:
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v4
with:
role-to-assume: arn:aws:iam::123456789012:role/GitHubActionsRole
aws-region: us-east-1
role-session-name: GitHubActions-${{ github.run_id }}
Security Benefits:
- No permanent credentials stored in GitHub
- Credentials expire after the workflow completes
- Granular permissions per repository
- AWS CloudTrail logs all actions
- Easy to rotate (no secrets to update)
Alternative: Access Key Credentials¶
For simpler setups or legacy systems:
steps:
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v4
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: us-east-1
Store credentials in GitHub Secrets:
- Go to repository Settings → Secrets and variables → Actions
- Add AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
- Never commit credentials to code
- Rotate regularly (every 90 days)
OIDC Preferred
Use OIDC whenever possible. Access keys should only be used when OIDC is not available (e.g., AWS China regions, some third-party services).
Caching Strategies¶
Dependency Caching¶
Python Package Caching:
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: '3.13'
cache: 'pip' # Automatic pip caching
- name: Install dependencies
run: pip install -r requirements.txt
Cache Key Logic:
- Automatically keys on the requirements file hash
- Invalidates when requirements change
- Shared across workflow runs
- Restored in seconds versus minutes to install
Manual Caching:
- name: Cache Python dependencies
uses: actions/cache@v3
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('requirements/*.txt') }}
restore-keys: |
${{ runner.os }}-pip-
Docker Layer Caching¶
GitHub Actions Cache Backend:
- name: Build with cache
uses: docker/build-push-action@v5
with:
context: .
cache-from: type=gha
cache-to: type=gha,mode=max
How It Works:
- mode=max: Caches all layers (slower push, faster builds)
- mode=min: Caches only final layers (faster push, slower builds)
- Cache scope isolated per workflow
- 10 GB cache limit per repository
Registry Cache Backend:
- name: Build with registry cache
uses: docker/build-push-action@v5
with:
context: .
cache-from: |
type=registry,ref=myregistry/myapp:buildcache
type=registry,ref=myregistry/myapp:latest
cache-to: type=registry,ref=myregistry/myapp:buildcache,mode=max
Registry Cache Benefits:
- Shared across workflows and repositories
- No 10 GB limit
- Persists indefinitely
- Useful for self-hosted runners
Pre-commit Hook Caching¶
- name: Cache pre-commit hooks
uses: actions/cache@v3
with:
path: ~/.cache/pre-commit
key: pre-commit-${{ hashFiles('.pre-commit-config.yaml') }}
- name: Run pre-commit
run: pre-commit run --all-files
Cache Invalidation:
- Changes to .pre-commit-config.yaml invalidate cache
- Pre-commit environments reinstalled when config changes
- Reduces pre-commit runtime from 2-3 minutes to 30 seconds
Complete Workflow Example¶
Full CI/CD Pipeline¶
name: Django CI/CD Pipeline
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
workflow_dispatch:
env:
AWS_REGION: us-east-1
ECR_REGISTRY: 123456789012.dkr.ecr.us-east-1.amazonaws.com
ECR_REPOSITORY: myapp
jobs:
lint:
name: Code Quality Checks
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: '3.13'
cache: 'pip'
- name: Install pre-commit
run: pip install pre-commit
- name: Cache pre-commit
uses: actions/cache@v3
with:
path: ~/.cache/pre-commit
key: pre-commit-${{ hashFiles('.pre-commit-config.yaml') }}
- name: Run pre-commit
run: pre-commit run --all-files
test:
name: Run Tests
runs-on: ubuntu-latest
needs: lint
env:
DJANGO_SETTINGS_MODULE: myproject.settings.ci
services:
postgres:
image: postgres:15
env:
POSTGRES_PASSWORD: testpassword
POSTGRES_DB: testdb
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432
redis:
image: redis:7
options: >-
--health-cmd "redis-cli ping"
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 6379:6379
steps:
- uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: '3.13'
cache: 'pip'
- name: Install system dependencies
run: |
sudo apt-get update
sudo apt-get install -y \
libpq-dev \
python3-dev \
build-essential
- name: Install Python dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements/requirements-dev.txt
- name: Run Django checks
run: python manage.py check
- name: Run migrations
run: python manage.py migrate
- name: Run tests with coverage
run: |
pytest \
--cov=myproject \
--cov-report=xml \
--cov-report=term-missing \
--verbose \
--exitfirst
- name: Upload coverage to Codecov
uses: codecov/codecov-action@v3
with:
files: ./coverage.xml
fail_ci_if_error: true
build:
name: Build and Push Docker Image
runs-on: ubuntu-latest
needs: test
if: github.ref == 'refs/heads/master' && (github.event_name == 'push' || github.event_name == 'workflow_dispatch')
permissions:
id-token: write
contents: read
outputs:
image-tag: ${{ steps.meta.outputs.tags }}
steps:
- uses: actions/checkout@v3
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v4
with:
role-to-assume: arn:aws:iam::123456789012:role/GitHubActionsRole
aws-region: ${{ env.AWS_REGION }}
- name: Login to Amazon ECR
id: login-ecr
uses: aws-actions/amazon-ecr-login@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Docker meta
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ env.ECR_REGISTRY }}/${{ env.ECR_REPOSITORY }}
tags: |
type=sha,prefix={{branch}}-
type=ref,event=branch
type=semver,pattern={{version}}
- name: Build and push
uses: docker/build-push-action@v5
with:
context: .
file: ./docker/Dockerfile.app
target: production
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha,scope=build-production
cache-to: type=gha,mode=max,scope=build-production
build-args: |
PYTHON_VERSION=3.13
BUILD_DATE=${{ github.event.head_commit.timestamp }}
VCS_REF=${{ github.sha }}
deploy-staging:
name: Deploy to Staging
runs-on: ubuntu-latest
needs: build
if: github.ref == 'refs/heads/master' && (github.event_name == 'push' || github.event_name == 'workflow_dispatch')
environment:
name: staging
url: https://staging.example.com
permissions:
id-token: write
contents: read
steps:
- uses: actions/checkout@v3
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v4
with:
role-to-assume: arn:aws:iam::123456789012:role/GitHubActionsRole
aws-region: ${{ env.AWS_REGION }}
- name: Update ECS service
run: |
aws ecs update-service \
--cluster staging-cluster \
--service myapp-service \
--force-new-deployment \
--no-cli-pager
- name: Wait for deployment
run: |
aws ecs wait services-stable \
--cluster staging-cluster \
--services myapp-service
deploy-production:
name: Deploy to Production
runs-on: ubuntu-latest
needs: deploy-staging
if: github.event_name == 'workflow_dispatch'
environment:
name: production
url: https://example.com
permissions:
id-token: write
contents: read
steps:
- uses: actions/checkout@v3
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v4
with:
role-to-assume: arn:aws:iam::123456789012:role/GitHubActionsRole
aws-region: ${{ env.AWS_REGION }}
- name: Create CodeDeploy deployment
run: |
aws deploy create-deployment \
--application-name myapp \
--deployment-group-name production \
--deployment-config-name CodeDeployDefault.ECSAllAtOnce \
--description "Deployment from GitHub Actions - ${{ github.sha }}" \
--no-cli-pager
Workflow Characteristics:
Job Dependencies:
- lint runs first (fastest feedback)
- test runs after lint passes
- build only on master branch after tests pass
- deploy-staging automatic on master
- deploy-production requires manual trigger
Environment Protection:
- Staging deploys automatically
- Production requires approval
- URLs linked in the GitHub UI
- Deployment history tracked
Workflow Optimization¶
Parallel Execution¶
Run independent jobs in parallel:
jobs:
lint:
runs-on: ubuntu-latest
steps: [...]
test-unit:
runs-on: ubuntu-latest
steps: [...]
test-integration:
runs-on: ubuntu-latest
steps: [...]
security-scan:
runs-on: ubuntu-latest
steps: [...]
# All above jobs run in parallel
build:
runs-on: ubuntu-latest
needs: [lint, test-unit, test-integration, security-scan]
steps: [...]
Total runtime = longest job, not sum of all jobs
Conditional Execution¶
Skip unnecessary steps:
- name: Deploy to production
if: github.ref == 'refs/heads/master' && github.event_name == 'push'
run: ./deploy.sh production
- name: Comment on PR
if: github.event_name == 'pull_request'
uses: actions/github-script@v6
with:
script: |
github.rest.issues.createComment({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
body: 'Tests passed! ✅'
})
Workflow Concurrency¶
Prevent multiple deployments simultaneously:
concurrency:
group: deploy-${{ github.ref }}
cancel-in-progress: false # Queue deployments instead of canceling
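For CI runs on pull requests, the opposite setting is usually preferable so that a newer push cancels the now-outdated run instead of queuing behind it:
concurrency:
  group: ci-${{ github.ref }}
  cancel-in-progress: true   # cancel superseded runs for the same branch or PR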
Secrets and Environment Variables¶
GitHub Secrets¶
Store sensitive data in GitHub Secrets:
Repository Secrets:
- Settings → Secrets and variables → Actions
- Available to all workflows in the repository
- Masked in logs automatically
Environment Secrets:
- Settings → Environments → (environment name) → Secrets
- Only available when deploying to that environment
- Enables different credentials per environment
Organization Secrets:
- Organization Settings → Secrets → Actions
- Shared across all repositories in the organization
- Useful for common credentials (NPM tokens, Docker registry)
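As a sketch of environment secrets in practice (the secret name here is hypothetical), a job that declares environment: staging can read secrets defined only on that environment:
deploy-staging:
  runs-on: ubuntu-latest
  environment: staging
  steps:
    - name: Run deployment
      run: ./deploy.sh staging
      env:
        DB_PASSWORD: ${{ secrets.STAGING_DB_PASSWORD }}   # defined under Environments → staging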
Using Secrets in Workflows¶
steps:
- name: Use secret
run: |
echo "Secret value is hidden in logs"
curl -H "Authorization: Bearer ${{ secrets.API_TOKEN }}" https://api.example.com
env:
API_TOKEN: ${{ secrets.API_TOKEN }}
Security Best Practices:
- Never echo secrets directly
- Pass secrets via environment variables rather than inline interpolation
- Secrets are masked in logs (shown as ***)
- Audit secret access in organization settings
Environment Variables¶
Environment variables can be set at three scopes; narrower scopes override broader ones, as the sketch below shows:
- Workflow-level: available to every job in the workflow
- Job-level: available to every step in that job
- Step-level: available only to that step
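A minimal sketch showing all three scopes (the values are placeholders):
env:                                # workflow-level: visible to every job
  AWS_REGION: us-east-1
jobs:
  test:
    runs-on: ubuntu-latest
    env:                            # job-level: visible to every step in this job
      DJANGO_SETTINGS_MODULE: myproject.settings.ci
    steps:
      - name: Run tests
        env:                        # step-level: overrides job/workflow values for this step only
          PYTEST_ADDOPTS: -v
        run: pytest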
Troubleshooting Workflows¶
Debugging Failed Workflows¶
Enable Debug Logging:
Set repository secrets:
- ACTIONS_STEP_DEBUG: true (detailed step logs)
- ACTIONS_RUNNER_DEBUG: true (runner diagnostic logs)
Re-run with Debug Logging:
- Go to the failed workflow run
- Click "Re-run jobs" → "Re-run all jobs"
- Select "Enable debug logging"
Common Failure Patterns:
Tests pass locally, fail in CI:
- Check service container configuration
- Verify environment variables are set correctly
- Ensure database migrations run before tests
- Check timezone/locale differences
Docker build fails:
- Check Dockerfile syntax
- Verify the base image is available
- Ensure the build context includes necessary files
- Check for file permission issues
AWS authentication fails:
- Verify the OIDC role ARN is correct
- Check the trust policy allows GitHub Actions
- Ensure the permissions policy includes required actions
- Verify the region matches between role and workflow
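When OIDC authentication misbehaves, a quick sanity check is to print which identity the workflow actually assumed, immediately after the credentials step (a temporary debugging step to remove once resolved):
- name: Show assumed identity (debugging aid)
  run: aws sts get-caller-identity --no-cli-pager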
Workflow Testing¶
Test Locally with act:
# Install act (GitHub Actions local runner)
brew install act # macOS
# or
curl https://raw.githubusercontent.com/nektos/act/master/install.sh | sudo bash
# Run workflow locally
act push
# Run specific job
act -j test
# Use specific event
act pull_request
Limitations of Local Testing:
- Cannot test OIDC authentication
- Some actions don't work locally
- Service containers may behave differently
- Still useful for syntax checking and basic testing
Next Steps¶
After implementing GitHub Actions workflows:
- Container Building: Deep dive into Docker optimization and multi-stage builds
- ECS Deployment: Learn AWS ECS, Fargate, and CodeDeploy integration
- Monitoring: Set up CloudWatch alarms and deployment monitoring
- Secrets Management: Implement secure credential handling
Start Simple, Iterate
Begin with basic CI (lint + test), then add container builds, then deployments. Test each addition thoroughly before moving to the next step.
Related Resources¶
Internal Documentation:
- CI/CD Overview: Pipeline philosophy and architecture
- Environment Variables: Managing configuration
- AWS SSM Parameters: Storing secrets in AWS
- Docker Development: Container development patterns
External Resources:
- GitHub Actions Documentation
- GitHub Actions Marketplace
- Workflow Syntax Reference
- AWS Actions
- Docker Build Push Action