Database Migrations
This guide covers creating, reviewing, and managing database schema changes using Alembic.
What is Alembic?
Alembic is a lightweight database migration tool for SQLAlchemy. It tracks schema changes and applies them to different environments.
Key concepts:

- Migration — A Python file describing schema changes (up and down)
- Revision — A unique identifier for each migration
- Head — The latest migration version
- Base — The initial empty schema
Viewing Migration History
Check your migration history:
alembic history
# 1234567890ab -> 2345678901bc (head), Added patients table
# 0123456789ab -> 1234567890ab, Create users table
# <base> -> 0123456789ab, Initial migration
Check current revision:
alembic current
# 2345678901bc (head)
Check pending migrations:
alembic upgrade --sql head
# Shows SQL for all pending migrations
Creating a New Migration
There are two approaches to creating migrations: autogenerate and manual.
Approach 1: Autogenerate (Recommended)
First, update your SQLAlchemy models in app/models/:
# app/models/patient.py
from sqlalchemy import Column, String, Integer, DateTime, Boolean
from app.database import Base
import datetime
class Patient(Base):
__tablename__ = "patients"
id = Column(Integer, primary_key=True)
email = Column(String, unique=True, nullable=False, index=True)
first_name = Column(String, nullable=False)
last_name = Column(String, nullable=False)
date_of_birth = Column(DateTime)
is_active = Column(Boolean, default=True)
created_at = Column(DateTime, default=datetime.datetime.utcnow, nullable=False)
updated_at = Column(DateTime, default=datetime.datetime.utcnow, onupdate=datetime.datetime.utcnow)
Then autogenerate a migration:
alembic revision --autogenerate -m "Add patients table"
# Generating /app/alembic/versions/1234567890ab_add_patients_table.py ... done
Alembic compares your models to the database schema and generates the migration.
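Autogenerate's comparison can be tuned in `alembic/env.py` via an `include_object` hook passed to `context.configure()`, which is useful when some tables are managed outside Alembic. A minimal sketch; the table names in `EXCLUDED_TABLES` are hypothetical examples, not project tables:

```python
# Illustrative include_object hook for alembic/env.py.
# EXCLUDED_TABLES holds hypothetical tables managed outside Alembic
# (e.g. created by another service or a database extension).
EXCLUDED_TABLES = {"celery_taskmeta", "spatial_ref_sys"}

def include_object(obj, name, type_, reflected, compare_to):
    """Tell autogenerate which schema objects to diff.

    Alembic calls this once per table/column/index when include_object=
    is passed to context.configure(); returning False makes autogenerate
    ignore the object entirely.
    """
    if type_ == "table" and name in EXCLUDED_TABLES:
        return False
    return True
```

Pass it as `context.configure(..., include_object=include_object)` so excluded tables never show up as spurious drops in generated migrations.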
Approach 2: Manual Migration
For complex changes, create an empty migration:
alembic revision -m "Rename users to accounts"
# Generating /app/alembic/versions/1234567890ab_rename_users_to_accounts.py ... done
Edit the generated file:
# alembic/versions/1234567890ab_rename_users_to_accounts.py
from alembic import op
import sqlalchemy as sa
revision = '1234567890ab'
down_revision = '0123456789ab'
branch_labels = None
depends_on = None
def upgrade():
# Rename table
op.rename_table('users', 'accounts')
def downgrade():
# Rename back
op.rename_table('accounts', 'users')
Reviewing a Migration
Always review the generated migration file before applying it:
cat alembic/versions/1234567890ab_add_patients_table.py
Check the generated SQL:
alembic upgrade --sql head | head -50
Look for:

- ✅ Correct table names
- ✅ Correct column names and types
- ✅ Appropriate constraints and indexes
- ✅ Foreign keys are correctly defined
- ✅ Downgrade is reversible
Example Migration File
# alembic/versions/1234567890ab_add_patients_table.py
from alembic import op
import sqlalchemy as sa
revision = '1234567890ab'
down_revision = '0123456789ab'
branch_labels = None
depends_on = None
def upgrade():
# Create new table
op.create_table(
'patients',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('email', sa.String(), nullable=False),
sa.Column('first_name', sa.String(), nullable=False),
sa.Column('last_name', sa.String(), nullable=False),
sa.Column('date_of_birth', sa.DateTime()),
sa.Column('is_active', sa.Boolean(), server_default='true'),
sa.Column('created_at', sa.DateTime(), nullable=False),
sa.Column('updated_at', sa.DateTime(), nullable=False),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('email')
)
# Create index for faster lookups
op.create_index('ix_patients_email', 'patients', ['email'])
def downgrade():
# Drop index first
op.drop_index('ix_patients_email', table_name='patients')
# Drop table
op.drop_table('patients')
Running Migrations Locally
Apply all pending migrations:
alembic upgrade head
# INFO [alembic.runtime.migration] Running upgrade 1234567890ab -> 2345678901bc, Added patients table
# INFO [alembic.runtime.migration] Running upgrade 2345678901bc -> 3456789012cd, Add patient status column
Verify the upgrade:
alembic current
# 3456789012cd (head)
Check tables in your local database:
psql -h localhost -U menotime -d menotime -c "\dt"
#                List of relations
#  Schema |      Name       | Type  |  Owner
# --------+-----------------+-------+----------
#  public | alembic_version | table | menotime
#  public | patients        | table | menotime
#  public | users           | table | menotime
Testing Migrations
Before committing, test both upgrade and downgrade:
# Test upgrade
alembic upgrade head
# Test downgrade (go back one version)
alembic downgrade -1
# Test upgrade again
alembic upgrade head
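The three-step cycle above can be scripted so it runs the same way locally and in CI. A sketch assuming the `alembic` CLI is on `PATH` and `alembic.ini` is in the working directory; the `roundtrip` helper is not part of Alembic:

```python
# Sketch: script the upgrade/downgrade/upgrade cycle for CI use.
import subprocess

# The three commands from the manual test above, as data so they can
# be inspected or reused.
ROUNDTRIP = [
    ["alembic", "upgrade", "head"],    # apply all pending migrations
    ["alembic", "downgrade", "-1"],    # step back one revision
    ["alembic", "upgrade", "head"],    # re-apply to prove the cycle is clean
]

def roundtrip(runner=subprocess.run):
    """Run the cycle, stopping at the first failing command."""
    for cmd in ROUNDTRIP:
        runner(cmd, check=True)
```

With `check=True`, a non-zero exit from any step raises `CalledProcessError`, so a broken `downgrade()` fails the pipeline instead of passing silently.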
Create a test to verify schema changes:
# tests/test_migrations.py
import pytest
from sqlalchemy import inspect
from app.database import engine
def test_patients_table_exists():
"""Verify patients table was created with correct columns."""
inspector = inspect(engine)
tables = inspector.get_table_names()
assert 'patients' in tables
columns = inspector.get_columns('patients')
column_names = [col['name'] for col in columns]
assert 'id' in column_names
assert 'email' in column_names
assert 'first_name' in column_names
assert 'is_active' in column_names
def test_email_unique_constraint():
"""Verify email column has unique constraint."""
inspector = inspect(engine)
constraints = inspector.get_unique_constraints('patients')
constraint_columns = [c['column_names'] for c in constraints]
assert ['email'] in constraint_columns
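The index created by the example migration can be checked with the same inspector API via `get_indexes`. A self-contained sketch using an in-memory SQLite stand-in so it runs anywhere; in the real suite you would reuse `engine` from `app.database` as in the tests above:

```python
from sqlalchemy import create_engine, inspect, text

# Stand-in engine with the relevant schema; swap in app.database.engine
# in the real test suite.
engine = create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE patients (id INTEGER PRIMARY KEY, email TEXT)"))
    conn.execute(text("CREATE UNIQUE INDEX ix_patients_email ON patients (email)"))

def test_email_index_exists():
    """Verify the index from the example migration is present."""
    inspector = inspect(engine)
    index_names = [ix["name"] for ix in inspector.get_indexes("patients")]
    assert "ix_patients_email" in index_names

test_email_index_exists()
```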
Run tests to verify:
pytest tests/test_migrations.py -v
Running Migrations in Staging/Production
Via GitHub Actions (Automated)
Migrations run automatically during deployment:
- New image is built with migration files
- ECS task runs migrations before starting the API
- Uses a special migration container command
- Only proceeds if migrations succeed
Via ECS Exec (Manual)
If you need to run migrations manually:
# Get a running task ID
TASK_ID=$(aws ecs list-tasks \
--cluster staging \
--service-name menotime-api \
--region us-west-1 \
--query 'taskArns[0]' \
--output text | awk -F'/' '{print $NF}')
# Exec into the container
aws ecs execute-command \
--cluster staging \
--task $TASK_ID \
--container menotime-api \
--interactive \
--command "/bin/bash" \
--region us-west-1
# Inside the container
alembic current
alembic upgrade head
alembic current
exit
Via Bastion Host
If ECS exec is unavailable:
# SSH into bastion
ssh ec2-user@bastion.menotime-app.com
# Connect to RDS database
psql -h menotime-staging.xxxxx.us-west-1.rds.amazonaws.com \
-U menotime_admin \
-d menotime
# View schema
\dt
\d patients
Rolling Back a Migration
Downgrade One Version
alembic downgrade -1
# INFO [alembic.runtime.migration] Running downgrade 3456789012cd -> 2345678901bc, Add patient status column
Downgrade to Specific Revision
alembic downgrade 2345678901bc
# Downgrades all versions after 2345678901bc
Downgrade All Migrations
alembic downgrade base
# Removes all tables (use with caution!)
Verify the downgrade:
alembic current
Common Migration Patterns
Adding a New Column
def upgrade():
op.add_column('patients', sa.Column('phone_number', sa.String(), nullable=True))
def downgrade():
op.drop_column('patients', 'phone_number')
Making a Column Not Nullable
def upgrade():
# First, set default for existing rows
op.execute("UPDATE patients SET phone_number = '' WHERE phone_number IS NULL")
# Then, add the NOT NULL constraint
op.alter_column('patients', 'phone_number', existing_type=sa.String(), nullable=False)
def downgrade():
op.alter_column('patients', 'phone_number', existing_type=sa.String(), nullable=True)
Creating an Index
def upgrade():
op.create_index('ix_patients_email', 'patients', ['email'], unique=True)
def downgrade():
op.drop_index('ix_patients_email', table_name='patients')
Adding a Foreign Key
def upgrade():
op.create_table(
'patient_symptoms',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('patient_id', sa.Integer(), nullable=False),
sa.Column('symptom_name', sa.String(), nullable=False),
sa.ForeignKeyConstraint(['patient_id'], ['patients.id']),
sa.PrimaryKeyConstraint('id')
)
def downgrade():
op.drop_table('patient_symptoms')
Renaming a Column
def upgrade():
op.alter_column('patients', 'phone', new_column_name='phone_number')
def downgrade():
op.alter_column('patients', 'phone_number', new_column_name='phone')
Common Pitfalls
❌ Autogenerate Missing Changes
Autogenerate doesn't detect:

- Constraint changes
- Column comment changes
- Some advanced types
Solution: Review the generated migration and add missing changes manually.
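For type and server-default changes specifically, autogenerate can be told to compare them: `compare_type` and `compare_server_default` are standard flags to `context.configure()`. (In older Alembic releases type comparison is off by default; server-default comparison is off by default in all releases.) Sketch of the relevant excerpt of `alembic/env.py`, with the surrounding connection setup omitted:

```python
# alembic/env.py (excerpt): opt in to comparisons autogenerate may skip
context.configure(
    connection=connection,
    target_metadata=target_metadata,
    compare_type=True,            # detect column type changes
    compare_server_default=True,  # detect server_default changes
)
```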
❌ Modifying Production Data
Don't modify data in migrations unless explicitly necessary:
# ❌ DON'T DO THIS
def upgrade():
op.execute("UPDATE patients SET status = 'active'")
# ✅ DO THIS INSTEAD
# Handle data changes in the application code
❌ Complex Migrations Without Testing
Always test complex migrations:
# Create a backup before testing
pg_dump menotime > backup.sql
# Test migration
alembic upgrade head
# If it fails, restore
psql menotime < backup.sql
❌ Mixing Schema and Data Changes
Separate schema changes from data transformations:
# ❌ BAD
def upgrade():
op.add_column('patients', sa.Column('status', sa.String()))
op.execute("UPDATE patients SET status = 'active'")
# ✅ GOOD
# Alembic migration (schema only):
def upgrade():
op.add_column('patients', sa.Column('status', sa.String(), server_default='active'))
# Application code handles any data migration needed
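What "handle it in the application" can look like: a one-off backfill that updates rows in small batches, so no single transaction holds a long lock on the table. The `backfill_status` helper is a sketch, not project code; it assumes the `patients.status` column from the example above:

```python
from sqlalchemy import text

def backfill_status(engine, batch_size=1000):
    """Set status='active' on rows that predate the column, batch by batch.

    Each iteration runs in its own short transaction, so the table is
    never locked for long and the job can safely be re-run if interrupted.
    """
    stmt = text(
        "UPDATE patients SET status = 'active' "
        "WHERE id IN (SELECT id FROM patients WHERE status IS NULL LIMIT :n)"
    )
    while True:
        with engine.begin() as conn:
            updated = conn.execute(stmt, {"n": batch_size}).rowcount
        if updated == 0:
            break
```

Run it once at deploy time (or from a management command); because it only touches `NULL` rows, re-running it is a no-op.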
❌ Non-Reversible Migrations
Always implement both upgrade() and downgrade():
# ❌ BAD
def upgrade():
op.drop_column('patients', 'legacy_field')
def downgrade():
pass # Can't restore dropped column!
# ✅ GOOD
def upgrade():
op.drop_column('patients', 'legacy_field')
def downgrade():
    op.add_column('patients', sa.Column('legacy_field', sa.String()))  # restores the column, not its data
Debugging Migration Issues
See the SQL that will be executed
alembic upgrade --sql head
Show detailed info about the current revision
alembic current --verbose
Check migration status in the database
psql -h localhost -U menotime -d menotime -c "SELECT version_num FROM alembic_version;"
Reset migrations (local only!)
# ⚠️ WARNING: Only for local development!
alembic downgrade base
alembic upgrade head
Best Practices
- Write migrations before code — Plan schema changes
- Test thoroughly — Test both up and down
- Keep migrations small — One logical change per migration
- Review before merging — Check migrations in PR
- Document complex changes — Add comments explaining why
- Never modify applied migrations — Create new migrations instead
- Always write downgrade — Never leave downgrade() empty
- Test data migrations — Verify data is preserved correctly