makemigrations

A Go-first database migration tool with a Django-style workflow. Define your schema in YAML, generate type-safe Go migration files, and run them in-process — no Go toolchain required at runtime, no compiled binary to ship.

How migrations run. Generated migration files are real .go source — your IDE, gopls, and go vet treat them as ordinary Go. At runtime makemigrations migrate does not invoke go build. Instead it loads each .go file with yaegi, an embedded Go interpreter, and runs the migrations in the makemigrations process. The language inside migration files is "Go-like" — yaegi implements the Go spec but is not the official gc compiler, so a few features (cgo, deep reflection, some generics edge cases) work differently. For migrations generated by makemigrations makemigrations this is invisible; for hand-edited migrations that import third-party packages, see Extending the yaegi Symbol Map.

✨ Why Go Migrations?

  • 🔒 Type-safe at edit time: Migrations are real Go files, so mistakes are caught by your IDE and go vet before they ever run
  • ⚡ No build step at runtime: yaegi interprets the .go files directly; no go build, no temporary binary, no GOWORK juggling
  • 🗄️ Database-agnostic schema: Write YAML once, deploy to PostgreSQL, MySQL, SQLite, or SQL Server
  • 🔀 DAG-based ordering: Migrations form a dependency graph so parallel branches merge cleanly
  • 🔄 Auto change detection: Diff YAML schemas, generate only what changed
  • ⚠️ Safe destructive ops: Field removals, table drops, and renames require explicit review
  • 🛠 Optional standalone binary: The generated migrations/ directory is still a buildable Go module, so you can go build it for IDE type-checking or as an escape hatch

🚀 Quick Start

1. Install

go install github.com/ocomsoft/makemigrations@latest

2. Initialise your project

cd your-project
makemigrations init

This creates:

your-project/
└── migrations/
    ├── main.go     ← optional fallback entry point (`go build` still works)
    └── go.mod      ← dedicated migrations module (used by your IDE / gopls)

3. Define your schema

schema/schema.yaml:

defaults:
  postgresql:
    new_uuid: gen_random_uuid()
  mysql:
    new_uuid: uuid()
  sqlite:
    new_uuid: (lower(hex(randomblob(16))))

tables:
  - name: users
    fields:
      - name: id
        type: uuid
        primary_key: true
        default: new_uuid
      - name: email
        type: varchar
        length: 255
        nullable: false
      - name: created_at
        type: timestamp
        auto_create: true

  - name: posts
    fields:
      - name: id
        type: uuid
        primary_key: true
        default: new_uuid
      - name: title
        type: varchar
        length: 200
        nullable: false
      - name: user_id
        type: foreign_key
        foreign_key:
          table: users
          on_delete: CASCADE

The defaults section maps symbolic names (like new_uuid) to database-specific SQL expressions. Fields reference them by name — makemigrations resolves the correct expression for each target database at migration time.
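As an illustration (this is a sketch of the idea, not makemigrations' actual code), the resolution described above amounts to a per-database lookup table from symbolic names to SQL expressions:

```go
package main

import "fmt"

// defaults mirrors the YAML defaults section above: one expression
// per symbolic name, per target database.
var defaults = map[string]map[string]string{
	"postgresql": {"new_uuid": "gen_random_uuid()"},
	"mysql":      {"new_uuid": "uuid()"},
	"sqlite":     {"new_uuid": "(lower(hex(randomblob(16))))"},
}

// resolveDefault returns the database-specific expression for a symbolic
// default, falling back to the literal value when no mapping exists.
func resolveDefault(dbType, name string) string {
	if m, ok := defaults[dbType]; ok {
		if expr, ok := m[name]; ok {
			return expr
		}
	}
	return name
}

func main() {
	fmt.Println(resolveDefault("mysql", "new_uuid")) // uuid()
}
```

This is why the field definitions themselves stay database-agnostic: only the defaults table changes per target.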

4. Generate your first migration

makemigrations makemigrations --name "initial"
# Creates: migrations/0001_initial.go

5. Apply to your database

export DATABASE_URL="postgresql://user:pass@localhost/mydb"
makemigrations migrate up

makemigrations migrate interprets the migration files in-process using yaegi and runs the embedded migration App. No go build, no temporary binary.


🔄 Day-to-Day Workflow

# 1. Edit your YAML schema
vim schema/schema.yaml

# 2. Preview what will be generated
makemigrations makemigrations --dry-run

# 3. Generate the migration
makemigrations makemigrations --name "add user preferences"
# Creates: migrations/0004_add_user_preferences.go

# 4. Review the SQL before applying
makemigrations migrate showsql

# 5. Apply
makemigrations migrate up

# 6. Verify
makemigrations migrate status

📋 migrate Subcommands

makemigrations migrate interprets the migration files in-process via yaegi and runs the embedded App. All remaining arguments are forwarded to it:

makemigrations migrate up                       # apply all pending
makemigrations migrate up --to 0003_add_index   # apply up to a specific migration
makemigrations migrate down                     # roll back one
makemigrations migrate down --steps 3           # roll back multiple
makemigrations migrate status                   # show applied / pending
makemigrations migrate showsql                  # print SQL without running it
makemigrations migrate fake 0001_initial        # mark applied without running SQL
makemigrations migrate dag                      # show migration dependency graph

🏗️ Project Structure

your-project/
├── schema/
│   └── schema.yaml              ← your YAML schema definition
├── migrations/
│   ├── main.go                  ← binary entry point (generated by init)
│   ├── go.mod                   ← dedicated module (generated by init)
│   ├── 0001_initial.go          ← migration files (generated by makemigrations)
│   ├── 0002_add_posts.go
│   └── 0003_add_index.go
├── go.mod
└── main.go

🗄️ Database Support

Database         Status            Notes
PostgreSQL       ✅ Full            UUID, JSONB, arrays, advanced types
MySQL            ✅ Supported       JSON, AUTO_INCREMENT, InnoDB
SQLite           ✅ Supported       Simplified types, basic constraints
SQL Server       ✅ Supported       UNIQUEIDENTIFIER, NVARCHAR, BIT
Amazon Redshift  ✅ Provider ready  SUPER JSON, IDENTITY sequences
ClickHouse       ✅ Provider ready  MergeTree engine, Nullable types
TiDB             ✅ Provider ready  MySQL-compatible, distributed
Vertica          ✅ Provider ready  Columnar analytics
YDB (Yandex)     ✅ Provider ready  Optional, native JSON
Turso            ✅ Provider ready  Edge SQLite
StarRocks        ✅ Provider ready  MPP analytics, OLAP
Aurora DSQL      ✅ Provider ready  AWS serverless, PostgreSQL-compatible

PostgreSQL has been tested against real database instances. All other providers have comprehensive unit tests but may need additional validation before production use.


⚠️ Destructive Operations

When a field removal, table drop, or rename is detected, makemigrations prompts before generating:

⚠  Destructive operation detected: field_removed on "users" (field: "legacy_col")
  1) Generate  — include SQL in migration
  2) Review    — include with // REVIEW comment
  3) Omit      — skip SQL; schema state still advances (SchemaOnly)
  4) Exit      — cancel migration generation
  5) All       — generate all remaining destructive ops without prompting
Choice [1-5]:

🔀 Branch & Merge

When two developers generate migrations concurrently the DAG develops branches:

0001_initial
├── 0002_add_messaging   (developer A)
└── 0003_add_payments    (developer B)

Resolve with a merge migration:

makemigrations makemigrations --merge
# Creates: migrations/0004_merge_0002_add_messaging_and_0003_add_payments.go
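A merge migration works because apply order comes from the dependency graph, not the numeric file prefix. As a hypothetical sketch (not the tool's actual implementation), a topological sort over migrations that each declare their parents yields one valid order even when branches were created in parallel:

```go
package main

import (
	"fmt"
	"sort"
)

// topoOrder returns one valid apply order for migrations keyed by name,
// where deps[name] lists that migration's parents. A real tool would also
// detect cycles; this sketch assumes the graph is a DAG.
func topoOrder(deps map[string][]string) []string {
	var order []string
	visited := map[string]bool{}
	var visit func(n string)
	visit = func(n string) {
		if visited[n] {
			return
		}
		visited[n] = true
		for _, parent := range deps[n] {
			visit(parent) // parents always apply before children
		}
		order = append(order, n)
	}
	names := make([]string, 0, len(deps))
	for n := range deps {
		names = append(names, n)
	}
	sort.Strings(names) // deterministic tie-breaking between parallel branches
	for _, n := range names {
		visit(n)
	}
	return order
}

func main() {
	deps := map[string][]string{
		"0001_initial":       {},
		"0002_add_messaging": {"0001_initial"},
		"0003_add_payments":  {"0001_initial"},
		"0004_merge":         {"0002_add_messaging", "0003_add_payments"},
	}
	fmt.Println(topoOrder(deps))
	// [0001_initial 0002_add_messaging 0003_add_payments 0004_merge]
}
```

The merge migration simply declares both branch heads as parents, so it always sorts after them.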

⬆️ Upgrading from Goose SQL migrations

If the schema is already applied to your database, fake the historical migrations:

makemigrations migrate fake 0001_initial
makemigrations migrate fake 0002_add_phone
makemigrations migrate status

Or use the dedicated conversion command for more control:

makemigrations migrate-to-go --dir migrations/

⚙️ Configuration

Database connection

makemigrations migrate reads connection details from DATABASE_URL and DB_TYPE:

export DATABASE_URL="postgresql://user:pass@localhost/mydb"
export DB_TYPE=postgresql   # optional, defaults to postgresql

If you prefer the optional fallback path of compiling migrations/ into a standalone binary, edit migrations/main.go to read additional vars (DB_HOST, DB_PORT, DB_USER, …) — see the Manual Build Guide.

Configuration file

migrations/makemigrations.config.yaml:

database:
  type: postgresql

migration:
  directory: migrations
  include_down_sql: true

output:
  verbose: false

See the Configuration Guide for complete options.


📖 Documentation

Guides

Command Reference

Command        Description
init           Bootstrap the migrations/ directory
makemigrations Generate .go migration files from the YAML schema
migrate        Run migrations in-process via the yaegi interpreter
migrate-to-go  Convert existing Goose SQL migrations to Go
struct2schema  Generate YAML schemas from Go structs
dump_sql       Preview generated SQL from schemas
db2schema      Reverse-engineer a schema from an existing database

Legacy SQL Workflow

For projects that predate the Go migration framework, the original YAML→SQL→Goose workflow is still supported:

makemigrations init --sql          # create SQL-based project
makemigrations makemigrations_sql  # generate Goose-compatible SQL files
makemigrations goose up            # apply via Goose

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/amazing-feature
  3. Add tests for new functionality
  4. Ensure all tests pass: go test ./...
  5. Submit a pull request

📄 License

MIT License — see LICENSE file for details.


Ready to get started? Run makemigrations init in your project directory.
