A Go-first database migration tool with a Django-style workflow. Define your schema in YAML, generate type-safe Go migration files, and run them in-process — no Go toolchain required at runtime, no compiled binary to ship.
How migrations run. Generated migration files are real `.go` source — your IDE, `gopls`, and `go vet` treat them as ordinary Go. At runtime, `makemigrations migrate` does not invoke `go build`. Instead it loads each `.go` file with yaegi, an embedded Go interpreter, and runs the migrations in the `makemigrations` process. The language inside migration files is "Go-like" — yaegi implements the Go spec but is not the official `gc` compiler, so a few features (cgo, deep reflection, some generics edge cases) behave differently. For migrations generated by `makemigrations makemigrations` this is invisible; for hand-edited migrations that import third-party packages, see Extending the yaegi Symbol Map.
- 🔒 Type-safe at edit time: Migrations are real Go files — caught by your IDE and `go vet` before they ever run
- ⚡ No build step at runtime: yaegi interprets the `.go` files directly; no `go build`, no temporary binary, no GOWORK juggling
- 🗄️ Database-agnostic schema: Write YAML once, deploy to PostgreSQL, MySQL, SQLite, or SQL Server
- 🔀 DAG-based ordering: Migrations form a dependency graph so parallel branches merge cleanly
- 🔄 Auto change detection: Diff YAML schemas, generate only what changed
- ⚠️ Safe destructive ops: Field removals, table drops, and renames require explicit review
- 🛠 Optional standalone binary: The generated `migrations/` directory is still a buildable Go module, so you can `go build` it for IDE type-checking or as an escape hatch
```shell
go install github.com/ocomsoft/makemigrations@latest
cd your-project
makemigrations init
```

This creates:
```
your-project/
└── migrations/
    ├── main.go   ← optional fallback entry point (`go build` still works)
    └── go.mod    ← dedicated migrations module (used by your IDE / gopls)
```
`schema/schema.yaml`:

```yaml
defaults:
  postgresql:
    new_uuid: gen_random_uuid()
  mysql:
    new_uuid: uuid()
  sqlite:
    new_uuid: (lower(hex(randomblob(16))))

tables:
  - name: users
    fields:
      - name: id
        type: uuid
        primary_key: true
        default: new_uuid
      - name: email
        type: varchar
        length: 255
        nullable: false
      - name: created_at
        type: timestamp
        auto_create: true

  - name: posts
    fields:
      - name: id
        type: uuid
        primary_key: true
        default: new_uuid
      - name: title
        type: varchar
        length: 200
        nullable: false
      - name: user_id
        type: foreign_key
        foreign_key:
          table: users
          on_delete: CASCADE
```

The `defaults` section maps symbolic names (like `new_uuid`) to database-specific SQL expressions. Fields reference them by name — makemigrations resolves the correct expression for each target database at migration time.
```shell
makemigrations makemigrations --name "initial"
# Creates: migrations/0001_initial.go
```

```shell
export DATABASE_URL="postgresql://user:pass@localhost/mydb"
makemigrations migrate up
```

`makemigrations migrate` interprets the migration files in-process using yaegi and runs the embedded migration App. No `go build`, no temporary binary.
```shell
# 1. Edit your YAML schema
vim schema/schema.yaml

# 2. Preview what will be generated
makemigrations makemigrations --dry-run

# 3. Generate the migration
makemigrations makemigrations --name "add user preferences"
# Creates: migrations/0004_add_user_preferences.go

# 4. Review the SQL before applying
makemigrations migrate showsql

# 5. Apply
makemigrations migrate up

# 6. Verify
makemigrations migrate status
```

`makemigrations migrate` interprets the migration files in-process via yaegi and runs the embedded App. All arguments are forwarded:
```shell
makemigrations migrate up                      # apply all pending
makemigrations migrate up --to 0003_add_index  # apply up to a specific migration
makemigrations migrate down                    # roll back one
makemigrations migrate down --steps 3          # roll back multiple
makemigrations migrate status                  # show applied / pending
makemigrations migrate showsql                 # print SQL without running it
makemigrations migrate fake 0001_initial       # mark applied without running SQL
makemigrations migrate dag                     # show migration dependency graph
```

```
your-project/
├── schema/
│   └── schema.yaml        ← your YAML schema definition
├── migrations/
│   ├── main.go            ← binary entry point (generated by init)
│   ├── go.mod             ← dedicated module (generated by init)
│   ├── 0001_initial.go    ← migration files (generated by makemigrations)
│   ├── 0002_add_posts.go
│   └── 0003_add_index.go
├── go.mod
└── main.go
```
| Database | Status | Notes |
|---|---|---|
| PostgreSQL | ✅ Full | UUID, JSONB, arrays, advanced types |
| MySQL | ✅ Supported | JSON, AUTO_INCREMENT, InnoDB |
| SQLite | ✅ Supported | Simplified types, basic constraints |
| SQL Server | ✅ Supported | UNIQUEIDENTIFIER, NVARCHAR, BIT |
| Amazon Redshift | ✅ Provider ready | SUPER JSON, IDENTITY sequences |
| ClickHouse | ✅ Provider ready | MergeTree engine, Nullable types |
| TiDB | ✅ Provider ready | MySQL-compatible, distributed |
| Vertica | ✅ Provider ready | Columnar analytics |
| YDB (Yandex) | ✅ Provider ready | Optional, native JSON |
| Turso | ✅ Provider ready | Edge SQLite |
| StarRocks | ✅ Provider ready | MPP analytics, OLAP |
| Aurora DSQL | ✅ Provider ready | AWS serverless, PostgreSQL-compatible |
PostgreSQL has been tested against real database instances. All other providers have comprehensive unit tests but may need additional validation for production.
When a field removal, table drop, or rename is detected, makemigrations prompts before generating:
```
⚠ Destructive operation detected: field_removed on "users" (field: "legacy_col")

  1) Generate — include SQL in migration
  2) Review   — include with // REVIEW comment
  3) Omit     — skip SQL; schema state still advances (SchemaOnly)
  4) Exit     — cancel migration generation
  5) All      — generate all remaining destructive ops without prompting

Choice [1-5]:
```
When two developers generate migrations concurrently, the DAG develops branches:

```
0001_initial
├── 0002_add_messaging   (developer A)
└── 0003_add_payments    (developer B)
```
Resolve with a merge migration:
```shell
makemigrations makemigrations --merge
# Creates: migrations/0004_merge_0002_add_messaging_and_0003_add_payments.go
```

If the schema is already applied to your database, fake the historical migrations:
```shell
makemigrations migrate fake 0001_initial
makemigrations migrate fake 0002_add_phone
makemigrations migrate status
```

Or use the dedicated conversion command for more control:
```shell
makemigrations migrate-to-go --dir migrations/
```

`makemigrations migrate` reads connection details from `DATABASE_URL` and `DB_TYPE`:
```shell
export DATABASE_URL="postgresql://user:pass@localhost/mydb"
export DB_TYPE=postgresql   # optional, defaults to postgresql
```

If you prefer the optional fallback path of compiling `migrations/` into a standalone binary, edit `migrations/main.go` to read additional vars (`DB_HOST`, `DB_PORT`, `DB_USER`, …) — see the Manual Build Guide.
`migrations/makemigrations.config.yaml`:

```yaml
database:
  type: postgresql
migration:
  directory: migrations
  include_down_sql: true
output:
  verbose: false
```

See the Configuration Guide for complete options.
- Installation Guide
- Schema Format Guide — complete YAML schema reference
- Configuration Guide
- Manual Build Guide — GOWORK/GOTOOLCHAIN details for CI/CD
- Extending the yaegi Symbol Map — let interpreted migrations import third-party packages
| Command | Description |
|---|---|
| init | Bootstrap the migrations/ directory |
| makemigrations | Generate .go migration files from YAML schema |
| migrate | Run migrations in-process via the yaegi interpreter |
| migrate-to-go | Convert existing Goose SQL migrations to Go |
| struct2schema | Generate YAML schemas from Go structs |
| dump_sql | Preview generated SQL from schemas |
| db2schema | Reverse-engineer schema from existing database |
For projects that predate the Go migration framework, the original YAML→SQL→Goose workflow is still supported:
```shell
makemigrations init --sql            # create SQL-based project
makemigrations makemigrations_sql    # generate Goose-compatible SQL files
makemigrations goose up              # apply via Goose
```

- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Add tests for new functionality
- Ensure all tests pass: `go test ./...`
- Submit a pull request
MIT License — see LICENSE file for details.
Ready to get started? Run `makemigrations init` in your project directory.