Migration Guide: Switching to Drizzle-First Architecture

Source: data_layer/docs/MIGRATION_TO_DRIZZLE_FIRST.md


🎯 Overview

This guide helps you migrate from your current JSON Schema-first architecture to the new Drizzle-first architecture where Drizzle TypeScript schemas are the single source of truth.


📊 Current vs. New Architecture

Current (Before)

JSON Schema (manual editing)
    ↓
    ├── Pydantic Models (generated)
    ├── TypeScript Types (generated)
    ├── Drizzle Tables (generated)
    └── GraphQL SDL (generated)

New (After)

Drizzle Schema (TypeScript - manual editing)
    ↓
    ├── JSON Schema (generated)
    ├── Pydantic Models (generated)
    ├── SQLAlchemy Models (generated)
    ├── SQL DDL (generated)
    ├── GraphQL SDL (generated)
    └── Neo4j Cypher (generated)

🚀 Migration Steps

Phase 1: Set Up the New System

  1. Verify the generation script is working:

    python scripts/generate_from_drizzle.py --schema schemas/domain/drizzle/v1/examples/league_example.schema.ts
  2. Review the example output:

    ls -R schemas/generated/from_drizzle/
  3. Familiarize yourself with the example schema:

    cat schemas/domain/drizzle/v1/examples/league_example.schema.ts

Phase 2: Convert Existing JSON Schemas to Drizzle

For each JSON Schema file in schemas/domain/json/v1/, you'll need to create a corresponding Drizzle schema.

Example Conversion

Before (JSON Schema):

{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "title": "User",
  "type": "object",
  "properties": {
    "id": {
      "type": "string",
      "format": "uuid"
    },
    "email": {
      "type": "string",
      "format": "email",
      "maxLength": 255
    },
    "name": {
      "type": "string",
      "maxLength": 100
    },
    "role": {
      "type": "string",
      "enum": ["admin", "user"],
      "default": "user"
    },
    "created_at": {
      "type": "string",
      "format": "date-time"
    }
  },
  "required": ["email", "name"]
}

After (Drizzle Schema):

// schemas/domain/drizzle/v1/users.schema.ts
import { pgTable, uuid, varchar, timestamp } from "drizzle-orm/pg-core";
 
/**
 * Users in the system
 */
export const users = pgTable("users", {
  id: uuid("id").primaryKey().defaultRandom(),
  email: varchar("email", { length: 255 }).notNull().unique(),
  name: varchar("name", { length: 100 }).notNull(),
  role: varchar("role", { length: 20 }).default("user").notNull(),
  created_at: timestamp("created_at").defaultNow().notNull(),
  updated_at: timestamp("updated_at").defaultNow().notNull(),
});

Conversion Helper Script

You can use this Python script to help convert JSON Schemas to Drizzle:

#!/usr/bin/env python3
"""Helper script to convert JSON Schema to Drizzle TypeScript"""
 
import json
import sys
from pathlib import Path
 
def json_schema_to_drizzle(json_schema_path: Path) -> str:
    """Convert JSON Schema to Drizzle TypeScript"""
    with open(json_schema_path) as f:
        schema = json.load(f)
    
    title = schema.get("title", json_schema_path.stem)
    table_name = title.lower().replace(" ", "_")
    properties = schema.get("properties", {})
    required = schema.get("required", [])
    
    # Collect the pg-core column types that need to be imported
    type_set = set()
    
    for prop_name, prop_schema in properties.items():
        json_type = prop_schema.get("type", "string")
        
        if json_type == "string":
            if prop_schema.get("format") == "uuid":
                type_set.add("uuid")
            elif prop_schema.get("format") == "date-time":
                type_set.add("timestamp")
            elif "maxLength" in prop_schema:
                type_set.add("varchar")
            else:
                type_set.add("text")
        elif json_type == "integer":
            type_set.add("integer")
        elif json_type == "boolean":
            type_set.add("boolean")
        elif json_type == "object":
            type_set.add("jsonb")
    
    imports_str = "import { " + ", ".join(["pgTable"] + sorted(type_set)) + ' } from "drizzle-orm/pg-core";'
    
    # Generate columns
    columns = []
    for prop_name, prop_schema in properties.items():
        json_type = prop_schema.get("type", "string")
        col_def = f"  {prop_name}: "
        
        if json_type == "string":
            if prop_schema.get("format") == "uuid":
                col_def += f'uuid("{prop_name}")'
            elif prop_schema.get("format") == "date-time":
                col_def += f'timestamp("{prop_name}")'
            elif "maxLength" in prop_schema:
                length = prop_schema["maxLength"]
                col_def += f'varchar("{prop_name}", {{ length: {length} }})'
            else:
                col_def += f'text("{prop_name}")'
        elif json_type == "integer":
            col_def += f'integer("{prop_name}")'
        elif json_type == "boolean":
            col_def += f'boolean("{prop_name}")'
        elif json_type == "object":
            col_def += f'jsonb("{prop_name}")'
        
        # Add modifiers
        if prop_name in required:
            col_def += ".notNull()"
        
        if "default" in prop_schema:
            default_val = prop_schema["default"]
            if isinstance(default_val, str):
                col_def += f'.default("{default_val}")'
            elif isinstance(default_val, bool):
                col_def += f'.default({str(default_val).lower()})'
            else:
                col_def += f'.default({default_val})'
        
        columns.append(col_def + ",")
    
    columns_str = "\n".join(columns)
    
    # Generate final output
    output = f'''{imports_str}
 
/**
 * {title}
 */
export const {table_name} = pgTable("{table_name}", {{
{columns_str}
}});
'''
    
    return output
 
if __name__ == "__main__":
    if len(sys.argv) < 2:
        print("Usage: python convert_json_to_drizzle.py <json_schema_file>")
        sys.exit(1)
    
    json_path = Path(sys.argv[1])
    drizzle_code = json_schema_to_drizzle(json_path)
    print(drizzle_code)

Usage:

python convert_json_to_drizzle.py schemas/domain/json/v1/users.schema.json > schemas/domain/drizzle/v1/users.schema.ts

Phase 3: Organize Your Drizzle Schemas

Create a logical directory structure:

schemas/domain/drizzle/v1/
├── core/                    # Core entities
│   ├── users.schema.ts
│   └── organizations.schema.ts
│
├── sports/                  # Sports domain
│   ├── leagues.schema.ts
│   ├── teams.schema.ts
│   └── players.schema.ts
│
├── combat/                  # Combat sports vertical
│   ├── fighters.schema.ts
│   └── matches.schema.ts
│
└── racing/                  # Racing vertical
    ├── drivers.schema.ts
    └── races.schema.ts

Phase 4: Generate New Schemas

Once you've created your Drizzle schemas:

# Generate all targets
python scripts/generate_from_drizzle.py
 
# Or generate specific targets
python scripts/generate_from_drizzle.py --target pydantic
python scripts/generate_from_drizzle.py --target sql

Phase 5: Update Your Code

Python Backend

Before:

from database.schemas.generated.adapters.python.v1.users import User

After:

from schemas.generated.from_drizzle.pydantic.users_model import Users, UsersCreate, UsersUpdate

TypeScript/Zod

Before:

import { UserSchema } from '@/database/schemas/generated/adapters/typescript/v1/users'

After:

// Use Drizzle schema directly in TypeScript backend
import { users } from '@/database/schemas/domain/drizzle/v1/users.schema'
 
// Or use generated types for frontend
import type { Users } from '@/database/schemas/generated/from_drizzle/typescript/users'
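
If your TypeScript backend also runs its queries through drizzle-orm, the same table object doubles as the query definition, so nothing extra needs to be generated for that path. A minimal sketch, assuming a node-postgres connection and the users table from the conversion example above (the connection setup and DATABASE_URL variable are illustrative, not part of this repo):

// Minimal query sketch: the Drizzle schema is both the type source and the query table.
import { drizzle } from 'drizzle-orm/node-postgres';
import { eq } from 'drizzle-orm';
import { Pool } from 'pg';
import { users } from '@/database/schemas/domain/drizzle/v1/users.schema';

const db = drizzle(new Pool({ connectionString: process.env.DATABASE_URL }));

// The row type of the result is inferred directly from the schema definition.
export async function listAdmins() {
  return db.select().from(users).where(eq(users.role, 'admin'));
}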

Phase 6: Update Your Workflow

New Development Workflow

  1. Create/Edit Drizzle Schema:

    vim schemas/domain/drizzle/v1/my_feature/my_table.schema.ts
  2. Generate All Targets:

    python scripts/generate_from_drizzle.py
  3. Apply SQL Migration:

    psql -d mydatabase -f schemas/generated/from_drizzle/sql/schema.sql
  4. Use Generated Code:

    from schemas.generated.from_drizzle.pydantic.my_table_model import MyTable, MyTableCreate
     
    @app.post("/my-table")
    async def create(data: MyTableCreate):
        # Automatic validation!
        return await db.insert(data)

🔄 Backward Compatibility

During migration, you can maintain backward compatibility:

Option 1: Symlinks

# Create symlinks from old locations to new
ln -s schemas/generated/from_drizzle/pydantic/ database/schemas/generated/adapters/python/v1/

Option 2: Import Aliases

# In your code
from schemas.generated.from_drizzle.pydantic import users_model
User = users_model.Users  # expose the new model under the old name

Option 3: Gradual Migration

  • Keep both systems running in parallel
  • Migrate one domain at a time
  • Use feature flags to switch between old and new (see the sketch after this list)
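
A hedged sketch of the feature-flag option in the TypeScript backend, assuming the optional drizzle-zod package is installed; the USE_DRIZZLE_FIRST variable and the validateUser helper are illustrative names, not part of the migration tooling:

// Gradual migration: one validation entry point, flipped between legacy and new schemas.
import { createInsertSchema } from 'drizzle-zod';
import { users } from '@/database/schemas/domain/drizzle/v1/users.schema';
import { UserSchema as legacyUserSchema } from '@/database/schemas/generated/adapters/typescript/v1/users';

// Assumed flag; wire this to whatever feature-flag system you already use.
const useDrizzleFirst = process.env.USE_DRIZZLE_FIRST === 'true';

const userInsertSchema = useDrizzleFirst
  ? createInsertSchema(users) // Zod schema derived from the Drizzle table
  : legacyUserSchema;         // Zod schema from the old generation pipeline

export const validateUser = (input: unknown) => userInsertSchema.parse(input);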

πŸ“ Migration Checklist

For Each Schema (a combined example follows this checklist):

  • Convert JSON Schema to Drizzle TypeScript
  • Add JSDoc comments to Drizzle schema
  • Define foreign keys with references()
  • Add indexes for frequently queried columns
  • Add created_at and updated_at timestamps
  • Use UUIDs for primary keys
  • Generate all targets
  • Update imports in Python code
  • Update imports in TypeScript code
  • Test Pydantic validation
  • Apply SQL migration to database
  • Verify GraphQL schema (if using)
  • Update Neo4j constraints (if using)
  • Update documentation
  • Commit changes
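
A hedged sketch of a schema that ticks most of these boxes; the players table and its columns are illustrative rather than an existing schema, and the index callback syntax may differ slightly between drizzle-orm versions:

// schemas/domain/drizzle/v1/sports/players.schema.ts (illustrative)
import { pgTable, uuid, varchar, timestamp, index } from "drizzle-orm/pg-core";
import { teams } from "./teams.schema";

/**
 * Players and the team they belong to
 */
export const players = pgTable(
  "players",
  {
    id: uuid("id").primaryKey().defaultRandom(),                     // UUID primary key
    team_id: uuid("team_id").references(() => teams.id).notNull(),   // foreign key via references()
    name: varchar("name", { length: 100 }).notNull(),
    created_at: timestamp("created_at").defaultNow().notNull(),      // audit timestamps
    updated_at: timestamp("updated_at").defaultNow().notNull(),
  },
  (table) => ({
    teamIdx: index("players_team_id_idx").on(table.team_id),         // index for frequent lookups
  })
);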

System-Wide:

  • Update CI/CD pipelines
  • Update development documentation
  • Update onboarding guides
  • Archive old JSON Schemas
  • Remove old generation scripts
  • Update .gitignore for new output paths
  • Update import paths project-wide
  • Run tests
  • Deploy

πŸ› Troubleshooting

Issue: "Columns not being parsed"

Solution: Make sure your Drizzle syntax is correct:

// ✅ Correct
id: uuid("id").primaryKey().defaultRandom(),
 
// ❌ Wrong (missing column name string)
id: uuid().primaryKey(),

Issue: "Foreign keys not working"

Solution: Ensure you're using arrow function syntax:

// ✅ Correct
league_id: uuid("league_id").references(() => leagues.id),
 
// ❌ Wrong (direct reference)
league_id: uuid("league_id").references(leagues.id),

Issue: "Generated Pydantic model has wrong types"

Solution: Check the Drizzle column type:

// For timestamps
created_at: timestamp("created_at").defaultNow().notNull(),
 
// For JSONB
metadata: jsonb("metadata").default({}),

Issue: "SQL migration fails"

Solution: Check for:

  • Circular foreign key dependencies (create tables in dependency order; a sketch follows this list)
  • Missing referenced tables
  • Conflicting constraints
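
For the dependency-order case, declaring the referenced table before the table that points at it is usually enough, assuming the generator emits CREATE TABLE statements in declaration order (the table definitions below are illustrative):

import { pgTable, uuid, varchar } from "drizzle-orm/pg-core";

// Declared first, so its CREATE TABLE appears before the table that references it.
export const leagues = pgTable("leagues", {
  id: uuid("id").primaryKey().defaultRandom(),
  name: varchar("name", { length: 100 }).notNull(),
});

export const teams = pgTable("teams", {
  id: uuid("id").primaryKey().defaultRandom(),
  league_id: uuid("league_id").references(() => leagues.id).notNull(),
});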


🤔 FAQ

Q: Can I use Drizzle for queries in my Python backend?
A: No, Drizzle is TypeScript-only. Use the generated SQLAlchemy or Pydantic models for Python.

Q: Do I need to keep the JSON Schemas?
A: No, Drizzle replaces them as the source of truth. JSON Schemas are now generated.

Q: What about my existing Prisma schemas?
A: You can convert them to Drizzle or continue using Prisma alongside Drizzle.

Q: Can I use Drizzle with Supabase?
A: Yes! Generate the SQL DDL and apply it to your Supabase database.

Q: How do I handle schema versioning?
A: Use the v1/, v2/ directory structure and generate to different output folders.

Q: What if I need custom generation logic?
A: Edit scripts/generate_from_drizzle.py and add your custom generator class.


✅ Migration Complete!

Once you've migrated all schemas and updated your code:

  1. Remove old JSON Schema files
  2. Update documentation
  3. Celebrate! 🎉

You now have a single source of truth with automatic generation to all your target formats!


Need help? Check the Architecture Guide or review the example schema.
