5-minute tutorial
Migrate Snowflake to MySQL in 60 Seconds
Learn how to copy your Snowflake data to MySQL with a single command using ingestr - no code required.
Prerequisites
- Python 3.8 or higher installed
- Snowflake account with active warehouse
- User credentials with appropriate permissions
- Database and schema access rights
- Network policies allowing connections from your IP
- MySQL server running and accessible
- User with appropriate GRANT permissions
- Database exists or permission to create
- Network access to MySQL port
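The Python requirement from the list above can be checked with a one-liner before you install anything (a quick sketch; it exits non-zero if the interpreter is too old):

```shell
# Check that the local Python meets ingestr's 3.8+ requirement
python3 -c 'import sys; ok = sys.version_info >= (3, 8); print("Python %d.%d:" % sys.version_info[:2], "OK" if ok else "too old, need 3.8+"); sys.exit(0 if ok else 1)'
```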
Step 1: Install ingestr
Install ingestr in seconds. Choose the method that works best for you:
Recommended: Using uv (fastest)
# Install uv first if you haven't already
pip install uv
# Run ingestr using uvx
uvx ingestr
Alternative: Global installation
# Install globally using uv
uv pip install --system ingestr
# Or using standard pip
pip install ingestr
Verify installation: run ingestr --version to confirm it's installed correctly.
Step 2: Your First Migration
Let's copy a table from Snowflake to MySQL. This example shows a complete, working command you can adapt to your needs.
Set up your connections
Snowflake connection format:
snowflake://user:password@account/database/schema?warehouse=warehouse_name
Parameters:
- user: Snowflake username
- password: User password
- account: Account identifier (including region)
- database: Target database name
- schema: Schema within the database
- warehouse: Compute warehouse to use
- role: Optional role to assume
MySQL connection format:
mysql://username:password@host:port/database
Parameters:
- username: MySQL user
- password: User password
- host: Server hostname or IP
- port: Server port (default 3306)
- database: Database name
- charset: Optional character set
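One gotcha with both formats: passwords containing URI-special characters (such as @, /, or :) must be percent-encoded, or the URI will be parsed incorrectly. A quick way to encode one, using Python's standard library (the password below is a made-up example):

```shell
# Percent-encode a password before embedding it in a connection URI
ENCODED_PW=$(python3 -c "import urllib.parse, sys; print(urllib.parse.quote(sys.argv[1], safe=''))" 'p@ss/word!')
echo "mysql://myuser:${ENCODED_PW}@localhost:3306/myapp"
# prints: mysql://myuser:p%40ss%2Fword%21@localhost:3306/myapp
```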
Run your first copy
Copy the entire raw_events table from Snowflake to MySQL:
ingestr ingest \
--source-uri 'snowflake://myuser:mypassword@xy12345.us-east-1/mydb/public?warehouse=compute_wh' \
--source-table 'raw_events' \
--dest-uri 'mysql://root:password@localhost:3306/myapp' \
--dest-table 'raw.raw_events'
What this does:
- Connects to your Snowflake database
- Reads all data from the specified table
- Creates the table in MySQL if needed
- Copies all rows to the destination
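To avoid repeating long connection strings (and typos in them), you can keep the URIs in shell variables. Shown here with echo as a dry run, using the same example credentials; drop the echo to actually run it:

```shell
# Reuse connection strings via variables; 'echo' makes this a dry run
SNOWFLAKE_URI='snowflake://myuser:mypassword@xy12345.us-east-1/mydb/public?warehouse=compute_wh'
MYSQL_URI='mysql://root:password@localhost:3306/myapp'
echo ingestr ingest \
  --source-uri "$SNOWFLAKE_URI" \
  --source-table raw_events \
  --dest-uri "$MYSQL_URI" \
  --dest-table raw.raw_events
```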
Command breakdown:
- --source-uri: the source database connection string
- --source-table: the table to copy from
- --dest-uri: the destination connection string
- --dest-table: where to write the data
Step 3: Verify your data
After the migration completes, verify your data was copied correctly:
Check row count in MySQL:
-- Run this in MySQL
SELECT COUNT(*) as row_count
FROM raw.raw_events;
-- Check a sample of the data
SELECT *
FROM raw.raw_events
LIMIT 10;
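If you run migrations on a schedule, the count check is worth automating. A minimal sketch of the comparison step (the two numbers would come from SELECT COUNT(*) against Snowflake and MySQL respectively):

```shell
# Fail loudly when source and destination row counts differ
compare_counts() {
  if [ "$1" -ne "$2" ]; then
    echo "MISMATCH: source=$1 dest=$2"
    return 1
  fi
  echo "OK: $1 rows on both sides"
}

compare_counts 1000 1000   # prints: OK: 1000 rows on both sides
```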
Advanced Patterns
Once you've mastered the basics, use these patterns for production workloads.
Incremental Loading
Copy only new or updated records since the last sync. Perfect for daily updates.
ingestr ingest \
--source-uri 'snowflake://myuser:mypassword@xy12345.us-east-1/mydb/public?warehouse=compute_wh' \
--source-table 'public.orders' \
--dest-uri 'mysql://root:password@localhost:3306/myapp' \
--dest-table 'raw.orders' \
--incremental-strategy merge \
--incremental-key updated_at \
--primary-key order_id
How it works: The merge strategy updates existing rows and inserts new ones based on the primary key. Only rows where updated_at
has changed will be processed.
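In MySQL terms, the merge behaves roughly like an upsert. The following is only an illustration of the semantics (the column names are assumed for the example), not the literal SQL ingestr executes:

```sql
-- Illustrative upsert: insert the row, or update it if order_id already exists
INSERT INTO raw.orders (order_id, status, updated_at)
VALUES (42, 'shipped', '2024-05-01 12:00:00')
ON DUPLICATE KEY UPDATE
  status = VALUES(status),
  updated_at = VALUES(updated_at);
```

Note that this style of upsert depends on a primary or unique key on order_id, which is why the command above passes --primary-key.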
Common Use Cases
Ready-to-use commands for typical Snowflake to MySQL scenarios.
Daily Customer Data Sync
Keep your analytics database updated with the latest customer information every night.
# Add this to your cron job or scheduler
ingestr ingest \
--source-uri 'snowflake://myuser:mypassword@xy12345.us-east-1/mydb/public?warehouse=compute_wh' \
--source-table 'public.customers' \
--dest-uri 'mysql://root:password@localhost:3306/myapp' \
--dest-table 'analytics.customers' \
--incremental-strategy merge \
--incremental-key updated_at \
--primary-key customer_id
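For the scheduling itself, wrap the command above in a script and reference it from crontab (the script path and log file below are placeholders for your own setup):

```shell
# crontab -e   (fields: minute hour day-of-month month day-of-week)
# Run the customer sync every night at 02:00 and keep a log
0 2 * * * /opt/ingestr/sync_customers.sh >> /var/log/sync_customers.log 2>&1
```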
Historical Data Migration
One-time migration of all historical records into MySQL.
# One-time full table copy
ingestr ingest \
--source-uri 'snowflake://myuser:mypassword@xy12345.us-east-1/mydb/public?warehouse=compute_wh' \
--source-table 'public.transactions' \
--dest-uri 'mysql://root:password@localhost:3306/myapp' \
--dest-table 'warehouse.transactions_historical'
Development Environment Sync
Copy production data to your development MySQL instance (with sensitive data excluded).
# Copy sample data to development
ingestr ingest \
--source-uri 'snowflake://myuser:mypassword@xy12345.us-east-1/mydb/public?warehouse=compute_wh' \
--source-table 'public.products' \
--dest-uri 'mysql://root:password@localhost:3306/myapp' \
--dest-table 'dev.products' \
--limit 1000 # Only copy 1000 rows for testing
Troubleshooting Guide
Solutions to common issues when migrating from Snowflake to MySQL.
Connection refused or timeout errors
Check your connection details:
Snowflake:
- Verify the account identifier includes the region (e.g., xy12345.us-east-1)
- Check that the warehouse is running and not suspended
- Ensure the user has the USAGE privilege on the warehouse
- Confirm network policies allow your IP address
MySQL:
- Check bind-address in my.cnf (it should not be 127.0.0.1 for remote connections)
- Verify the user has permission to connect from your host
- Ensure port 3306 is not blocked by a firewall
- Test with the mysql client to isolate issues
Authentication failures
Common authentication issues:
- Double-check the username and password in each connection URI
- Percent-encode special characters (such as @, /, or :) in passwords
- Snowflake: confirm the password has not expired and that any role you specify is granted to the user
- MySQL: remember accounts are host-specific; 'user'@'%' and 'user'@'localhost' are different users
- Test credentials directly with the snowsql or mysql clients to isolate the issue
Schema or data type mismatches
Handling data type differences:
- ingestr automatically handles most type conversions
- Snowflake: VARIANT type for semi-structured data
- Snowflake: ARRAY and OBJECT types for complex structures
- Snowflake: Automatic timezone conversion for TIMESTAMP_TZ
- MySQL: JSON type available in MySQL 5.7+
- MySQL: DATETIME vs TIMESTAMP timezone handling
- MySQL: Character set and collation settings
- MySQL: Strict mode affects data validation
Performance issues with large tables
Optimize large data transfers:
- Use incremental loading to process data in chunks
- Run migrations during off-peak hours
- Split very large tables by date ranges using interval parameters
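The last tip can be scripted: loop over date windows and run one bounded ingest per window. A dry-run sketch (echo prints the commands instead of running them; the --interval-start/--interval-end flag names should be confirmed against ingestr ingest --help for your version, and the URIs are the tutorial's example values):

```shell
# Backfill a large table one month per run (dry run via echo).
# Assumes GNU date for the '+N month' arithmetic.
SNOWFLAKE_URI='snowflake://myuser:mypassword@xy12345.us-east-1/mydb/public?warehouse=compute_wh'
MYSQL_URI='mysql://root:password@localhost:3306/myapp'
start='2024-01-01'
for i in 0 1 2; do
  s=$(date -d "$start +$i month" +%F)
  e=$(date -d "$start +$((i + 1)) month" +%F)
  echo ingestr ingest \
    --source-uri "$SNOWFLAKE_URI" \
    --source-table public.transactions \
    --dest-uri "$MYSQL_URI" \
    --dest-table warehouse.transactions_historical \
    --interval-start "$s" \
    --interval-end "$e"
done
```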
Ready to scale your data pipeline?
You've learned how to migrate data from Snowflake to MySQL with ingestr. For production workloads with monitoring, scheduling, and data quality checks, explore Bruin Cloud.