5-minute tutorial
Migrate Microsoft SQL Server to Google BigQuery in 60 Seconds
Learn how to copy your Microsoft SQL Server data to Google BigQuery with a single command using ingestr - no code required.
Prerequisites
- Python 3.8 or higher installed
- SQL Server instance running
- SQL Server Authentication or Windows Authentication configured
- TCP/IP protocol enabled in SQL Server Configuration Manager
- Firewall rules for port 1433
- Google Cloud project with BigQuery API enabled
- Service account with BigQuery Data Editor and Job User roles
- Downloaded service account JSON key file
- Dataset created in BigQuery (or permissions to create one)
Step 1: Install ingestr
Install ingestr in seconds. Choose the method that works best for you:
Recommended: Using uv (fastest)
# Install uv first if you haven't already
pip install uv
# Run ingestr using uvx
uvx ingestr
Alternative: Global installation
# Install globally using uv
uv pip install --system ingestr
# Or using standard pip
pip install ingestr
Verify installation: Run ingestr --version to confirm it's installed correctly.
Step 2: Your First Migration
Let's copy a table from Microsoft SQL Server to Google BigQuery. This example shows a complete, working command you can adapt to your needs.
Set up your connections
Microsoft SQL Server connection format:
mssql://username:password@host:port/database
Parameters:
- username: SQL Server login
- password: Login password
- host: Server name or IP address
- port: Port number (default 1433)
- database: Database name
- encrypt: Use encryption (true/false)
- trustServerCertificate: Trust the server's certificate without validation (true/false)
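Putting it together, a full source URI might look like this (host, credentials, and database are placeholders; the options are appended as query parameters per the list above):
mssql://report_user:S3cret@sqlserver.example.com:1433/SalesDB?encrypt=true&trustServerCertificate=true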
Google BigQuery connection format:
bigquery://project-id?credentials_path=/path/to/service-account.json
Parameters:
- project-id: Your Google Cloud project ID
- credentials_path: Path to the service account JSON key file
- location: Optional dataset location (e.g., US, EU)
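For example, a destination URI that pins the dataset location to the EU (project ID and key path are placeholders):
bigquery://my-analytics-project?credentials_path=/home/user/keys/service-account.json&location=EU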
BigQuery Setup Required
Before running the command:
- Create a service account in Google Cloud Console
- Grant it BigQuery Data Editor and Job User roles
- Download the JSON key file
- Use the path to this file in your connection string
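If you prefer the command line over the console, the same setup can be scripted with the gcloud CLI. A minimal sketch, assuming gcloud is installed and authenticated; the project ID and service account name are placeholders:
# Create a service account for ingestr (name is a placeholder)
gcloud iam service-accounts create ingestr-loader --project=my-project
# Grant the two BigQuery roles it needs
gcloud projects add-iam-policy-binding my-project \
--member='serviceAccount:ingestr-loader@my-project.iam.gserviceaccount.com' \
--role='roles/bigquery.dataEditor'
gcloud projects add-iam-policy-binding my-project \
--member='serviceAccount:ingestr-loader@my-project.iam.gserviceaccount.com' \
--role='roles/bigquery.jobUser'
# Download the JSON key file referenced in the connection string
gcloud iam service-accounts keys create /path/to/credentials.json \
--iam-account='ingestr-loader@my-project.iam.gserviceaccount.com'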
Run your first copy
Copy the entire Customers table from Microsoft SQL Server to Google BigQuery:
ingestr ingest \
--source-uri 'mssql://sa:MyPass123@localhost:1433/AdventureWorks' \
--source-table 'dbo.Customers' \
--dest-uri 'bigquery://my-project?credentials_path=/path/to/credentials.json' \
--dest-table 'raw.Customers'
What this does:
- Connects to your Microsoft SQL Server database
- Reads all data from the specified table
- Creates the table in Google BigQuery if needed
- Copies all rows to the destination
Command breakdown:
- --source-uri: Your source database connection
- --source-table: The table to copy from
- --dest-uri: Your destination connection
- --dest-table: Where to write the data
Step 3: Verify your data
After the migration completes, verify your data was copied correctly:
Check row count in Google BigQuery:
-- Run this in BigQuery console
SELECT COUNT(*) as row_count
FROM `raw.Customers`;
-- Check a sample of the data
SELECT *
FROM `raw.Customers`
LIMIT 10;
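To compare against the source, run the same count on SQL Server, for example with sqlcmd (assuming it is installed; connection details match the example command above):
sqlcmd -S localhost,1433 -U sa -P 'MyPass123' -d AdventureWorks \
-Q "SELECT COUNT(*) AS row_count FROM dbo.Customers"
The two counts should match once the migration has finished.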
Advanced Patterns
Once you've mastered the basics, use these patterns for production workloads.
Incremental Loading
Only copy new or updated records since the last sync. Perfect for daily updates.
ingestr ingest \
--source-uri 'mssql://sa:MyPass123@localhost:1433/AdventureWorks' \
--source-table 'dbo.orders' \
--dest-uri 'bigquery://my-project?credentials_path=/path/to/credentials.json' \
--dest-table 'raw.orders' \
--incremental-strategy merge \
--incremental-key updated_at \
--primary-key order_id
How it works: The merge strategy updates existing rows and inserts new ones based on the primary key. Only rows whose updated_at value is newer than the last sync are fetched from the source.
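If your source table is append-only (for example, an immutable events or audit log table), ingestr also offers an append strategy that skips the merge step entirely. A sketch reusing the same connections; the events table and created_at column are hypothetical:
ingestr ingest \
--source-uri 'mssql://sa:MyPass123@localhost:1433/AdventureWorks' \
--source-table 'dbo.events' \
--dest-uri 'bigquery://my-project?credentials_path=/path/to/credentials.json' \
--dest-table 'raw.events' \
--incremental-strategy append \
--incremental-key created_at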
Common Use Cases
Ready-to-use commands for typical Microsoft SQL Server to Google BigQuery scenarios.
Daily Customer Data Sync
Keep your analytics warehouse updated with the latest customer information every night.
# Add this to your cron job or scheduler
ingestr ingest \
--source-uri 'mssql://sa:MyPass123@localhost:1433/AdventureWorks' \
--source-table 'dbo.customers' \
--dest-uri 'bigquery://my-project?credentials_path=/path/to/credentials.json' \
--dest-table 'analytics.customers' \
--incremental-strategy merge \
--incremental-key updated_at \
--primary-key customer_id
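For instance, a crontab entry that runs this sync at 2 AM every night could look like the following single line (paths are placeholders; consider keeping the password out of the crontab, e.g. in an environment file):
0 2 * * * /usr/local/bin/ingestr ingest --source-uri 'mssql://sa:MyPass123@localhost:1433/AdventureWorks' --source-table 'dbo.customers' --dest-uri 'bigquery://my-project?credentials_path=/path/to/credentials.json' --dest-table 'analytics.customers' --incremental-strategy merge --incremental-key updated_at --primary-key customer_id >> /var/log/ingestr.log 2>&1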
Historical Data Migration
One-time migration of all historical records to your data warehouse.
# One-time full table copy
ingestr ingest \
--source-uri 'mssql://sa:MyPass123@localhost:1433/AdventureWorks' \
--source-table 'dbo.transactions' \
--dest-uri 'bigquery://my-project?credentials_path=/path/to/credentials.json' \
--dest-table 'warehouse.transactions_historical'
Development Environment Sync
Copy a sample of production data to your development Google BigQuery project for testing.
# Copy sample data to development
ingestr ingest \
--source-uri 'mssql://sa:MyPass123@localhost:1433/AdventureWorks' \
--source-table 'dbo.products' \
--dest-uri 'bigquery://dev-project?credentials_path=/path/to/credentials.json' \
--dest-table 'dev.products' \
--limit 1000 # Only copy 1000 rows for testing
Troubleshooting Guide
Solutions to common issues when migrating from Microsoft SQL Server to Google BigQuery.
Connection refused or timeout errors
Check your connection details:
- Enable TCP/IP in SQL Server Configuration Manager
- Check that the SQL Server Browser service is running
- Verify Windows Firewall allows SQL Server traffic on port 1433
- Confirm the host, port, and database name in your connection string
- Check that the BigQuery API is enabled in your Google Cloud project
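Two quick checks from the machine running ingestr can narrow the problem down (assuming the nc and sqlcmd utilities are available; host and credentials are the ones from the examples):
# Is the SQL Server port reachable at all?
nc -zv localhost 1433
# Can you log in with the same credentials ingestr uses?
sqlcmd -S localhost,1433 -U sa -P 'MyPass123' -Q "SELECT 1"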
Authentication failures
Common authentication issues:
- Ensure Mixed Mode authentication is enabled if you're using a SQL Server login
- Double-check the username and password in your source URI
- Verify the project ID matches your Google Cloud project
- Ensure the service account has the BigQuery Data Editor and Job User roles
- Confirm the credentials file path is absolute, not relative
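To test the BigQuery side in isolation, activate the service account key with the Google Cloud SDK and list datasets; a failure here points at the key, the roles, or the project ID rather than at ingestr:
# Authenticate as the service account
gcloud auth activate-service-account --key-file=/path/to/credentials.json
# List datasets in the project using the bq CLI
bq ls --project_id=my-project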
Schema or data type mismatches
Handling data type differences:
- ingestr automatically handles most type conversions
- Microsoft SQL Server: NVARCHAR and other Unicode string types map to BigQuery STRING
- Microsoft SQL Server: specialized types such as hierarchyid, spatial types, and XML may need to be cast to strings (see the sketch below)
- Google BigQuery: JSON data lands as STRING (use BigQuery's JSON functions to query it)
- Google BigQuery: arrays are converted to JSON arrays
- Google BigQuery: TIMESTAMP values are preserved with timezone information
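For columns that don't convert cleanly, one workaround is to cast them in the source. Recent ingestr versions accept a SQL query as the source table via a query: prefix; treat the syntax below as illustrative and verify it against your installed version's documentation (the OrgChart table and org_node column are hypothetical):
ingestr ingest \
--source-uri 'mssql://sa:MyPass123@localhost:1433/AdventureWorks' \
--source-table 'query:SELECT id, CAST(org_node AS NVARCHAR(4000)) AS org_node FROM dbo.OrgChart' \
--dest-uri 'bigquery://my-project?credentials_path=/path/to/credentials.json' \
--dest-table 'raw.org_chart'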
Performance issues with large tables
Optimize large data transfers:
- Use incremental loading to process data in chunks
- Run migrations during off-peak hours
- Split very large tables by date ranges using the interval parameters, as shown below
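A sketch of the date-range approach, using ingestr's --interval-start and --interval-end flags to backfill one quarter at a time (dates, table, and incremental key are placeholders; repeat with the next range until you're caught up):
ingestr ingest \
--source-uri 'mssql://sa:MyPass123@localhost:1433/AdventureWorks' \
--source-table 'dbo.transactions' \
--dest-uri 'bigquery://my-project?credentials_path=/path/to/credentials.json' \
--dest-table 'warehouse.transactions_historical' \
--incremental-key transaction_date \
--interval-start '2023-01-01' \
--interval-end '2023-04-01'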
Ready to scale your data pipeline?
You've learned how to migrate data from Microsoft SQL Server to Google BigQuery with ingestr. For production workloads with monitoring, scheduling, and data quality checks, explore Bruin Cloud.