5-minute tutorial
Migrate X Ads to Snowflake in 60 Seconds
Learn how to copy your X Ads data to Snowflake with a single command using ingestr - no code required.
Prerequisites
- Python 3.8 or higher installed
- X Ads account with API access
- Approved developer application
- OAuth 1.0a credentials
- Snowflake account with active warehouse
- User credentials with appropriate permissions
- Database and schema access rights
- Network policies allowing connections from your IP
Step 1: Install ingestr
Install ingestr in seconds using pip. Choose the method that works best for you:
Recommended: Using uv (fastest)
# Install uv first if you haven't already
pip install uv
# Run ingestr using uvx
uvx ingestr
Alternative: Global installation
# Install globally using uv
uv pip install --system ingestr
# Or using standard pip
pip install ingestr
Verify installation: Run ingestr --version to confirm it's installed correctly.
Step 2: Your First Migration
Let's copy a table from X Ads to Snowflake. This example shows a complete, working command you can adapt to your needs.
Set up your connections
X Ads connection format:
twitter-ads://api_key:api_secret@account_id?access_token=token&access_secret=secret
Parameters:
- api_key: X API key
- api_secret: X API secret
- account_id: Ads account identifier
- access_token: OAuth access token
- access_secret: OAuth access token secret
Snowflake connection format:
snowflake://user:password@account/database/schema?warehouse=warehouse_name
Parameters:
- user: Snowflake username
- password: User password
- account: Account identifier (including region)
- database: Target database name
- schema: Schema within the database
- warehouse: Compute warehouse to use
- role: Optional role to assume
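Passwords and secrets often contain characters like @, : or / that break URI parsing. A small Python sketch (with placeholder credentials) that percent-encodes each credential before assembling the connection strings:

```python
from urllib.parse import quote

# Placeholder credentials -- substitute your own values.
api_key = "my-key"
api_secret = "s3cret/with@chars"
password = "p@ss:word"

# quote(..., safe='') percent-encodes every reserved character,
# so the URI parser sees a single opaque credential.
source_uri = (
    f"twitter-ads://{quote(api_key, safe='')}:{quote(api_secret, safe='')}"
    f"@acc_123?access_token=tok&access_secret=sec"
)
dest_uri = (
    f"snowflake://myuser:{quote(password, safe='')}"
    f"@xy12345.us-east-1/mydb/public?warehouse=compute_wh"
)

print(source_uri)
print(dest_uri)
```

Pass the printed URIs to the --source-uri and --dest-uri flags; quoting them in single quotes in your shell also keeps & and ? intact.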
Run your first copy
Copy the entire campaigns table from X Ads to Snowflake:
ingestr ingest \
--source-uri 'twitter-ads://key:secret@acc_123?access_token=tok&access_secret=sec' \
--source-table 'campaigns' \
--dest-uri 'snowflake://myuser:mypassword@xy12345.us-east-1/mydb/public?warehouse=compute_wh' \
--dest-table 'raw.campaigns'
What this does:
- Connects to your X Ads account through the API
- Reads all data from the specified table
- Creates the table in Snowflake if needed
- Copies all rows to the destination
Command breakdown:
- --source-uri: Your source connection
- --source-table: Table to copy from
- --dest-uri: Your destination connection
- --dest-table: Where to write data
Step 3: Verify your data
After the migration completes, verify your data was copied correctly:
Check row count in Snowflake:
-- Run this in Snowflake
SELECT COUNT(*) as row_count
FROM raw.campaigns;
-- Check a sample of the data
SELECT *
FROM raw.campaigns
LIMIT 10;
Advanced Patterns
Once you've mastered the basics, use these patterns for production workloads.
Only copy new or updated records since the last sync. Perfect for daily updates.
ingestr ingest \
--source-uri 'twitter-ads://key:secret@acc_123?access_token=tok&access_secret=sec' \
--source-table 'campaigns' \
--dest-uri 'snowflake://myuser:mypassword@xy12345.us-east-1/mydb/public?warehouse=compute_wh' \
--dest-table 'raw.campaigns' \
--incremental-strategy merge \
--incremental-key updated_at \
--primary-key id
How it works: The merge strategy updates existing rows and inserts new ones based on the primary key. Only rows whose updated_at value has changed are processed.
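The merge behaviour can be illustrated in plain Python. This is a conceptual sketch with made-up rows, not ingestr's actual implementation:

```python
# Upsert semantics: existing rows are keyed by the primary key;
# each incoming row replaces a matching row or is inserted as new.
existing = {
    101: {"id": 101, "name": "Spring Sale", "updated_at": "2024-01-01"},
    102: {"id": 102, "name": "Brand Push", "updated_at": "2024-01-02"},
}

incoming = [
    {"id": 102, "name": "Brand Push v2", "updated_at": "2024-01-05"},  # changed row
    {"id": 103, "name": "Retargeting", "updated_at": "2024-01-05"},    # new row
]

for row in incoming:
    existing[row["id"]] = row  # update if the key exists, insert otherwise

print(sorted(existing))  # → [101, 102, 103]
```

After the merge, row 102 carries its updated values, row 103 is new, and row 101 is untouched; no duplicates are created.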
Common Use Cases
Ready-to-use commands for typical X Ads to Snowflake scenarios.
Daily Campaign Data Sync
Keep your analytics warehouse updated with the latest campaign data every night.
# Add this to your cron job or scheduler
ingestr ingest \
--source-uri 'twitter-ads://key:secret@acc_123?access_token=tok&access_secret=sec' \
--source-table 'campaigns' \
--dest-uri 'snowflake://myuser:mypassword@xy12345.us-east-1/mydb/public?warehouse=compute_wh' \
--dest-table 'analytics.campaigns' \
--incremental-strategy merge \
--incremental-key updated_at \
--primary-key id
Historical Data Migration
One-time migration of all historical records to your data warehouse.
# One-time full table copy
ingestr ingest \
--source-uri 'twitter-ads://key:secret@acc_123?access_token=tok&access_secret=sec' \
--source-table 'campaigns' \
--dest-uri 'snowflake://myuser:mypassword@xy12345.us-east-1/mydb/public?warehouse=compute_wh' \
--dest-table 'warehouse.campaigns_historical'
Development Environment Sync
Copy production data to your development Snowflake instance (with sensitive data excluded).
# Copy sample data to development
ingestr ingest \
--source-uri 'twitter-ads://key:secret@acc_123?access_token=tok&access_secret=sec' \
--source-table 'campaigns' \
--dest-uri 'snowflake://myuser:mypassword@xy12345.us-east-1/mydb/public?warehouse=compute_wh' \
--dest-table 'dev.campaigns' \
--limit 1000 # Only copy 1000 rows for testing
Troubleshooting Guide
Solutions to common issues when migrating from X Ads to Snowflake.
Connection refused or timeout errors
Check your connection details:
- Verify account identifier includes region (e.g., xy12345.us-east-1)
- Check if warehouse is running and not suspended
- Ensure user has USAGE privilege on warehouse
- Confirm network policies allow your IP address
Authentication failures
Common authentication issues:
- Verify your X API key, API secret, access token, and access secret are correct
- Confirm your developer application is approved and has Ads API access
- Regenerate OAuth tokens if they have expired or been revoked
- Double-check your Snowflake username, password, and the permissions of the role you connect with
Schema or data type mismatches
Handling data type differences:
- ingestr automatically handles most type conversions
- Snowflake: VARIANT type for semi-structured data
- Snowflake: ARRAY and OBJECT types for complex structures
- Snowflake: Automatic timezone conversion for TIMESTAMP_TZ
Performance issues with large tables
Optimize large data transfers:
- Use incremental loading to process data in chunks
- Run migrations during off-peak hours
- Split very large tables by date ranges using interval parameters
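One way to split a backfill by date range is to generate one ingestr command per month, sketched below in Python. This assumes ingestr's --interval-start/--interval-end flags; check `ingestr ingest --help` on your version for the exact names, and the URIs are placeholders:

```python
from datetime import date, timedelta

def month_ranges(start: date, end: date):
    """Yield (first_day, last_day) pairs covering [start, end], month by month."""
    current = start
    while current <= end:
        # Jump to the first day of the following month.
        next_month = (current.replace(day=1) + timedelta(days=32)).replace(day=1)
        yield current, min(next_month - timedelta(days=1), end)
        current = next_month

commands = []
for lo, hi in month_ranges(date(2024, 1, 1), date(2024, 3, 31)):
    commands.append(
        "ingestr ingest "
        "--source-uri '...' --source-table 'campaigns' "
        "--dest-uri '...' --dest-table 'raw.campaigns' "
        f"--interval-start {lo.isoformat()} --interval-end {hi.isoformat()}"
    )

for cmd in commands:
    print(cmd)
```

Each printed command covers one calendar month, so a failed chunk can be retried on its own without redoing the whole backfill.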
Ready to scale your data pipeline?
You've learned how to migrate data from X Ads to Snowflake with ingestr. For production workloads with monitoring, scheduling, and data quality checks, explore Bruin Cloud.