5-minute tutorial

Migrate Google Sheets to Snowflake in 60 Seconds

Learn how to copy your Google Sheets data to Snowflake with a single command using ingestr - no code required.

One command · Zero code · Production ready

What you'll learn

How to install and set up ingestr in seconds
Connect to Google Sheets and Snowflake with proper authentication
Copy entire tables or specific data with a single command
Set up incremental loading for continuous data synchronization

Prerequisites

  • Python 3.8 or higher installed
  • Google Cloud service account with the Google Sheets API enabled in your GCP project
  • Spreadsheet shared with the service account email
  • Snowflake account with an active warehouse
  • Snowflake user credentials with appropriate permissions
  • Database and schema access rights
  • Network policies allowing connections from your IP

Step 1: Install ingestr

Install ingestr in seconds using uv or pip. Choose the method that works best for you:

Recommended: Using uv (fastest)

# Install uv first if you haven't already
pip install uv

# Run ingestr using uvx
uvx ingestr

Alternative: Global installation

# Install globally using uv
uv pip install --system ingestr

# Or using standard pip
pip install ingestr

Verify installation: Run ingestr --version to confirm it's installed correctly.
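
For example, depending on how you installed it (the uvx form applies if you skipped the global install):

# If ingestr is installed globally
ingestr --version

# If you run it through uvx instead
uvx ingestr --version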

Step 2: Your First Migration

Let's copy a table from Google Sheets to Snowflake. This example shows a complete, working command you can adapt to your needs.

Set up your connections

Google Sheets connection format:

googlesheets://credentials_path@spreadsheet_id/sheet_name

Parameters:

  • credentials_path: Service account JSON key file
  • spreadsheet_id: ID from the sheet URL
  • sheet_name: Name of the specific sheet tab

Snowflake connection format:

snowflake://user:password@account/database/schema?warehouse=warehouse_name

Parameters:

  • user: Snowflake username
  • password: User password
  • account: Account identifier (including region)
  • database: Target database name
  • schema: Schema within the database
  • warehouse: Compute warehouse to use
  • role: Optional role to assume

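Putting the two formats together, example connection strings might look like the ones below. The credentials path, spreadsheet ID, account identifier, and password are placeholders to replace with your own values, and role is passed as an extra query parameter alongside warehouse, as the parameter list above suggests:

# Example URIs stored in environment variables (placeholder values)
export SOURCE_URI='googlesheets:///path/to/creds.json@1a2b3c4d5e/Sheet1'
export DEST_URI='snowflake://myuser:mypassword@myaccount.us-east-1/mydb/public?warehouse=compute_wh&role=loader_role'

Keeping the URIs in environment variables keeps the password out of your shell history and lets you reuse them in the commands below, for example --dest-uri "$DEST_URI".
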
Google Sheets Setup Required

Before running the command:

  1. Create a service account in Google Cloud Console
  2. Enable the Google Sheets API for your project
  3. Download the service account's JSON key file
  4. Share the spreadsheet with the service account email
  5. Use the path to the key file in your connection string

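If you prefer the command line to the Cloud Console, a minimal sketch of the same setup with the gcloud CLI looks like this; the project ID my-project and service account name sheets-reader are hypothetical and should be replaced with your own:

# Enable the Google Sheets API in your project (hypothetical project ID)
gcloud services enable sheets.googleapis.com --project my-project

# Create the service account that ingestr will authenticate as
gcloud iam service-accounts create sheets-reader --project my-project

# Download a JSON key for it; its path becomes credentials_path in the URI
gcloud iam service-accounts keys create /path/to/creds.json \
    --iam-account sheets-reader@my-project.iam.gserviceaccount.com

Finally, share the spreadsheet with sheets-reader@my-project.iam.gserviceaccount.com from the sheet's Share dialog, just as you would with a human collaborator.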

Run your first copy

Copy an entire sheet tab (Sheet1 in this example) from Google Sheets to Snowflake:

ingestr ingest \
    --source-uri 'googlesheets:///path/to/creds.json@1a2b3c4d5e/Sheet1' \
    --source-table 'Sheet1' \
    --dest-uri 'snowflake://myuser:mypassword@myaccount.us-east-1/mydb/public?warehouse=compute_wh' \
    --dest-table 'raw.Sheet1'

What this does:

  • Connects to your Google Sheets spreadsheet
  • Reads all data from the specified sheet tab
  • Creates the table in Snowflake if needed
  • Copies all rows to the destination

Command breakdown:

  • --source-uri: Connection string for the source (Google Sheets)
  • --source-table: Sheet tab to copy from
  • --dest-uri: Connection string for the destination (Snowflake)
  • --dest-table: Table to write the data to in Snowflake

Step 3: Verify your data

After the migration completes, verify your data was copied correctly:

Check row count in Snowflake:

-- Run this in Snowflake
SELECT COUNT(*) as row_count 
FROM raw.Sheet1;

-- Check a sample of the data  
SELECT * 
FROM raw.Sheet1 
LIMIT 10;
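
If you'd rather check from the terminal than the Snowflake UI, the same queries can be run non-interactively with the snowsql CLI, assuming it is installed and the placeholder account, user, and warehouse are swapped for yours:

# Row-count check via snowsql (placeholder connection values)
snowsql -a myaccount.us-east-1 -u myuser -d mydb -w compute_wh \
    -q "SELECT COUNT(*) AS row_count FROM raw.Sheet1;"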

Advanced Patterns

Once you've mastered the basics, use these patterns for production workloads.

Incremental Loading

Only copy new or updated records since the last sync. Perfect for daily updates.

ingestr ingest \
    --source-uri 'googlesheets:///path/to/creds.json@1a2b3c4d5e/orders' \
    --source-table 'orders' \
    --dest-uri 'snowflake://myuser:mypassword@myaccount.us-east-1/mydb/public?warehouse=compute_wh' \
    --dest-table 'raw.orders' \
    --incremental-strategy merge \
    --incremental-key updated_at \
    --primary-key order_id

How it works: The merge strategy updates existing rows and inserts new ones based on the primary key. Only rows whose updated_at value is newer than the last successful load are processed.
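
Conceptually, the end result is close to a Snowflake MERGE keyed on order_id. The sketch below is only an illustration: the staging table and the status column are hypothetical, and ingestr generates its own SQL internally.

-- Rough equivalent of a merge load, not the literal SQL ingestr runs
-- raw.orders_staging is a hypothetical staging table holding the new rows
MERGE INTO raw.orders AS t
USING raw.orders_staging AS s
    ON t.order_id = s.order_id
WHEN MATCHED THEN
    UPDATE SET t.status = s.status, t.updated_at = s.updated_at
WHEN NOT MATCHED THEN
    INSERT (order_id, status, updated_at)
    VALUES (s.order_id, s.status, s.updated_at);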

Common Use Cases

Ready-to-use commands for typical Google Sheets to Snowflake scenarios.

Daily Customer Data Sync

Keep your analytics warehouse updated with the latest customer information every night.

# Add this to your cron job or scheduler
ingestr ingest \
    --source-uri 'googlesheets:///path/to/creds.json@1a2b3c4d5e/customers' \
    --source-table 'customers' \
    --dest-uri 'snowflake://myuser:mypassword@myaccount.us-east-1/mydb/public?warehouse=compute_wh' \
    --dest-table 'analytics.customers' \
    --incremental-strategy merge \
    --incremental-key updated_at \
    --primary-key customer_id

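To actually run this nightly, one option is a crontab entry that calls a small wrapper script containing the command above; the script and log paths below are hypothetical:

# Crontab line: run the customer sync every night at 02:00
0 2 * * * /opt/ingestr/sync_customers.sh >> /var/log/ingestr_customers.log 2>&1
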
Historical Data Migration

One-time migration of all historical records to your data warehouse.

# One-time full table copy
ingestr ingest \
    --source-uri 'googlesheets:///path/to/creds.json@1a2b3c4d5e/transactions' \
    --source-table 'transactions' \
    --dest-uri 'snowflake://myuser:mypassword@myaccount.us-east-1/mydb/public?warehouse=compute_wh' \
    --dest-table 'warehouse.transactions_historical'

Development Environment Sync

Copy a sample of data to your development Snowflake instance for testing.

# Copy sample data to development
ingestr ingest \
    --source-uri 'googlesheets:///path/to/creds.json@1a2b3c4d5e/products' \
    --source-table 'products' \
    --dest-uri 'snowflake://myuser:mypassword@myaccount.us-east-1/mydb/public?warehouse=compute_wh' \
    --dest-table 'dev.products' \
    --limit 1000  # Only copy 1000 rows for testing

Troubleshooting Guide

Solutions to common issues when migrating from Google Sheets to Snowflake.

Connection refused or timeout errors

Check your connection details:

  • Verify the spreadsheet ID copied from the sheet URL
  • Verify the account identifier includes the region (e.g., xy12345.us-east-1)
  • Check that the warehouse is running and not suspended
  • Confirm network policies allow connections from your IP address

Authentication failures

Common authentication issues:

  • Share the sheet with the service account email
  • Enable the Google Sheets API in your GCP project
  • Check the service account permissions
  • Ensure the user has the USAGE privilege on the warehouse

Schema or data type mismatches

Handling data type differences:

  • ingestr automatically handles most type conversions
  • Google Sheets: All data is text by default
  • Google Sheets: Date formatting varies by locale
  • Google Sheets: Number formatting affects parsing
  • Google Sheets: Formula cells vs values
  • Snowflake: VARIANT type for semi-structured data
  • Snowflake: ARRAY and OBJECT types for complex structures
  • Snowflake: Automatic timezone conversion for TIMESTAMP_TZ

Performance issues with large tables

Optimize large data transfers:

  • Use incremental loading to process data in chunks
  • Run migrations during off-peak hours
  • Split very large tables by date ranges using interval parameters (see the sketch below)

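For the last point, backfills can be split into smaller runs. Assuming your ingestr version supports the --interval-start and --interval-end flags and the sheet has a created_at column (both assumptions), a month-by-month backfill might look like this:

# Backfill one month at a time instead of the whole history in a single run
# (placeholder dates and a hypothetical created_at column)
ingestr ingest \
    --source-uri 'googlesheets:///path/to/creds.json@1a2b3c4d5e/transactions' \
    --source-table 'transactions' \
    --dest-uri 'snowflake://myuser:mypassword@myaccount.us-east-1/mydb/public?warehouse=compute_wh' \
    --dest-table 'warehouse.transactions_historical' \
    --incremental-key created_at \
    --interval-start '2024-01-01' \
    --interval-end '2024-02-01'
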
Ready to scale your data pipeline?

You've learned how to migrate data from Google Sheets to Snowflake with ingestr. For production workloads with monitoring, scheduling, and data quality checks, explore Bruin Cloud.