5-minute tutorial
Migrate Lightspeed to GCP Dataproc Serverless in 60 Seconds
Learn how to copy your Lightspeed data to GCP Dataproc Serverless with a single command using ingestr - no code required.
Prerequisites
- Python 3.8 or higher installed
- Lightspeed API credentials via OAuth2 app registration
- Lightspeed Retail or Restaurant subscription
- GCP project
- Service account with Dataproc permissions
Step 1: Install ingestr
Install ingestr in seconds using pip. Choose the method that works best for you:
Recommended: Using uv (fastest)
# Install uv first if you haven't already
pip install uv
# Run ingestr using uvx
uvx ingestr
Alternative: Global installation
# Install globally using uv
uv pip install --system ingestr
# Or using standard pip
pip install ingestr
Verify installation: Run ingestr --version to confirm it's installed correctly.
Step 2: Your First Migration
Let's copy a table from Lightspeed to GCP Dataproc Serverless. This example shows a complete, working command you can adapt to your needs.
Set up your connections
Lightspeed connection format:
lightspeed://?client_id=<your-client-id>&client_secret=<your-client-secret>&refresh_token=<your-refresh-token>
Parameters:
- client_id: OAuth2 client ID from your Lightspeed app
- client_secret: OAuth2 client secret from your Lightspeed app
- refresh_token: OAuth2 refresh token obtained during authorization
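Credential values can contain characters that break a raw URI, so it's worth URL-encoding them when you assemble the connection string. A minimal Python sketch (the helper name and example values are my own, not part of ingestr):

```python
from urllib.parse import urlencode

def lightspeed_uri(client_id: str, client_secret: str, refresh_token: str) -> str:
    """Build a Lightspeed source URI, URL-encoding each credential."""
    query = urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
    })
    return f"lightspeed://?{query}"

# Placeholder credentials; note the "/" in the secret gets encoded as %2F
print(lightspeed_uri("cid_abc123", "cs_def/456", "rt_ghi789"))
# → lightspeed://?client_id=cid_abc123&client_secret=cs_def%2F456&refresh_token=rt_ghi789
```

The same encoding concern applies to any secret you paste into a URI by hand.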
GCP Dataproc Serverless connection format:
dataproc-serverless://project_id/region?credentials=/path/to/key.json
Parameters:
- project_id: your GCP project ID
- region: the GCP region where your Dataproc Serverless batches run
- credentials: path to a service account key file with Dataproc permissions
Run your first copy
Copy the entire sales table from Lightspeed to GCP Dataproc Serverless:
ingestr ingest \
--source-uri 'lightspeed://?client_id=cid_abc123&client_secret=cs_def456&refresh_token=rt_ghi789' \
--source-table 'sales' \
--dest-uri 'dataproc-serverless://project_id/region?credentials=/path/to/key.json' \
--dest-table 'raw.sales'
What this does:
- Connects to your Lightspeed account
- Reads all data from the specified table
- Creates the table in GCP Dataproc Serverless if needed
- Copies all rows to the destination
Command breakdown:
- --source-uri: your source connection
- --source-table: table to copy from
- --dest-uri: your destination connection
- --dest-table: where to write the data
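To keep credentials out of shell history, one option is to assemble the same command in a script and read the secrets from environment variables. A sketch (the environment variable names are my own convention, not ingestr's):

```python
import os
import subprocess

def build_ingest_command(source_table: str, dest_table: str) -> list[str]:
    """Assemble the ingestr argv list, pulling secrets from the environment."""
    source_uri = (
        "lightspeed://?client_id={cid}&client_secret={cs}&refresh_token={rt}"
    ).format(
        cid=os.environ["LS_CLIENT_ID"],
        cs=os.environ["LS_CLIENT_SECRET"],
        rt=os.environ["LS_REFRESH_TOKEN"],
    )
    dest_uri = "dataproc-serverless://{p}/{r}?credentials={k}".format(
        p=os.environ["GCP_PROJECT"],
        r=os.environ["GCP_REGION"],
        k=os.environ["GCP_KEY_PATH"],
    )
    return [
        "ingestr", "ingest",
        "--source-uri", source_uri,
        "--source-table", source_table,
        "--dest-uri", dest_uri,
        "--dest-table", dest_table,
    ]

# Uncomment to actually run the copy:
# subprocess.run(build_ingest_command("sales", "raw.sales"), check=True)
```

Passing the argv as a list also avoids shell quoting issues with special characters in the URIs.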
Step 3: Verify your data
After the migration completes, verify your data was copied correctly:
Check row count in GCP Dataproc Serverless:
-- Run this in GCP Dataproc Serverless
SELECT COUNT(*) as row_count
FROM raw.sales;
-- Check a sample of the data
SELECT *
FROM raw.sales
LIMIT 10;
Advanced Patterns
Once you've mastered the basics, use these patterns for production workloads.
Incremental Loading
Only copy new or updated records since the last sync. Perfect for daily updates.
ingestr ingest \
--source-uri 'lightspeed://?client_id=cid_abc123&client_secret=cs_def456&refresh_token=rt_ghi789' \
--source-table 'public.orders' \
--dest-uri 'dataproc-serverless://project_id/region?credentials=/path/to/key.json' \
--dest-table 'raw.orders' \
--incremental-strategy merge \
--incremental-key updated_at \
--primary-key order_id
How it works: The merge strategy updates existing rows and inserts new ones based on the primary key. Only rows where updated_at has changed will be processed.
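To make the merge semantics concrete, here is a small Python simulation of the upsert behavior, keyed on order_id. This is an illustration of the strategy, not ingestr's actual implementation:

```python
def merge(dest_rows: dict, source_rows: list, primary_key: str) -> dict:
    """Upsert: update rows that share a primary key, insert the rest."""
    merged = dict(dest_rows)
    for row in source_rows:
        merged[row[primary_key]] = row  # overwrite existing key or insert new
    return merged

dest = {1: {"order_id": 1, "total": 10, "updated_at": "2024-01-01"}}
incoming = [
    {"order_id": 1, "total": 12, "updated_at": "2024-01-02"},  # changed row
    {"order_id": 2, "total": 7, "updated_at": "2024-01-02"},   # new row
]
result = merge(dest, incoming, "order_id")
print(len(result))  # → 2 (one row updated in place, one inserted)
```

Rows absent from the incoming batch are left untouched, which is why merge is safe to run repeatedly.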
Common Use Cases
Ready-to-use commands for typical Lightspeed to GCP Dataproc Serverless scenarios.
Daily Customer Data Sync
Keep your analytics warehouse updated with the latest customer information every night.
# Add this to your cron job or scheduler
ingestr ingest \
--source-uri 'lightspeed://?client_id=cid_abc123&client_secret=cs_def456&refresh_token=rt_ghi789' \
--source-table 'public.customers' \
--dest-uri 'dataproc-serverless://project_id/region?credentials=/path/to/key.json' \
--dest-table 'analytics.customers' \
--incremental-strategy merge \
--incremental-key updated_at \
--primary-key customer_id
Historical Data Migration
One-time migration of all historical records to your data warehouse.
# One-time full table copy
ingestr ingest \
--source-uri 'lightspeed://?client_id=cid_abc123&client_secret=cs_def456&refresh_token=rt_ghi789' \
--source-table 'public.transactions' \
--dest-uri 'dataproc-serverless://project_id/region?credentials=/path/to/key.json' \
--dest-table 'warehouse.transactions_historical'
Development Environment Sync
Copy production data to your development GCP Dataproc Serverless instance (with sensitive data excluded).
# Copy sample data to development
ingestr ingest \
--source-uri 'lightspeed://?client_id=cid_abc123&client_secret=cs_def456&refresh_token=rt_ghi789' \
--source-table 'public.products' \
--dest-uri 'dataproc-serverless://project_id/region?credentials=/path/to/key.json' \
--dest-table 'dev.products' \
--limit 1000 # Only copy 1000 rows for testing
Troubleshooting Guide
Solutions to common issues when migrating from Lightspeed to GCP Dataproc Serverless.
Connection refused or timeout errors
Check your connection details: verify the client_id, client_secret, and refresh_token in your source URI, and the project_id, region, and key file path in your destination URI.
Authentication failures
Common authentication issues: an expired or revoked refresh_token (regenerate it from your Lightspeed OAuth2 app), or a GCP service account key that is invalid or missing Dataproc permissions.
Schema or data type mismatches
Handling data type differences:
- ingestr automatically handles most type conversions
Performance issues with large tables
Optimize large data transfers:
- Use incremental loading to process data in chunks
- Run migrations during off-peak hours
- Split very large tables by date ranges using interval parameters
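The last tip, splitting a large table by date range, is easy to script: generate the boundaries, then run one copy per slice. This sketch only prints the commands; the --interval-start/--interval-end flags are assumed here, so confirm them against ingestr --help for your version:

```python
from datetime import date

def month_starts(start: date, end: date) -> list[date]:
    """First-of-month boundaries covering [start, end), plus the end date."""
    out, y, m = [], start.year, start.month
    while date(y, m, 1) < end:
        out.append(date(y, m, 1))
        m += 1
        if m > 12:
            y, m = y + 1, 1
    out.append(end)
    return out

bounds = month_starts(date(2024, 1, 1), date(2024, 4, 1))
for lo, hi in zip(bounds, bounds[1:]):
    # One ingestr run per monthly slice (interval flags assumed; verify first)
    print(f"ingestr ingest ... --interval-start {lo} --interval-end {hi}")
```

Running the slices sequentially keeps memory bounded and lets you resume from the last completed month if a run fails.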
Ready to scale your data pipeline?
You've learned how to migrate data from Lightspeed to GCP Dataproc Serverless with ingestr. For production workloads with monitoring, scheduling, and data quality checks, explore Bruin Cloud.