
Google Ads + Bruin


Ingest Google Ads data into your warehouse with incremental loading, quality checks, and full lineage. Defined in YAML, version-controlled in Git.

For business teams

What you get

  • Cross-channel ad reporting

    See Google Ads spend alongside Facebook and every other channel, in one place, updated automatically.

  • True ROAS, not estimated

    Join Google Ads spend with actual revenue from Stripe or your CRM. Know your real return on ad spend, not what the ad platform tells you.

  • No more manual exports

    Stop downloading CSVs from Google Ads. Stakeholders get fresh data every morning without asking anyone.

  • Catch budget anomalies early

    Quality checks flag unexpected spend spikes or zero-impression campaigns before they burn budget.

For data & engineering teams

How it works

  • Incremental sync with lookback

    Bruin handles Google Ads attribution windows automatically. Set lookback days in the connection URI; no custom logic needed.

  • YAML-defined, Git-versioned

    Your Google Ads pipeline is a YAML file. Review in PRs, deploy with CI/CD, roll back with git revert.

  • Column-level quality checks

    Validate spend, impressions, and clicks with not_null, unique, and custom SQL checks. Pipeline stops on failure.

  • Multi-destination support

    Land Google Ads data in BigQuery, Snowflake, Redshift, or DuckDB. Switch destinations by changing one line.
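The lookback setting mentioned above lives in the connection URI from Step 1. As a sketch, assuming a hypothetical lookback_days query parameter (check the Bruin connection docs for the exact name in your version):

# lookback_days is a hypothetical parameter name; verify against the Bruin docs
uri: "google-ads://?client_id=<client_id>&client_secret=<client_secret>&refresh_token=<refresh_token>&customer_id=<customer_id>&developer_token=<developer_token>&lookback_days=30"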

Before you start

Google Ads API access
Developer token
OAuth2 setup

Step 1

Add your Google Ads connection

Connect using Google Ads API credentials with OAuth2. Add this to your Bruin environment file; credentials are stored securely and referenced by name in your pipeline YAML.

Parameters

  • client_id: Google Ads API client ID
  • client_secret: Google Ads API client secret
  • refresh_token: OAuth2 refresh token
  • customer_id: Google Ads customer ID (without dashes)
  • developer_token: Google Ads API developer token
connections:
  googleads:
    type: googleads
    uri: "google-ads://?client_id=<client_id>&client_secret=<client_secret>&refresh_token=<refresh_token>&customer_id=<customer_id>&developer_token=<developer_token>"

Step 2

Create your pipeline

Define a YAML asset that tells Bruin what to pull from Google Ads and where to land it. This file lives in your Git repo, reviewable, version-controlled, and deployable with CI/CD.

Available tables

campaigns, ad_groups, ads, keywords, performance_reports
name: raw.googleads_campaigns
type: ingestr

parameters:
  source_connection: googleads
  source_table: 'campaigns'
  destination: bigquery
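Switching tables or destinations is a one-line change against this file. For example, a hypothetical second asset pulling ad groups into Snowflake instead:

name: raw.googleads_ad_groups
type: ingestr

parameters:
  source_connection: googleads
  source_table: 'ad_groups'
  destination: snowflake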

Step 3

Add quality checks

Add column-level and custom SQL checks to your Google Ads data. If a check fails, the pipeline stops; bad data never reaches downstream models or dashboards.

Catch negative ad spend before it reaches reports
Validate impressions >= clicks on every sync
Flag campaigns with missing IDs or null spend
columns:
  - name: campaign_id
    checks:
      - name: not_null
  - name: spend
    checks:
      - name: not_null
  - name: impressions
    checks:
      - name: not_null

custom_checks:
  - name: no negative ad spend
    query: |
      SELECT COUNT(*) = 0
      FROM raw.googleads_campaigns
      WHERE spend < 0
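The impressions-versus-clicks rule mentioned above follows the same COUNT(*) = 0 pattern. A sketch, with column names assumed to match the checks above:

custom_checks:
  - name: impressions never below clicks
    query: |
      SELECT COUNT(*) = 0
      FROM raw.googleads_campaigns
      WHERE impressions < clicks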

Step 4

Run it

One command. Bruin connects to Google Ads, pulls data incrementally, runs your quality checks, and lands clean data in your warehouse. If a check fails, the pipeline stops; bad data never reaches downstream.

Backfill historical data with --start-date
Schedule with cron or trigger from CI/CD
Full lineage from Google Ads to your dashboards
$ bruin run .
Running pipeline...

  googleads_campaigns
    ✓ Fetched 2,847 new records
    ✓ Quality: campaign_id not_null     PASSED
    ✓ Quality: spend not_null           PASSED
    ✓ Quality: no negative ad spend     PASSED
    ✓ Loaded into bigquery

  Completed in 12s
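A backfill or scheduled run using the options above might look like this (the date, time, and path are illustrative):

$ bruin run --start-date 2024-01-01 .

# crontab entry: run the pipeline every morning at 06:00 (path is illustrative)
0 6 * * * cd /path/to/pipeline && bruin run .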


Ready to connect Google Ads?

Start for free, or book a demo to see how Bruin handles ingestion, quality, lineage, and scheduling for your entire data stack.