PostHog + Bruin
Ingest PostHog data into your warehouse with incremental loading, quality checks, and full lineage. Defined in YAML, version-controlled in Git.
For business teams
What you get
Analysis beyond built-in reports
Join PostHog behavioral data with revenue, support, and CRM data. Answer questions PostHog alone can't.
Trusted behavioral data
Quality checks catch tracking gaps, duplicate events, and missing timestamps before they corrupt your models.
Self-serve for analysts
PostHog data lands in your warehouse where analysts already work. No more exporting, no more waiting.
Real user journeys
Combine PostHog events with purchase and support data to see the full customer journey, not just the product funnel.
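Once both datasets land in the same warehouse, that journey becomes a plain SQL query. A minimal sketch of the kind of join described above, assuming an orders table (here billing.orders) already exists alongside the ingested events; table and column names are illustrative:
-- Illustrative only: ties PostHog signup events to first purchases.
-- raw.posthog_events comes from the pipeline below; billing.orders and
-- the column names are assumptions, not part of this integration.
SELECT
  e.distinct_id,
  MIN(e.event_timestamp) AS signed_up_at,
  MIN(o.created_at)      AS first_purchase_at
FROM raw.posthog_events AS e
JOIN billing.orders AS o
  ON o.customer_id = e.distinct_id
WHERE e.event = 'user signed up'
GROUP BY e.distinct_id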
For data & engineering teams
How it works
Event schema validation
Check for null event IDs, missing timestamps, and duplicate events on every sync. Catch tracking issues at ingestion.
YAML-defined, Git-versioned
Your PostHog pipeline is a YAML file. Review in PRs, deploy with CI/CD, roll back with git revert.
SQL + Python transforms
Transform raw PostHog events into funnels, cohorts, and user journeys with SQL or Python — in the same pipeline.
Dependency-aware scheduling
Bruin resolves pipeline dependencies automatically. Transforms only run after PostHog data has landed.
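As a sketch of how the last two points fit together: a SQL transform asset declares its upstream dependency in its Bruin header, so it runs only after the ingestion asset has landed. The asset name, query, and the bq.sql type (which assumes a BigQuery destination) are illustrative:
/* @bruin
name: analytics.posthog_daily_active_users
type: bq.sql
depends:
  - raw.posthog_events
@bruin */

-- Daily active users from raw PostHog events (illustrative column names)
SELECT
  DATE(event_timestamp) AS activity_date,
  COUNT(DISTINCT distinct_id) AS daily_active_users
FROM raw.posthog_events
GROUP BY 1
Python assets can sit in the same pipeline and declare their dependencies in a similar header block.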
Before you start
Step 1
Add your PostHog connection
Connect using a PostHog personal API key. Add this to your Bruin environment file — credentials are stored securely and referenced by name in your pipeline YAML.
Parameters
api_key: PostHog personal API key
host: PostHog instance URL (cloud or self-hosted)
project_id: PostHog project identifier
connections:
  posthog:
    type: posthog
    uri: "posthog://api_key@host/project_id"
Step 2
Create your pipeline
Define a YAML asset that tells Bruin what to pull from PostHog and where to land it. This file lives in your Git repo — reviewable, version-controlled, and deployable with CI/CD.
name: raw.posthog_events
type: ingestr
parameters:
  source_connection: posthog
  source_table: 'events'
  destination: bigquery
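Incremental loading can be made explicit on the same asset. A minimal sketch, assuming your Bruin version's ingestr asset type accepts ingestr-style incremental options; the strategy and key names below are illustrative, so check the ingestr asset reference for your version:
name: raw.posthog_events
type: ingestr
parameters:
  source_connection: posthog
  source_table: 'events'
  destination: bigquery
  # Assumed ingestr-style options; exact parameter names may differ
  # between Bruin versions.
  incremental_strategy: merge
  incremental_key: event_timestamp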
Step 3
Add quality checks
Add column-level and custom SQL checks to your PostHog data. If a check fails, the pipeline stops — bad data never reaches downstream models or dashboards.
columns:
  - name: event_id
    checks:
      - name: not_null
      - name: unique
  - name: event_timestamp
    checks:
      - name: not_null
custom_checks:
  - name: data is fresh
    query: |
      SELECT MAX(event_timestamp) >
        TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 24 HOUR)
      FROM raw.posthog_events
Step 4
Run it
One command. Bruin connects to PostHog, pulls data incrementally, runs your quality checks, and lands clean data in your warehouse. If a check fails, the pipeline stops — bad data never reaches downstream.
$ bruin run .
Running pipeline...
posthog_events
✓ Fetched 2,847 new records
✓ Quality: event_id not_null PASSED
✓ Quality: event_id unique PASSED
✓ Quality: event_timestamp not_null PASSED
✓ Quality: data is fresh PASSED
✓ Loaded into bigquery
Completed in 12s
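Incremental runs pick up where the last sync left off. To backfill a specific window instead, pass explicit date bounds to the run; the dates below are placeholders:
$ bruin run --start-date 2024-01-01 --end-date 2024-01-31 .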
Ready to connect PostHog?
Start for free, or book a demo to see how Bruin handles ingestion, quality, lineage, and scheduling for your entire data stack.