Power BI + Bruin
Ingest Power BI data into your warehouse with incremental loading, quality checks, and full lineage. Defined in YAML, version-controlled in Git.
For business teams
What you get
Dashboards powered by clean data
Feed Power BI from Bruin pipelines with built-in quality checks. No more stale or incorrect dashboards.
End-to-end lineage to Power BI
Trace data from raw sources through every transformation to Power BI. Find the root cause of wrong numbers in seconds.
Scheduled data refresh
Bruin pipelines refresh Power BI data on a cron. Dependencies resolve automatically — Power BI only sees complete data.
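The schedule lives in the pipeline definition itself. A minimal sketch of a `pipeline.yml`, assuming a daily cron; the pipeline name and connection name are illustrative:

```yaml
# pipeline.yml — sketch; names and the connection mapping are assumptions
name: powerbi-refresh
schedule: "0 6 * * *"   # refresh every day at 06:00 UTC
default_connections:
  google_cloud_platform: my-bigquery-conn
```

Bruin resolves asset dependencies within the scheduled run, so downstream models wait for their upstreams before refreshing.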
Multi-source, one destination
Combine 100+ sources into clean models that power Power BI. Bruin handles the pipeline, Power BI handles the visuals.
For data & engineering teams
How it works
Dependency-aware scheduling
Bruin ensures upstream transforms complete before Power BI gets new data. No more stale or partial dashboards.
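Dependencies are declared on each asset, and Bruin builds the execution graph from those declarations. A hedged sketch of an asset definition with upstream dependencies (asset names are illustrative):

```yaml
# sketch of a downstream asset definition — names are illustrative
name: marts.sales_summary
depends:
  - raw.powerbi_datasets
  - raw.orders
```

Here `marts.sales_summary` only runs after both upstream assets have completed successfully.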
YAML-defined, Git-versioned
Every pipeline feeding Power BI is a YAML file. Review in PRs, deploy with CI/CD, roll back with git revert.
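In practice this means CI can validate and deploy the pipeline straight from the repo. A minimal GitHub Actions sketch, assuming the Bruin CLI is available on the runner; the workflow name and install command are assumptions, not an official recipe:

```yaml
# .github/workflows/pipeline.yml — illustrative sketch
name: run-powerbi-pipeline
on:
  push:
    branches: [main]
jobs:
  run:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install Bruin CLI
        # install method is an assumption; follow the Bruin install docs
        run: curl -LsSf https://getbruin.com/install/cli | sh
      - name: Validate pipeline
        run: bruin validate .
      - name: Run pipeline
        run: bruin run .
```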
Quality gates before visualization
Quality checks run before data reaches Power BI. If checks fail, Power BI keeps showing the last known good data.
End-to-end lineage
Trace data from raw sources through transforms to Power BI dashboards. Find root causes in seconds, not hours.
Before you start
Step 1
Add your Power BI connection
Connect using an Azure AD service principal. Add this to your Bruin environment file — credentials are stored securely and referenced by name in your pipeline YAML.
Parameters
client_id: Azure AD application client ID
client_secret: Azure AD application client secret
tenant_id: Azure AD tenant identifier
connections:
  powerbi:
    type: powerbi
    uri: "powerbi://client_id:client_secret@tenant_id"
Step 2
Create your pipeline
Define a YAML asset that tells Bruin what to pull from Power BI and where to land it. This file lives in your Git repo — reviewable, version-controlled, and deployable with CI/CD.
name: raw.powerbi_datasets
type: ingestr
parameters:
  source_connection: powerbi
  source_table: 'datasets'
  destination: bigquery
Step 3
Add quality checks
Add column-level and custom SQL checks to your Power BI data. If a check fails, the pipeline stops — bad data never reaches downstream models or dashboards.
columns:
  - name: id
    checks:
      - name: not_null
      - name: unique
custom_checks:
  - name: data is fresh
    query: |
      SELECT MAX(updated_at) >
        TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 24 HOUR)
      FROM raw.powerbi_datasets
Step 4
Run it
One command. Bruin connects to Power BI, pulls data incrementally, runs your quality checks, and lands clean data in your warehouse. If a check fails, the pipeline stops — bad data never reaches downstream.
$ bruin run .
Running pipeline...
powerbi_datasets
✓ Fetched 2,847 new records
✓ Quality: id not_null PASSED
✓ Quality: id unique PASSED
✓ Quality: data is fresh PASSED
✓ Loaded into bigquery
Completed in 12s
Ready to connect Power BI?
Start for free, or book a demo to see how Bruin handles ingestion, quality, lineage, and scheduling for your entire data stack.