Bruin Academy

Tutorial module

dbt + Bruin AI Data Analyst

Layer Bruin context on top of an existing dbt + warehouse setup so an AI agent can navigate your models, understand your columns, and run real SQL — without touching the dbt project itself.

Bruin context layer over an existing dbt pipeline

What

  • Add a self-contained Bruin "context layer" alongside an existing dbt project
  • Generate one asset YAML per materialized table — staging, marts, and raw — straight from the warehouse
  • Give an AI agent the ability to navigate your dbt outputs and run SQL with confidence

How

  • bruin import database introspects the warehouse and creates skeleton asset files
  • bruin ai enhance fills in descriptions, tags, and quality checks per table and column
  • bruin query + Bruin MCP let agents read context and execute SQL using your existing credentials
  • Reference implementation: contoso-dbt
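
The three steps above can be sketched as a single command flow. Treat this as a sketch, not a verbatim recipe: the connection name, output path, and exact flags are illustrative and may differ across Bruin CLI versions — check bruin --help for the flags your version accepts.

```shell
# Introspect the warehouse and generate one skeleton asset YAML per
# materialized table (connection name and path are illustrative)
bruin import database --connection warehouse-conn ./bruin-context

# Fill in descriptions, tags, and quality checks per table and column
bruin ai enhance ./bruin-context

# Sanity-check: run real SQL through the same credentials an agent would use
bruin query --connection warehouse-conn --query "select count(*) from analytics.orders"
```

The dbt project itself is never modified; everything lands in the separate bruin-context directory.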

The dbt half of the stack is treated as a prerequisite — this module is about the context layer, the piece that turns an already-running dbt + warehouse setup into something an AI agent can use without hallucinating.
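
To make "one asset YAML per materialized table" concrete, here is a hedged sketch of what an imported-then-enhanced asset file could look like. The exact Bruin asset schema may differ by version, and the table name, path, tags, and checks shown are invented for illustration:

```yaml
# bruin-context/assets/analytics.orders.asset.yml  (illustrative path)
name: analytics.orders
description: One row per customer order, materialized by dbt.  # filled in by `bruin ai enhance`
tags:
  - mart
columns:
  - name: order_id
    type: integer
    description: Primary key of the order.
    checks:
      - name: not_null
      - name: unique
  - name: order_total
    type: float
    description: Order value in the account currency.
    checks:
      - name: non_negative
```

Skeleton files from bruin import database contain the names and types; the descriptions, tags, and checks are what the AI-enhance step layers on top.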

Before you start

  • A working dbt project with models materialized in a warehouse (BigQuery, Redshift, ClickHouse, or Postgres)
  • Bruin CLI installed (curl -LsSf https://getbruin.com/install/cli | sh)
  • An AI coding tool installed (Claude Code, Cursor, or Codex)
  • Read access to your warehouse (BigQuery Data Viewer, Postgres/Redshift SELECT, etc.)

Lessons

  1. Start From an Existing dbt Pipeline (4 min)
  2. Create an Isolated Bruin Context (6 min)
  3. Import Your dbt Schemas as Assets (5 min)
  4. AI-Enhance and Validate the Context (6 min)
  5. Wire Up the AI Agent (7 min)
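
The final lesson boils down to pointing your AI tool at Bruin over MCP. As a sketch only — the exact server command and config keys depend on your tool and Bruin version, so check both sets of docs — a Claude Code / Cursor-style MCP entry could look like:

```json
{
  "mcpServers": {
    "bruin": {
      "command": "bruin",
      "args": ["mcp"]
    }
  }
}
```

Once registered, the agent can read the asset context and run queries through your existing warehouse credentials rather than guessing at schemas.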

Get help & contribute