Create an Isolated Bruin Context
Add a self-contained context/ directory with its own .bruin.yml and pipeline.yml — so the documentation layer for your dbt project never collides with other Bruin pipelines in the same repo.

What you'll do
Inside your dbt repo, create a context/ directory with its own .bruin.yml and pipeline.yml. This is the workspace Bruin will use to describe your warehouse — kept fully isolated from any other Bruin pipelines or configs that might already live in the repo.
Why this step matters
Bruin will happily share a .bruin.yml across an entire repo, but for a documentation-only context layer that's usually the wrong default. A broken connection in a sibling pipeline will cause every bruin command to error, even when you're just trying to import schemas. Scoping the config to a sub-directory keeps the dbt context layer self-contained, easy to delete, and easy to regenerate.
The context layer is its own pipeline, even though it never runs transforms. It's documentation that happens to live in pipeline.yml form so Bruin can validate it.
Instructions
Run all of these from the root of your dbt project (the directory containing dbt_project.yml).
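Before creating anything, it can help to confirm you're in the right place — a quick sketch, assuming the standard dbt layout where `dbt_project.yml` sits at the project root:

```shell
# Sanity check: are we at the dbt project root?
# Assumption: standard dbt layout with dbt_project.yml at the top level.
if [ -f dbt_project.yml ]; then
  msg="dbt_project.yml found — you are at the dbt project root"
else
  msg="dbt_project.yml not found — cd into your dbt project first"
fi
echo "$msg"
```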
1. Create the context directory
```shell
mkdir -p context/assets
```
You'll end up with context/.bruin.yml, context/pipeline.yml, and a populated context/assets/ after the next step.
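Steps 2 and 3 below fill these files in; as a sketch, you can scaffold the empty layout first so the final shape is visible up front:

```shell
# Scaffold the isolated context layer.
# The two YAML files are written in steps 2 and 3; assets/ is populated later.
mkdir -p context/assets
touch context/.bruin.yml context/pipeline.yml
ls context
```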
2. Write a scoped .bruin.yml
Drop this into context/.bruin.yml. Replace the project ID with your own warehouse's:
```yaml
# context/.bruin.yml
default_environment: default
environments:
  default:
    connections:
      google_cloud_platform:
        - name: contoso_dbt_bq
          project_id: bruin-playground-arsalan
          location: EU
          use_application_default_credentials: true
```
This connection uses Application Default Credentials — the same gcloud auth application-default login session you already use for dbt. No service account keyfile to rotate, no secret to gitignore, and the AI agent inherits your identity at query time.
Gotcha — wrong field name. The field is `use_application_default_credentials`, not `use_default_credentials`. The latter is silently ignored and Bruin will look for a keyfile that isn't there.
Gotcha — sibling configs. `bruin` walks up the directory tree looking for `.bruin.yml`. If a parent directory has one with a broken connection, every command in this scope will error. Always pass `--config-file context/.bruin.yml` so Bruin loads only this file.
For Redshift, ClickHouse, or Postgres, swap the connection block. Examples:
```yaml
# Postgres
connections:
  postgres:
    - name: contoso_dbt_pg
      host: db.example.internal
      port: 5432
      username: analyst_ro
      password: ${POSTGRES_PASSWORD}
      database: contoso
      ssl_mode: require
```

```yaml
# Redshift
connections:
  redshift:
    - name: contoso_dbt_rs
      host: contoso.abcd1234.eu-west-1.redshift.amazonaws.com
      port: 5439
      username: analyst_ro
      password: ${REDSHIFT_PASSWORD}
      database: contoso
```

```yaml
# ClickHouse
connections:
  clickhouse:
    - name: contoso_dbt_ch
      host: contoso.eu-central-1.aws.clickhouse.cloud
      port: 9440
      username: analyst_ro
      password: ${CLICKHOUSE_PASSWORD}
      database: default
      secure: true
```
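The `${POSTGRES_PASSWORD}`-style placeholders above are resolved from the environment, so the secret has to be exported before any bruin command runs. A minimal sketch, with a placeholder value:

```shell
# Assumption: Bruin expands ${VAR} placeholders in .bruin.yml from the
# environment, as the ${...} syntax above suggests. Export the secret
# before running bruin — never hardcode it in the YAML or commit it.
export POSTGRES_PASSWORD='example-only-password'   # placeholder value
# Child processes (like bruin) inherit the exported variable:
sh -c 'test -n "$POSTGRES_PASSWORD" && echo "POSTGRES_PASSWORD is set"'
```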
3. Write a pipeline.yml
Drop this into context/pipeline.yml:
```yaml
# context/pipeline.yml
name: contoso_dbt_context
schedule: daily
start_date: "2016-01-01"
default_connections:
  google_cloud_platform: "contoso_dbt_bq"
```
The `default_connections` block makes the connection name implicit for every asset Bruin generates in the next step — you won't have to repeat `connection: contoso_dbt_bq` in 40 separate YAMLs. For non-BigQuery warehouses, use the matching key (`postgres`, `redshift`, `clickhouse`).
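For illustration, a generated asset might look like the sketch below — the file name, schema, and table are hypothetical, and the point is what's absent: no `connection:` key, because `default_connections` supplies it.

```yaml
# context/assets/fct_orders.asset.yml (hypothetical example)
name: analytics.fct_orders
type: bq.sql   # BigQuery asset type; other warehouses use their own type keys
description: One row per order, as materialized by dbt.
# No `connection:` key — pipeline.yml's default_connections fills it in.
```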
The pipeline never runs anything — but Bruin still treats it as a pipeline, which gives you `bruin validate`, lineage, and docs generation for free.
4. Test the connection
Confirm Bruin can reach the warehouse before going further:
```shell
bruin connections ping --config-file context/.bruin.yml contoso_dbt_bq
```
Expected output:
```
Successfully connected to 'contoso_dbt_bq'.
```
If you see an authentication error, run `gcloud auth application-default login` (BigQuery) or check your env vars (Postgres / Redshift / ClickHouse) and try again.
Gotcha — `bruin connections test` loads everything. It walks every connection in scope, so a broken sibling connection breaks the test. Always scope with `--config-file` when working in a sub-pipeline.
What just happened
You now have a self-contained Bruin pipeline at context/ — config, connection, and pipeline metadata, isolated from anything else in the repo. The next step uses this scoped config to introspect your warehouse and turn every dbt-materialized table into a Bruin asset YAML.