Step 2
Beginner
5 min

Connect Your Data

Add your data warehouse as a connection so Bruin can reach your tables and the AI agent can query them.

Bruin CLI · BigQuery · Redshift · ClickHouse · Postgres
Learning paths: Data Analyst, Data Engineer

Why this step matters

A Bruin project on its own doesn't know where your data lives. Connections are what bridge the gap — they tell Bruin (and by extension, your AI agent) how to reach your warehouse, with what credentials, and under what constraints.

Once a connection is configured, every downstream step — importing metadata, running quality checks, querying data — uses it automatically. You set it up once and everything else just works.

What you'll do

Add a connection to your data warehouse using the Bruin CLI, then verify it works with a quick test command.

Choose your warehouse

Pick the tab that matches your setup. The overall flow is the same: add the connection, give it a name, and test it.

BigQuery

This tutorial uses Application Default Credentials (your personal Google account). If your team uses a service account JSON key file instead, see the BigQuery platform docs.

1. Authenticate with Google Cloud

gcloud auth application-default login

This opens a browser for Google sign-in. Once you complete it, gcloud saves a credential file locally, and Bruin picks it up automatically.
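If you want to confirm the credential file actually landed, you can check for it directly. This is a sketch assuming the default Linux/macOS location — on Windows gcloud uses %APPDATA%\gcloud instead, and the CLOUDSDK_CONFIG environment variable overrides the default directory on any platform:

```shell
# Default location where gcloud writes Application Default Credentials
# (Linux/macOS; CLOUDSDK_CONFIG overrides the config directory).
adc_file() {
  echo "${CLOUDSDK_CONFIG:-$HOME/.config/gcloud}/application_default_credentials.json"
}

if [ -f "$(adc_file)" ]; then
  echo "ADC credentials present: $(adc_file)"
else
  echo "No ADC credentials yet; run: gcloud auth application-default login"
fi
```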

2. Add the connection

bruin connections add

The interactive wizard will walk you through selecting the connection type (google_cloud_platform), giving it a name (use gcp-default to match this tutorial), and entering your GCP project ID. You can find the project ID in the Cloud Console or by running gcloud config get-value project.
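When the wizard completes, it writes the connection into your project's .bruin.yml. The entry looks roughly like this — an illustrative sketch, where my-gcp-project is a placeholder for your own project ID and the exact fields may vary by Bruin version:

```yaml
environments:
  default:
    connections:
      google_cloud_platform:
        - name: gcp-default
          project_id: my-gcp-project
```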

3. Test the connection

bruin connections test --name gcp-default

If the test fails, see Troubleshooting below.

Troubleshooting

If bruin connections test fails, here are the most common causes:

BigQuery

  • "Could not find default credentials" — Run gcloud auth application-default login again. Tokens expire after a period of inactivity.
  • "Project not found" — The project ID doesn't match an existing GCP project. Verify it in the Cloud Console.
  • "BigQuery API has not been enabled" — Enable it in APIs & Services.
  • "Access denied" — Your account needs at least BigQuery Data Viewer and BigQuery Job User roles. Ask your GCP admin.

Redshift

  • Connection timeout — Your cluster's VPC security group must allow inbound traffic on port 5439 from your IP.
  • Authentication failure — Verify the username/password and that the user has SELECT permission on target schemas.
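If you hit the timeout above, a raw TCP check can tell you whether the port is reachable at all before you debug credentials. This is a sketch using bash's /dev/tcp redirection — the hostname below is a placeholder, so substitute your cluster endpoint; the same check works for ClickHouse (9440) or Postgres (5432):

```shell
# Attempt a TCP connection; succeeds only if the host accepts on that port
# within 5 seconds. A hang-then-failure usually means a firewall or
# security group is dropping your traffic.
check_port() {
  timeout 5 bash -c ">/dev/tcp/$1/$2" 2>/dev/null
}

# Placeholder endpoint — replace with your actual cluster hostname.
if check_port example-cluster.abc123.us-east-1.redshift.amazonaws.com 5439; then
  echo "port 5439 reachable"
else
  echo "port 5439 blocked or host unreachable"
fi
```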

ClickHouse

  • Connection refused — For ClickHouse Cloud, use port 9440 (TLS). For self-hosted, check that the port is open and accessible.
  • Authentication failure — Verify the username has SELECT grants on the target database.

Postgres

  • Connection refused — Check that your database server allows connections from your IP (firewall / allowlist).
  • SSL errors — For hosted services (Supabase, Neon), Bruin handles SSL automatically in most cases. If it doesn't, check the Postgres platform docs.

What just happened

Your .bruin.yml file now contains the connection entry, and credentials stay local to your machine — nothing is sent anywhere else. From here on, any Bruin command (import, query, run) can reach your warehouse using the connection name you just configured.