ISOC Pulse + Bruin
Ingest ISOC Pulse data into your warehouse with incremental loading, quality checks, and full lineage. Defined in YAML, version-controlled in Git.
For business teams
What you get
API data, on schedule
ISOC Pulse data lands in your warehouse automatically. No scripts to maintain, no pagination to handle.
Only fetch what changed
Incremental sync means no re-processing. Bruin tracks watermarks so you only get new and updated records.
Catch API changes early
Quality checks validate response data on every sync. Schema changes or missing fields get caught before they break models.
Transform in the same pipeline
Reshape ISOC Pulse API data with SQL or Python. Compute metrics, normalize schemas, and build models — all version-controlled.
For data & engineering teams
How it works
Managed pagination & retries
Bruin handles ISOC Pulse API pagination, rate limiting, and retries. You define the source — Bruin does the rest.
YAML-defined, Git-versioned
Your ISOC Pulse pipeline is a YAML file. Review in PRs, deploy with CI/CD, roll back with git revert.
Incremental with watermarks
Bruin tracks cursor positions and watermarks. Only new and updated ISOC Pulse records get fetched on each run.
Schema validation on responses
Quality checks validate ISOC Pulse API response structure on every sync. Catch breaking API changes early.
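Incremental behavior is configured on the asset itself. A minimal sketch, assuming Bruin's ingestr assets accept `incremental_strategy` and `incremental_key` parameters (the parameter names here are illustrative; verify them against the Bruin and ingestr documentation for your version):

name: raw.isocpulse_internet_health
type: ingestr
parameters:
  source_connection: isocpulse
  source_table: 'internet_health'
  destination: bigquery
  # Assumed incremental settings -- check the docs for exact names:
  incremental_strategy: merge     # upsert changed rows instead of full reload
  incremental_key: fetched_at     # watermark column tracked between runs

With a merge strategy, each run fetches only rows past the stored watermark and upserts them, so re-runs are cheap and idempotent.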
Before you start
Step 1
Add your ISOC Pulse connection
The ISOC Pulse API is public, so no credentials are needed. Add this connection to your Bruin environment file; it is referenced by name in your pipeline YAML.
connections:
  isocpulse:
    type: isocpulse
    uri: "isoc-pulse://"
Step 2
Create your pipeline
Define a YAML asset that tells Bruin what to pull from ISOC Pulse and where to land it. This file lives in your Git repo — reviewable, version-controlled, and deployable with CI/CD.
name: raw.isocpulse_internet_health
type: ingestr
parameters:
  source_connection: isocpulse
  source_table: 'internet_health'
  destination: bigquery
Step 3
Add quality checks
Add column-level and custom SQL checks to your ISOC Pulse data. If a check fails, the pipeline stops — bad data never reaches downstream models or dashboards.
columns:
  - name: id
    checks:
      - name: not_null
      - name: unique
  - name: fetched_at
    checks:
      - name: not_null
custom_checks:
  - name: API data is fresh
    query: |
      SELECT MAX(fetched_at) >
        TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 24 HOUR)
      FROM raw.isocpulse_internet_health
Step 4
Run it
One command. Bruin connects to ISOC Pulse, pulls data incrementally, runs your quality checks, and lands clean data in your warehouse.
$ bruin run .
Running pipeline...
isocpulse_internet_health
✓ Fetched 2,847 new records
✓ Quality: id not_null PASSED
✓ Quality: id unique PASSED
✓ Quality: API data is fresh PASSED
✓ Loaded into bigquery
Completed in 12s
Ready to connect ISOC Pulse?
Start for free, or book a demo to see how Bruin handles ingestion, quality, lineage, and scheduling for your entire data stack.