Copy data
from Postgres to DuckDB
Ingest data from Postgres into DuckDB with no code required. Extend with custom code when needed.
Database
What is Postgres?
Postgres, also known as PostgreSQL, is a powerful, open-source relational database system with a strong reputation for reliability, feature robustness, and performance.
- Open Source
- Postgres is an open-source database, providing a cost-effective and customizable solution for data management.
- ACID Compliance
- Ensures atomicity, consistency, isolation, and durability for reliable transaction processing.
- Extensibility
- Postgres supports advanced data types, custom functions, and extensions, allowing for a highly extensible database environment.
- Strong Community Support
- Backed by a large and active community, Postgres benefits from continuous improvements and extensive support.
Database
What is DuckDB?
DuckDB is an in-process SQL OLAP database management system designed to support fast analytical queries.
- High Performance
- Optimized for fast analytical queries, providing high performance for OLAP workloads.
- In-Process Database
- Runs within your application process, reducing latency and simplifying deployment.
- Lightweight
- Designed to be lightweight and easy to integrate into various environments.
- SQL Support
- Provides full SQL support, making it easy to query and analyze your data.
Copy data between
Postgres & DuckDB
Bruin Cloud enables you to copy data between any source and destination.
Build data pipelines faster
Built-in connectors, defined with YAML
Bruin is a code-based platform: everything you do lives in a Git repo and is versioned. All data ingestions are defined in code and version-controlled alongside the rest of your project.
- Multiple platforms
- Bruin ships with built-in connectors for many platforms. You can ingest data from AWS, Azure, GCP, Snowflake, Notion, and more.
- Built on open-source
- Bruin's ingestion engine is built on ingestr, an open-source data ingestion tool.
- Custom sources & destinations
- Bruin supports pure Python executions, enabling you to build your own data ingestion code.
- Incremental loading
- Bruin supports incremental loading, meaning that you can ingest only the new data rather than the entire dataset on every run.
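As a concrete sketch of the idea above, a Postgres-to-DuckDB ingestion in Bruin is a small YAML asset definition checked into your repo. The field names and layout below are illustrative assumptions modeled on ingestr-style configuration, not a verified schema; consult the Bruin documentation for the exact format.

```yaml
# assets/raw.users.asset.yml — illustrative sketch; field names are assumptions
name: raw.users
type: ingestr
parameters:
  source_connection: my_postgres   # a Postgres connection defined in your Bruin config
  source_table: public.users
  destination: duckdb
  # Incremental loading: only rows with a newer updated_at value are ingested.
  incremental_strategy: merge
  incremental_key: updated_at
```

Because the definition is plain YAML in your repository, changes to an ingestion are reviewed and versioned like any other code change.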
Build safer
End-to-end quality in raw data
Bruin's built-in data quality capabilities are designed to ensure that the data you ingest is of the highest quality and always matches your expectations.
- Built-in quality checks
- Bruin supports built-in quality checks, such as not_null, accepted_values, and more, ready to use across all assets.
- Custom quality checks
- Bruin allows you to define custom quality checks in SQL, enabling you to define your own quality standards.
- Templating in quality checks
- Bruin supports templating in quality checks, meaning that you can use variables in your checks, and run checks only for incremental periods.
- Automated alerting
- Failing quality checks automatically send alerts to the configured channels, ensuring that you are always aware of data quality issues.
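The checks described above live next to the asset definition itself. Below is an illustrative YAML sketch of column-level quality checks: the check names not_null and accepted_values come from the list above, while the column names and exact layout are assumptions rather than Bruin's verified schema.

```yaml
# Illustrative column-level quality checks; layout is an assumption
columns:
  - name: id
    checks:
      - name: not_null          # built-in check named above
  - name: status
    checks:
      - name: accepted_values   # built-in check named above
        value: [active, inactive]
```

Keeping checks beside the asset definition means quality expectations are versioned and reviewed together with the ingestion they guard.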