Speed up Postgres analytical queries 100x with 2 commands.
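The two commands are plausibly along these lines (a sketch assuming Crunchy Data Warehouse's Iceberg table access method; the table names are illustrative):

```sql
-- Command 1: create a columnar Iceberg copy of an existing heap table
CREATE TABLE events_iceberg USING iceberg AS SELECT * FROM events;

-- Command 2: run the analytical query against the Iceberg copy,
-- which scans compressed columnar storage instead of heap pages
SELECT url, count(*) FROM events_iceberg GROUP BY url;
```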
Using pg_parquet you can trivially export data to S3, and using Crunchy Data Warehouse you can just as easily query or import Parquet files from PostgreSQL.
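For example, with pg_parquet installed, the export and the import are each a single COPY command (the bucket and table names here are placeholders):

```sql
-- pg_parquet extends COPY to read and write Parquet, including on S3
COPY events TO 's3://my-bucket/events.parquet' WITH (format 'parquet');

-- load the file back into a table with a matching schema
COPY events_restored FROM 's3://my-bucket/events.parquet' WITH (format 'parquet');
```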
There's a raw events table and a summary table containing view counts. You then define a pipeline using an INSERT .. SELECT command, and the pipeline keeps running that command in the background to do fast, reliable, incremental processing.
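A minimal sketch of such a pipeline, assuming pg_incremental's `incremental.create_sequence_pipeline` API and illustrative table and column names; on each run, `$1` and `$2` are bound to the range of not-yet-processed `event_id` values:

```sql
-- Raw events, and a summary of views per URL per day
CREATE TABLE events (
    event_id   bigint GENERATED ALWAYS AS IDENTITY,
    event_time timestamptz NOT NULL DEFAULT now(),
    url        text NOT NULL
);

CREATE TABLE view_counts (
    day   date NOT NULL,
    url   text NOT NULL,
    count bigint NOT NULL,
    PRIMARY KEY (day, url)
);

-- The pipeline aggregates each new batch of events into view_counts
SELECT incremental.create_sequence_pipeline('view-count-pipeline', 'events',
$$
  INSERT INTO view_counts
  SELECT event_time::date, url, count(*)
  FROM events
  WHERE event_id BETWEEN $1 AND $2
  GROUP BY 1, 2
  ON CONFLICT (day, url)
  DO UPDATE SET count = view_counts.count + EXCLUDED.count
$$);
```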
Still, I think this is what getting started with Iceberg should look like:
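(A sketch, assuming Crunchy Data Warehouse's `USING iceberg` storage and its `load_from` table option; the bucket path is illustrative.)

```sql
-- One command: create an Iceberg table, inferring its
-- columns from a Parquet file in object storage
CREATE TABLE trips ()
USING iceberg
WITH (load_from = 's3://my-bucket/trips.parquet');
```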
But you really start noticing the difference when you run through several steps.
Create an Iceberg table from a file, load some more data, and run fast queries.
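Under the same assumptions as above, the multi-step version stays just as short (the paths and column names are again illustrative):

```sql
-- Step 1: create an Iceberg table from an initial Parquet file
CREATE TABLE trips ()
USING iceberg
WITH (load_from = 's3://my-bucket/trips/2024-01.parquet');

-- Step 2: load more data with plain COPY
COPY trips FROM 's3://my-bucket/trips/2024-02.parquet';

-- Step 3: run fast analytical queries on the Iceberg table
SELECT vendor_id, count(*), avg(fare_amount)
FROM trips
GROUP BY vendor_id;
```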