I have a very large Postgres database (200MB compressed dump, >1.5TB fully materialized on disk) that I want to run large analytical queries on.
Should I:
- Connect with the DuckDB Postgres scanner (https://duckdb.org/2022/09/30/postgres-scanner)?
- Dump everything to Parquet?
The database also has views and materialized views that I think I want to keep.