Discussion
Noam Ross
@noamross@ecoevo.social · 4 days ago

I have a very large Postgres database (200MB compressed dump, >1.5TB fully materialized on disk) that I want to run large analytical queries on.

Should I
- Connect with DuckDB postgres scanner (https://duckdb.org/2022/09/30/postgres-scanner)?
- Dump everything to parquet?

It has views and materialized views that I think I do want.

#postgres #DuckDB #RStats
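For the first option, here is a minimal sketch of querying the live Postgres database through DuckDB's postgres extension (the ATTACH syntax that superseded the postgres_attach call described in the linked post). Connection string, schema, and table names are placeholders, not the actual database:

```python
# A minimal sketch, assuming a locally reachable Postgres server;
# the connection string and table names below are hypothetical.
import duckdb

con = duckdb.connect()            # in-memory DuckDB instance
con.execute("INSTALL postgres")
con.execute("LOAD postgres")

# Attach the live Postgres database; DuckDB pushes scans down to Postgres
# and streams rows over the wire, so nothing is copied up front.
con.execute(
    "ATTACH 'dbname=mydb host=localhost user=me' AS pg (TYPE postgres, READ_ONLY)"
)

# Run analytical SQL in DuckDB's engine against the Postgres tables.
print(con.execute("SELECT count(*) FROM pg.public.some_big_table").fetchall())

# Views and materialized views may not appear in the attached catalog;
# postgres_query() runs SQL on the Postgres side and streams the result back.
print(con.execute(
    "SELECT * FROM postgres_query('pg', 'SELECT * FROM some_matview LIMIT 10')"
).fetchdf())
```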

boB Rudis 🇺🇦 🇬🇱 🇨🇦
@hrbrmstr@mastodon.social replied · 3 days ago

@noamross I'd try both DDB and clickhouse.

Clickhouse's postgres-fu is somewhat magical
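For context on that suggestion: ClickHouse can scan Postgres tables in place via its postgresql() table function (there are also PostgreSQL database and table engines). A rough sketch using the clickhouse-connect Python client, with placeholder hosts, credentials, and table names:

```python
# A rough sketch, assuming a running ClickHouse server; host, credentials,
# and table names are placeholders.
import clickhouse_connect

client = clickhouse_connect.get_client(host="localhost")

# The postgresql() table function lets ClickHouse query a Postgres table
# directly, without importing it first.
result = client.query(
    "SELECT count(*) "
    "FROM postgresql('pghost:5432', 'mydb', 'some_big_table', 'me', 'secret')"
)
print(result.result_rows)

# For repeated analytics, it is common to materialize a copy into a native
# MergeTree table so queries stop hitting Postgres entirely.
client.command(
    "CREATE TABLE some_big_table_local ENGINE = MergeTree ORDER BY tuple() AS "
    "SELECT * FROM postgresql('pghost:5432', 'mydb', 'some_big_table', 'me', 'secret')"
)
```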

inviridi
@inviridi@metalhead.club replied · 4 days ago

@noamross Is the data ever updated again? If not, I'd go with Parquet.
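If the data really is static, the Parquet route can also go through DuckDB's postgres extension: attach once, COPY each table (or a view's query result) out to a Parquet file, then query the files directly. A minimal sketch, with the same placeholder connection and hypothetical table names as above:

```python
# A minimal sketch of the "dump everything to Parquet" option; the connection
# string and table names are placeholders.
import duckdb

con = duckdb.connect()
con.execute("INSTALL postgres")
con.execute("LOAD postgres")
con.execute(
    "ATTACH 'dbname=mydb host=localhost user=me' AS pg (TYPE postgres, READ_ONLY)"
)

# Export one Parquet file per table (a view's SELECT works the same way).
for table in ["some_big_table", "another_table"]:
    con.execute(
        f"COPY (SELECT * FROM pg.public.{table}) "
        f"TO '{table}.parquet' (FORMAT parquet, COMPRESSION zstd)"
    )

# Later analytical queries read the files directly, no Postgres needed.
print(duckdb.sql("SELECT count(*) FROM 'some_big_table.parquet'").fetchall())
```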
