zhouhufeng

Reputation: 76

huge postgres database size reduction

I have a huge database (9 billion rows, 3,000 columns) currently hosted on Postgres; it is about 30 TB in size. I am wondering whether there are practical ways to reduce the size of the database while preserving the same information, as storage is really costly.

Upvotes: 0

Views: 1085

Answers (1)

Gurmokh

Reputation: 2091

If you don't want to delete any data:

  • Vacuuming. Depending on how many updates/deletes your database performs, there may be a lot of dead-row bloat. A database that size is likely full of tables that do not cross autovacuum thresholds often (pg13 has a fix for this). Manually running vacuums will mark dead rows for re-use and free up space at the ends of pages that is no longer needed.

  • Index management. Indexes bloat over time and should normally be smaller than their tables. Re-indexing (concurrently, so writes are not blocked) will give you some space back, or allow re-use of existing pages.

  • Data de-duplication / normalisation. See where you can remove data from tables where it is not needed, or is already present elsewhere in the database.
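The first two points above can be sketched with standard maintenance commands; the table and index names here are placeholders for your own objects, and the catalog query is only a rough way to find the biggest candidates:

```sql
-- Reclaim dead-row space without an exclusive lock
-- (VACUUM FULL would rewrite the table but blocks all access).
VACUUM (VERBOSE, ANALYZE) my_big_table;

-- Rebuild a bloated index without blocking writes (Postgres 12+).
REINDEX INDEX CONCURRENTLY my_big_table_idx;

-- List the largest tables and their dead-tuple counts,
-- to prioritise manual vacuuming and re-indexing.
SELECT relname,
       pg_size_pretty(pg_total_relation_size(relid)) AS total_size,
       n_dead_tup
FROM pg_stat_user_tables
ORDER BY pg_total_relation_size(relid) DESC
LIMIT 20;
```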

Upvotes: 1
