Reputation: 61
I am using a Postgres database. I have a table with 100 million records; it has 52 columns, including one or two unique columns. I also have an application that analyzes the table and reports how many distinct values it contains in total.
It also reports patterns, data types, etc. But when I run it against this table it takes 5 days to produce a result, because on the backend it executes about 100 queries against the database, and one query alone takes 3 days.
How can I improve the performance of these queries?
I tried an index, but it still took the same 5 days.
Can I create an index on a table that already contains records? If yes, which type of index should I create?
Would an index, a partition, or something else improve performance?
If I need to create an index or partition, which type should I choose?
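For illustration, each of those ~100 backend queries is presumably a per-column distinct count along these lines (some_column and big_table are placeholder names, not the tool's actual SQL):

SELECT count(DISTINCT some_column) AS distinct_vals  -- one such query per column
FROM big_table;                                      -- scans all 100M rows

Even with an index on some_column, Postgres still has to read every index entry to count the distinct values, which would explain why adding an index did not change the runtime.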
Upvotes: 0
Views: 318
Reputation: 2091
The pg_stats view will do this for you.
select *
from pg_stats
where tablename = 'atable'
gives something like this:
Name                   |Value
-----------------------+----------------------------------------------------------
schemaname             |public
tablename              |atable
attname                |a
inherited              |false
null_frac              |0.0
avg_width              |4
n_distinct             |-1.0
most_common_vals       |
most_common_freqs      |NULL
histogram_bounds       |{1,98,196,495,593,691,789,887,985,1083,1181,1279,1377,147
correlation            |1.0
most_common_elems      |
most_common_elem_freqs |NULL
elem_count_histogram   |NULL
Documentation: https://www.postgresql.org/docs/current/view-pg-stats.html
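The field that answers the distinct-value question is n_distinct: a positive value is an estimated absolute count of distinct values, and a negative value is minus the fraction of rows that are distinct (so the -1.0 above means the column is unique). The statistics come from a sample taken by ANALYZE, so refresh them first. A sketch that turns n_distinct into an estimated count per column (assuming your table is atable in schema public):

ANALYZE atable;  -- refresh the sampled statistics

SELECT s.attname,
       CASE WHEN s.n_distinct >= 0
            THEN s.n_distinct                    -- already an absolute count
            ELSE -s.n_distinct * c.reltuples     -- fraction of rows; scale by the row estimate
       END AS estimated_distinct
FROM pg_stats s
JOIN pg_class c     ON c.relname = s.tablename
JOIN pg_namespace n ON n.oid = c.relnamespace
                   AND n.nspname = s.schemaname
WHERE s.schemaname = 'public'
  AND s.tablename  = 'atable';

This reads only the statistics catalogs, so it returns almost instantly regardless of table size; the trade-off is that the numbers are estimates, not exact counts.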
Upvotes: 1