arilwan

Reputation: 3973

Postgres SELECT * from large table

I'm working with a Postgres table containing nearly 7m records. I understand the result of SELECT * FROM table is too large to fit in memory, so the db connection is lost after a long delay waiting for the query result (I can only execute SELECT * FROM table LIMIT n).

I need to process each record sequentially, going through until the last one. What is the way to do this?

Upvotes: 0

Views: 1948

Answers (1)

Jeet Kumar

Reputation: 547

You can fetch the rows in chunks of n records:

select * from table where tableId > currentTableId and tableId <= currentTableId + chunk_size

Explanation:

Let's say you have 100 records and your memory can hold 10 records at a time.

You then execute the query 10 times from your application (via any ORM), as in the pseudocode below:

    int totalRecords = executeQuery("select count(*) from table");
    int chunkSize = 10;
    long currentTableId = 0;
    int remaining = totalRecords;
    while (remaining > 0) {
        // fetch the next chunk of rows by id range
        executeQuery("select * from table where tableId > currentTableId and tableId <= currentTableId + chunkSize");
        currentTableId += chunkSize;
        remaining -= chunkSize;
    }

This code is only an abstract sketch; it's just a hint to point you in the right direction.
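If it helps, here is a minimal JDBC sketch of the same chunking idea. The table name my_table, the tableId column, and the connection details are placeholders I made up; the sketch also assumes tableId is an indexed numeric key greater than 0, and it walks id ranges up to the table's maximum id so gaps in the id sequence don't end the scan early.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ChunkedTableScan {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details; adjust for your environment.
            String url = "jdbc:postgresql://localhost:5432/mydb";
            try (Connection conn = DriverManager.getConnection(url, "user", "password")) {

                // Find the highest id so the loop knows where to stop.
                long maxId;
                try (Statement st = conn.createStatement();
                     ResultSet rs = st.executeQuery("SELECT COALESCE(MAX(tableId), 0) FROM my_table")) {
                    rs.next();
                    maxId = rs.getLong(1);
                }

                long chunkSize = 10_000; // rows per round trip; tune to the memory you have
                String sql = "SELECT * FROM my_table WHERE tableId > ? AND tableId <= ? ORDER BY tableId";
                try (PreparedStatement ps = conn.prepareStatement(sql)) {
                    for (long lo = 0; lo < maxId; lo += chunkSize) {
                        ps.setLong(1, lo);
                        ps.setLong(2, lo + chunkSize);
                        try (ResultSet rs = ps.executeQuery()) {
                            while (rs.next()) {
                                // Replace this with your real per-row processing.
                                System.out.println(rs.getLong("tableId"));
                            }
                        }
                    }
                }
            }
        }
    }

Because each SELECT only touches one id range, neither the server nor the client has to hold the whole 7m-row result at once, so memory use stays bounded.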

Upvotes: 2
