arilwan

Reputation: 3973

Reading rows from very large table

I want to read through all rows of a very large table. However, executing SELECT * doesn't work, probably because the table is huge (rows=117051923 according to the query plan): the query keeps running with no result, and eventually the connection to Postgres breaks.

supplier=> SELECT * FROM my_table;
Killed

I'd rather not use LIMIT/OFFSET, because I would have to keep adjusting the range of rows to read (imagine doing that over 117M rows). Is there a way to have the query return rows in chunks? Purely SQL, not implemented in a client language.

Upvotes: 1

Views: 1331

Answers (2)

Jeremy

Reputation: 6723

If this is in psql, you can adjust how many rows are returned at once by adjusting FETCH_COUNT. This should prevent you from running out of memory.

\set FETCH_COUNT 10000

Alternatively, use \copy to copy the data into a file and use your favorite pager (less or more) to go through it.

\copy my_table TO 'my_table.txt'
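\copy also accepts a query in place of a bare table name, so you can dump a subset, or CSV with a header, instead of the whole table. A sketch (the file name and CSV options here are just illustrative):

```sql
-- Dump the query result as CSV with a header row
\copy (SELECT * FROM my_table) TO 'my_table.csv' WITH (FORMAT csv, HEADER)
```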

Upvotes: 1

CodyJHanson

Reputation: 136

In Postgres, the best way to do this is to use a cursor: https://www.postgresql.org/docs/current/plpgsql-cursors.html

Example:

CREATE OR REPLACE FUNCTION get_film_titles(p_year INTEGER)
  RETURNS text AS $$
DECLARE
  titles TEXT DEFAULT '';
  rec_film RECORD;
  cur_films CURSOR(p_year INTEGER) FOR
    SELECT title, release_year
    FROM film
    WHERE release_year = p_year;
BEGIN
  -- Open the cursor
  OPEN cur_films(p_year);

  LOOP
    -- Fetch the next row into the record
    FETCH cur_films INTO rec_film;
    -- Exit when there are no more rows to fetch
    EXIT WHEN NOT FOUND;

    -- Build the output
    IF rec_film.title LIKE '%ful%' THEN
      titles := titles || ',' || rec_film.title || ':' || rec_film.release_year;
    END IF;
  END LOOP;

  -- Close the cursor
  CLOSE cur_films;

  RETURN titles;
END;
$$ LANGUAGE plpgsql;
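Since the question asks for pure SQL, note that you don't even need a function: you can declare a cursor directly inside a transaction and fetch from it in batches. A sketch using the table name from the question (the cursor name and batch size of 10000 are arbitrary):

```sql
BEGIN;

-- Declare a cursor over the full table; no rows are read yet
DECLARE big_cur CURSOR FOR SELECT * FROM my_table;

-- Fetch the next 10000 rows; repeat this statement until it returns no rows
FETCH 10000 FROM big_cur;

-- Clean up when done
CLOSE big_cur;
COMMIT;
```

Each FETCH resumes where the previous one left off, so you walk the table in chunks without LIMIT/OFFSET bookkeeping.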

Upvotes: 0

Related Questions