Rian Reis

Reputation: 11

Is there any way to set maximum row number on PostgreSQL database?

The question is pretty simple. Is there any way to dynamically limit the maximum number of rows of a Postgres database?

I didn't think it was possible, until I saw that Heroku does it. After some research, I found a way with triggers:

CREATE OR REPLACE FUNCTION check_number_of_row()
RETURNS TRIGGER AS
$body$
BEGIN
    -- replace 100 with the maximum number of rows you want
    IF (SELECT count(*) FROM your_table) >= 100
    THEN
        RAISE EXCEPTION 'INSERT statement exceeding maximum number of rows for this table';
    END IF;
    RETURN NEW;
END;
$body$
LANGUAGE plpgsql;

CREATE TRIGGER tr_check_number_of_row 
BEFORE INSERT ON your_table
FOR EACH ROW EXECUTE PROCEDURE check_number_of_row();

But this isn't quite what I need. It limits the rows of a table to a STATIC maximum number, so I would have to loop through all the tables, which looks like a pretty messy option. And I haven't even thought of a way to set this maximum dynamically.

I thought about running a daily/weekly cron job to check all the databases, count the total number of rows and see if it's within the limit, but if there is a more efficient way, I'll take it.
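For the cron-job idea, one option is a single catalog query instead of counting every table: pg_class.reltuples holds the planner's row estimate per table, so the result is approximate but cheap. A sketch (the schema name public is an assumption):

```sql
-- Approximate total row count across all ordinary tables in schema "public".
-- reltuples is a planner estimate refreshed by VACUUM/ANALYZE, not an exact count.
SELECT sum(reltuples)::bigint AS approx_total_rows
FROM pg_class
WHERE relkind = 'r'
  AND relnamespace = 'public'::regnamespace;
```

A cron job could run this and alert (or revoke write privileges) when the sum crosses the quota, without ever scanning the tables themselves.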

Upvotes: 1

Views: 1670

Answers (2)

Laurenz Albe

Reputation: 247410

Since SELECT count(*) is a very expensive operation, I don't think it is a good idea to run it on every data change. Besides, the row count won't tell you much about the size of the table.

My approach would be to use a trigger using pg_total_relation_size() like this:

CREATE OR REPLACE FUNCTION size_trig() RETURNS trigger
   LANGUAGE plpgsql AS
$$BEGIN
   -- TG_ARGV[0] is the size limit in bytes passed when the trigger was created
   IF pg_total_relation_size(TG_RELID) > (TG_ARGV[0])::bigint THEN
      RETURN NULL;  -- silently skip the INSERT
   ELSE
      RETURN NEW;
   END IF;
END;$$;

CREATE TRIGGER size_trig BEFORE INSERT ON big
   FOR EACH ROW
   EXECUTE PROCEDURE size_trig(819200);

That would limit the total table size, including indexes and TOAST table, to 100 blocks of 8kB (819200 bytes), and it is comparatively cheap.

You still have to define the trigger for all tables that you want checked, but there is no way around that short of modifying PostgreSQL.

As you can see, the limit can be set dynamically for each table. You could also come up with a solution that uses a lookup table for the size limit.
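The lookup-table variant could be sketched like this; the table name size_limits and its columns are assumptions for illustration, not part of the original answer:

```sql
-- One row per table whose size should be capped
CREATE TABLE size_limits (
    table_oid  oid PRIMARY KEY,
    max_bytes  bigint NOT NULL
);

CREATE OR REPLACE FUNCTION size_limit_trig() RETURNS trigger
   LANGUAGE plpgsql AS
$$DECLARE
   lim bigint;
BEGIN
   -- Look up the limit for the table that fired the trigger
   SELECT max_bytes INTO lim FROM size_limits WHERE table_oid = TG_RELID;
   IF lim IS NOT NULL AND pg_total_relation_size(TG_RELID) > lim THEN
      RAISE EXCEPTION 'table % exceeds its size limit', TG_TABLE_NAME;
   END IF;
   RETURN NEW;  -- no limit configured, or still under it
END;$$;
```

With this shape, changing a table's quota is just an UPDATE on size_limits; no trigger has to be recreated, though each capped table still needs the trigger attached once.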

Upvotes: 2

RangerRanger

Reputation: 2493

I didn't think it was possible, until I saw that Heroku does it.

This isn't entirely accurate. Database row limits for Heroku's hobby tier aren't hard-capped. Put another way, when you exceed 10k or 10M rows in hobby-dev or hobby-basic, nothing will stop writes immediately. Heroku gives a 7-day window via email to bring the counts under the quota, after which it revokes write access. If your purpose is simply to get a heads-up and adjust accordingly, they have a nice system for this already.

More to your question, however: considering that Heroku Postgres (especially for hobby databases) is fairly locked down, I don't think you'll have the level of control required to implement what you're after without making it messier than what you have in your question.

Upvotes: 1
