major4x

Reputation: 442

PostgreSQL Array Insert Size Limitation Error

I get the following Postgres error:

ERROR: value too long for type character varying(1024)

The offending statement is:

INSERT INTO integer_array_sensor_data (sensor_id, "time", status, "value") 
VALUES (113, 86651204, 0, '{7302225, 7302161, 7302593, 730211, 
        ... <total of 500 values>... 7301799, 7301896}');

The table:

CREATE TABLE integer_array_sensor_data (
    id SERIAL PRIMARY KEY, 
    sensor_id INTEGER NOT NULL, 
    "time" INTEGER NOT NULL, 
    status INTEGER NOT NULL, 
    "value" INTEGER[] NOT NULL, 
    server_time TIMESTAMP WITHOUT TIME ZONE NOT NULL DEFAULT NOW()
); 

Searching the PostgreSQL documentation turns up no mention of a limit on array size.

Any idea how to fix this?

Upvotes: 1

Views: 182

Answers (2)

major4x

Reputation: 442

I am starting to understand my question, although I have not found a full solution. The problem is a string-length limitation somewhere at the library level: I am actually using pqxx, and you can't have strings longer than 1024 characters. I have accepted Guillaume F.'s answer because he figured this out, but the cast doesn't work for me. I will edit this reply once I find a solution so people know what to do.

I just tried prepared statements and they have the same limitation.

The workaround is to use COPY or its pqxx binding, pqxx::tablewriter.
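For illustration, a minimal sketch of the COPY route, using the table and columns from the question (the row values are the ones shown in the failing INSERT; the default COPY text format is tab-separated, and an integer[] value is written in its brace form):

```sql
-- Bulk-load rows via COPY instead of a single oversized INSERT literal.
-- Fields below are separated by tabs; \. terminates the input.
COPY integer_array_sensor_data (sensor_id, "time", status, "value") FROM STDIN;
113	86651204	0	{7302225,7302161,7302593,730211}
\.
```

Because COPY streams the data rather than embedding it in a SQL statement, the 1024-character literal limit in the client library does not apply.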

Upvotes: 0

Guillaume F.

Reputation: 6463

The problem doesn't come from the array itself, but from the varchar string declaring the array values in your INSERT. Some drivers type string literals as varchar(1024), causing this issue.

Instead of

'{1,2,3}'

try using

ARRAY[1,2,3]

Otherwise, you can try declaring the type of your string as TEXT (unlimited):

'{1,2,3}'::text
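Applied to the statement in the question, the ARRAY constructor form would look like this (a sketch using the values shown above; the full 500-element list is elided):

```sql
-- ARRAY[...] builds an integer[] directly, with no intermediate
-- varchar literal for the driver to truncate at 1024 characters.
INSERT INTO integer_array_sensor_data (sensor_id, "time", status, "value")
VALUES (113, 86651204, 0, ARRAY[7302225, 7302161, 7302593, 730211]);
```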

Upvotes: 3
