Reputation: 53
I have large JSON data, greater than 2 kB, in each record of my table, currently stored in a JSONB field. My tech stack is Django and Postgres. I never update or modify this JSON data, but I do need to read it frequently and fast. Because the JSON data is larger than 2 kB, Postgres splits it into chunks and moves it out to the TOAST table, and reads have become very slow.

What are the alternatives? Should I use another database like MongoDB to store these large JSON fields?

Note: I don't want to pull the keys out of this JSON and turn them into columns. The data comes from an API.
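Roughly, the current model looks like this (a minimal sketch; the model and field names are illustrative, not my real ones):

```python
from django.db import models


class ApiRecord(models.Model):
    # Large (> 2 kB) payload fetched from an external API, stored as jsonb
    # (models.JSONField maps to jsonb on Postgres in Django 3.1+).
    payload = models.JSONField()
```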
Upvotes: 5
Views: 15724
Reputation: 1558
If you just need to store and read this JSON object in full, without using its structure in your WHERE clauses, what about simply storing the data as binary in a bytea column? https://www.postgresql.org/docs/current/datatype-binary.html
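A minimal sketch of that idea in Django, assuming a model named `ApiRecord` (the names and the optional compression step are illustrative, not from the question):

```python
import json
import zlib

from django.db import models


class ApiRecord(models.Model):
    # bytea column holding the (optionally compressed) JSON document as
    # opaque bytes; Postgres never parses it.
    payload_blob = models.BinaryField()

    def set_payload(self, data: dict) -> None:
        # Serialize and compress once, before storing.
        self.payload_blob = zlib.compress(json.dumps(data).encode("utf-8"))

    def get_payload(self) -> dict:
        # psycopg2 returns bytea as a memoryview, so convert before parsing.
        return json.loads(zlib.decompress(bytes(self.payload_blob)))
```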
Upvotes: 1
Reputation: 1626
It is hard to answer specifically without knowing the details of your situation, but here are some things you may try:
Your mileage may vary with all of the above suggestions, especially since each depends on your particular use case (see the questions in my comment).
However, the overall idea is to use the tools that PostgreSQL provides to make your data quickly accessible. Yes, this may involve pulling the data out of its original JSON blob, but that doesn't need to be done manually; PostgreSQL provides some great tools for it.
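For instance (a sketch only, since the right choice depends on your use case; the table, column, and key names here are hypothetical), an expression index on one frequently queried key lets Postgres answer lookups on that key without detoasting and parsing the whole document for every row:

```python
from django.db import migrations


class Migration(migrations.Migration):
    dependencies = [("myapp", "0001_initial")]

    operations = [
        # Expression index on a single key inside the jsonb payload column.
        migrations.RunSQL(
            sql=(
                "CREATE INDEX apirecord_customer_id_idx "
                "ON myapp_apirecord ((payload ->> 'customer_id'));"
            ),
            reverse_sql="DROP INDEX apirecord_customer_id_idx;",
        ),
    ]
```

A query that filters on `payload ->> 'customer_id'` can then use this index rather than scanning the TOASTed blobs.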
Upvotes: 6