Reputation: 646
In my Java app I'm writing binary data of type lo (large object) using the LargeObject interface:
LargeObjectManager lLOManager = ((org.postgresql.PGConnection) aDatabaseConnection).getLargeObjectAPI();
long oid = lLOManager.createLO();
LargeObject lo = lLOManager.open(oid, LargeObjectManager.WRITE);
if (aBLOB != null)
    lo.write(aBLOB);
lo.close();
When reading the table I use the same API to get an InputStream:
LargeObjectManager lLOManager = ((org.postgresql.PGConnection) aDatabaseConnection).getLargeObjectAPI();
long lOid = aRs.getLong(aIndex);
if (lOid != 0)
{
    LargeObject lObj = lLOManager.open(lOid, LargeObjectManager.READ);
    inputStream = lObj.getInputStream();
    lObj.close();
}
However, when I then try to read the stream I get an exception:
int number = inputStream.read();
or alternatively
byte[] byteArray = new byte[1024];
int number = inputStream.read(byteArray);
Exception:
org.postgresql.util.PSQLException: ERROR: invalid large-object descriptor: 0
I summed up the code here because in my app it is distributed across several classes. The point is that the content of the input stream is currently read after the database connection the stream came from has already been closed. However, this code used to work for Postgres some time ago and still works for Oracle. I tried PostgreSQL JDBC drivers 9.3 and 42.1.4 against PostgreSQL 9.2.
My question: Is a Java InputStream derived from an SQL ResultSet independent of the status of the database connection and the result set?
Upvotes: 2
Views: 4973
Reputation: 246092
You cannot read from the large object after you have closed it.
Also, opening and reading a large object must happen within a single database transaction; the large object descriptor is invalidated at the end of the transaction, which is why you see "invalid large-object descriptor".
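One way around this (a sketch, not the only fix) is to buffer the stream fully into memory while the large object is still open and the transaction is still active; the resulting byte array then outlives both. The helper below uses only the standard library; the PostgreSQL-specific usage is shown in comments, reusing the variable names from the question:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class LargeObjectRead {

    // Copies the entire stream into a byte array so the data can be used
    // after the large object is closed and the transaction has ended.
    static byte[] readFully(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }

    // Assumed usage pattern with the PostgreSQL driver (aDatabaseConnection
    // and lOid as in the question):
    //
    //   aDatabaseConnection.setAutoCommit(false);          // LOs need a transaction
    //   LargeObjectManager lLOManager =
    //       ((org.postgresql.PGConnection) aDatabaseConnection).getLargeObjectAPI();
    //   LargeObject lObj = lLOManager.open(lOid, LargeObjectManager.READ);
    //   byte[] content = readFully(lObj.getInputStream()); // read BEFORE closing
    //   lObj.close();
    //   aDatabaseConnection.commit();
}
```

If the objects are too large to buffer in memory, the alternative is to keep the connection, transaction, and LargeObject open until the stream has been consumed, and only close them afterwards.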
Upvotes: 6