Reputation: 1105
Actually, I've asked this question before, but I deleted it because I probably worded it badly or didn't express my question and aim correctly.
I'm connecting to our ERP software's database (Firebird v2.1) to retrieve data for export as XML. At the same time, I need to save all product images into a separate folder (I'll gather all the data and images and upload them to a web server for import into an e-commerce app).
Our ERP software company split the database into two separate parts (one for information, one for files such as images, documents, etc. - the FILES database).
The problem is that all assets in the FILES database are compressed with zlib, so I cannot stream these images via LiveBindings directly (eval error in LinkPropertyToFieldBitmap: Loading bitmap failed). I know this is expected, because the files are compressed and stored as BLOBs.
I need to get this zlib-compressed data as a stream so I can use it as input for a decompression procedure.
I am planning to use the procedure below to save the decompressed images to files.
{ uses System.Classes, System.ZLib }
procedure TForm1.DecompressStream(Stream: TStream);
var
  LOutput: TFileStream;
  LUnZip: TZDecompressionStream;
begin
  { Create the output file and the decompression stream over the compressed
    input; Stream holds the fetched BLOB data and remains owned by the caller. }
  LOutput := TFileStream.Create('SKU OF PRODUCT.jpg', fmCreate);
  LUnZip := TZDecompressionStream.Create(Stream);
  try
    { Decompress data. }
    LOutput.CopyFrom(LUnZip, 0);
  finally
    { Free the streams. }
    LUnZip.Free;
    LOutput.Free;
  end;
end;
NOTE: Maybe the above procedure is not the right one, but once I'm able to fetch the zlib data as a stream, I can debug and correct it. Thanks.
UPDATE: I am using LiveBindings to fetch data from the database, but using LiveBindings is not mandatory for handling the compressed data and image processing.
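For reference, this is roughly how I expect to call the procedure once I can get the compressed BLOB as a stream; the query name FDQuery1 and the field name 'Data' are just placeholders for the real ones:
procedure TForm1.SaveCurrentImage;
var
  LBlob: TStream;
begin
  { obtain the compressed BLOB of the current record as a stream }
  LBlob := FDQuery1.CreateBlobStream(FDQuery1.FieldByName('Data'), bmRead);
  try
    DecompressStream(LBlob);
  finally
    LBlob.Free;
  end;
end;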
Upvotes: 2
Views: 936
Reputation: 7912
You've said that you'll be using that dataset only for reading (with no writing back to the DBMS) and that your aim is really just to decompress the BLOB streams fetched to the client. There is currently no convenient way to intercept BLOB fetching (something like an OnBlobFetching event).
The nearest place for intercepting BLOB stream storage would be the TFDDatSRow.InternalSetData method (the ideal place to transform fetched data, right before it gets stored in FireDAC's internal data storage), but that would require modifying the source code.
Without source code modifications, you can write, for example, a handler for the AfterGetRecord event and decompress the stream from there. Just be very careful not to commit the modified data back to the database if you decide to overwrite the fetched streams directly in the fields (ideally, set the dataset to read-only mode; a sketch of this follows the examples below).
An example:
procedure TForm1.FDQuery1AfterGetRecord(DataSet: TFDDataSet);
var
  BlobStream: TFDBlobStream;
  HelpStream: TMemoryStream;
begin
  { create BLOB stream for reading and writing }
  BlobStream := DataSet.CreateBlobStream(DataSet.FieldByName('Data'), bmReadWrite) as TFDBlobStream;
  try
    { create intermediate stream }
    HelpStream := TMemoryStream.Create;
    try
      { decompress BLOB stream into helper one }
      ZDecompressStream(BlobStream, HelpStream);
      { and overwrite the original BLOB stream content with uncompressed data; note that
        TFDBlobStream must know about the modification, otherwise it won't store the data
        into the storage when this stream is released (LoadFromStream won't work here) }
      BlobStream.Clear;
      BlobStream.Write(HelpStream.Memory^, HelpStream.Size);
    finally
      HelpStream.Free;
    end;
  finally
    BlobStream.Free;
  end;
end;
Or, similarly, at a lower level:
procedure TForm1.FDQuery1AfterGetRecord(DataSet: TFDDataSet);
var
  DataRow: TFDDatSRow;
  DataCol: TFDDatSColumn;
  InLength: LongWord;
  InBuffer: Pointer;
  OutLength: Integer;
  OutBuffer: Pointer;
begin
  { get the current row storage object }
  DataRow := DataSet.GetRow;
  { for column indexing in following calls find column by name }
  DataCol := DataSet.Table.Columns.ColumnByName('Data');
  { try to get pointer to the raw data buffer for the column with given index }
  if DataRow.GetData(DataCol.Index, rvDefault, InBuffer, 0, InLength, False) then
  begin
    { decompress the data buffer into another one allocated by this procedure }
    ZDecompress(InBuffer, InLength, OutBuffer, OutLength);
    try
      { start editing this storage row }
      DataRow.BeginEdit;
      try
        { let the storage copy the decompressed data from the buffer }
        DataRow.SetData(DataCol.Index, OutBuffer, OutLength);
      finally
        { finish this storage row editing without creating a new row version, so the engine
          won't take the data modification as an update }
        DataRow.EndEdit(True);
      end;
    finally
      { and release the buffer allocated by the ZLib library function call; the input buffer
        used here belongs to FireDAC's storage }
      FreeMem(OutBuffer);
    end;
  end;
end;
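As mentioned above, it is safest to also put the dataset into read-only mode so the overwritten BLOBs can never be posted back to the database. A minimal sketch of that, together with saving the decompressed images to files; the SKU column name, the folder path and the component name FDQuery1 are placeholders:
{ uses Data.DB, System.IOUtils }
procedure TForm1.ExportProductImages;
begin
  { forbid posting changes back to the DBMS }
  FDQuery1.UpdateOptions.ReadOnly := True;
  FDQuery1.Open;
  try
    while not FDQuery1.Eof do
    begin
      { the 'Data' field already contains plain JPEG bytes here, because the
        AfterGetRecord handler decompressed it on fetch }
      (FDQuery1.FieldByName('Data') as TBlobField).SaveToFile(
        TPath.Combine('C:\Images', FDQuery1.FieldByName('SKU').AsString + '.jpg'));
      FDQuery1.Next;
    end;
  finally
    FDQuery1.Close;
  end;
end;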
Upvotes: 3