Vladimir

Reputation: 53

How can I update a binary BLOB field with a large file (1 GB) in a Firebird database using the .NET data provider?

I want to upload a large file (1 GB) to my Firebird database. The code below keeps throwing a System.OutOfMemoryException:

FbTransaction trans = conn.BeginTransaction();
string updateSQL = string.Format( @"UPDATE {0} SET {1} = :DATA WHERE {2} RETURNING DATA", tableName, colName, whereStr );
using( FbCommand getBlobCmd = new FbCommand( updateSQL, conn ) )
{
  try
  {
    getBlobCmd.Transaction = trans;
    FbParameter parBlob = new FbParameter( "DATA", FbDbType.Binary );
    parBlob.Direction = ParameterDirection.Output;
    parBlob.Value = File.ReadAllBytes(filePath); //System.OutOfMemoryException
    getBlobCmd.Parameters.Add( parBlob );
    getBlobCmd.ExecuteNonQuery();
    trans.Commit();
  }
  catch
  {
    if( trans != null )
      trans.Rollback();
  }
}

I understand that I need to write the data in chunks, but there's no class in the .NET data provider with that functionality. What should I use in this situation? Thanks!
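As background on the chunking idea: the general way to avoid the OutOfMemoryException is to read the file through a FileStream in fixed-size segments instead of File.ReadAllBytes, so only one small buffer is ever in memory. A minimal sketch of the technique, independent of any particular provider (the ChunkedCopy helper name and the default segment size are illustrative, not part of any API):

```csharp
using System;
using System.IO;

static class ChunkedCopy
{
    // Copy a file into a destination stream in fixed-size segments,
    // keeping at most one segmentSize buffer in memory at a time.
    public static long CopyInSegments(string path, Stream destination, int segmentSize = 32 * 1024)
    {
        long total = 0;
        byte[] buffer = new byte[segmentSize];
        using (FileStream source = File.OpenRead(path))
        {
            int read;
            // Stream.Read may return fewer bytes than requested;
            // loop until it returns 0 (end of file).
            while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                destination.Write(buffer, 0, read);
                total += read;
            }
        }
        return total;
    }
}
```

A provider-level fix has to do the same thing internally, writing each buffer to the database as a BLOB segment instead of to a destination stream.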

Upvotes: 3

Views: 1855

Answers (3)

Dmitry Kovalenko

Reputation: 1

You can write and read BLOB data (text and binary) larger than 1 GB :)

"IBProvider" + "LCPI .NET Data Provider" have native support for System.IO.Stream and System.IO.TextReader, which avoids any physical-memory limitations.

[Usage of System.IO.Stream] http://www.ibprovider.com/eng/examples/lcpi_oledb_net__c001__example_0001.html

[Sample of work with System.IO.FileStream][Russian text] http://firebirder.ru/firebird_interbase__and__system_io_stream__system_io_textreader

Look at other similar samples on the IBProvider web site to find more advanced techniques for working with Firebird BLOBs :)

Regards.

Upvotes: 0

Vladimir

Reputation: 53

I solved the problem by adding a method to BlobBase.cs:

    public void Write( Stream stream )
    {
        try
        {
            this.Create();

            long length = stream.Length;
            int chunk = length >= this.segmentSize ? this.segmentSize : (int)length;
            byte[] tmpBuffer = new byte[ chunk ];

            while( length > 0 )
            {
                if( chunk > length )
                {
                    chunk = (int)length;
                    tmpBuffer = new byte[ chunk ];
                }

                // Stream.Read may return fewer bytes than requested,
                // so loop until the whole chunk has been read.
                int read = 0;
                while( read < chunk )
                {
                    int n = stream.Read( tmpBuffer, read, chunk - read );
                    if( n == 0 )
                        throw new EndOfStreamException();
                    read += n;
                }

                this.PutSegment( tmpBuffer );

                length -= chunk;
            }

            this.Close();
        }
        catch (Exception)
        {
            // Cancel the blob and rethrow the exception
            this.Cancel();

            throw;
        }
    }

And in FbCommand.cs:

BlobBase blob = this.statement.CreateBlob();
if( this.Parameters[ index ].Value is System.IO.Stream )
{
  blob.Write( (System.IO.Stream)this.Parameters[ index ].Value );
}
else
{
   blob.Write( (byte[])this.Parameters[ index ].Value );
}

And the final usage snippet:

string updateSQL = string.Format( @"UPDATE {0} SET {1} = :DATA WHERE {2}", tableName, colName, whereStr );

using( FbCommand getBlobCmd = new FbCommand( updateSQL, conn ) )
{
  FbParameter parBlob = new FbParameter( "DATA", FbDbType.Binary );
  parBlob.Direction = ParameterDirection.Input;
  parBlob.Value = blobFile.InnerStream; // input FileStream
  getBlobCmd.Parameters.Add( parBlob );
  getBlobCmd.Prepare();
  getBlobCmd.ExecuteNonQuery();
}

Thanks for the help!

Upvotes: 0

cincura.net

Reputation: 4150

Currently you have basically only one option (assuming you want to go via the .NET provider): switch your app to x64 (if you haven't already) and buy enough RAM (or make a big swap file). Not the best solution, I know.

Also, you can give some love to http://tracker.firebirdsql.org/browse/DNET-279. Then you could upload a basically size-unlimited file.

Upvotes: 1
