Rubens Farias

Reputation: 57976

Large file upload into WSS v3

I've built a WSSv3 application which uploads files in small chunks; as each piece of data arrives, I temporarily store it in a SQL 2005 image data type field, for performance reasons**.

The problem comes when the upload ends: I need to move the data from SQL Server into a SharePoint Document Library through the WSSv3 object model.

Right now, I can think of two approaches:

SPFileCollection.Add(string, (byte[])reader[0]); // OutOfMemoryException

and

SPFile file = folder.Files.Add("filename", new byte[]{ });
using(Stream stream = file.OpenBinaryStream())
{
    // ... init vars and stuff ...
    while ((bytes = reader.GetBytes(0, offset, buffer, 0, BUFFER_SIZE)) > 0)
    {
        stream.Write(buffer, 0, (int)bytes); // Timeout issues
    }
    file.SaveBinary(stream);
}
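
For context, `reader` in both snippets is a SqlDataReader positioned on the staging row. Here's a minimal sketch of how it might be opened (the UploadStaging table and Content column names are invented for illustration); CommandBehavior.SequentialAccess is what lets ADO.NET stream the image column through GetBytes instead of materializing the whole value when the row is read:

using System;
using System.Data;
using System.Data.SqlClient;

// Sketch only: schema names are illustrative.
static SqlDataReader OpenChunkReader(SqlConnection conn, Guid uploadId)
{
    SqlCommand cmd = new SqlCommand(
        "SELECT Content FROM UploadStaging WHERE UploadId = @id", conn);
    cmd.Parameters.AddWithValue("@id", uploadId);

    // SequentialAccess streams the image column via GetBytes
    // rather than buffering it when the row is read
    return cmd.ExecuteReader(CommandBehavior.SequentialAccess);
}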

Is there any other way to complete this task successfully?

** Performance reasons: if you try to write every chunk directly to SharePoint, you'll notice performance degradation as the file grows (>100 MB).

Upvotes: 1

Views: 1713

Answers (3)

Rubens Farias

Reputation: 57976

I ended up with the following code:


myFolder.Files.Add("filename", 
   new DataRecordStream(dataReader, 
      dataReader.GetOrdinal("Content"), length));

You can find the DataRecordStream implementation here. It's basically a Stream that reads its data from a DbDataRecord through .GetBytes

This approach is similar to OpenBinaryStream()/SaveBinary(stream), but it doesn't keep the entire byte[] in memory while you transfer the data. At some point, DataRecordStream will be read by Microsoft.SharePoint.SPFile.CloneStreamToSPFileStream in 64k chunks.
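
For reference, a minimal sketch of such a stream (a forward-only, read-only wrapper over IDataRecord.GetBytes; the implementation at the link above may differ in details):

using System;
using System.Data;
using System.IO;

// Sketch: forward-only, read-only Stream over IDataRecord.GetBytes
public class DataRecordStream : Stream
{
    private readonly IDataRecord record;
    private readonly int ordinal;
    private readonly long length;
    private long position;

    public DataRecordStream(IDataRecord record, int ordinal, long length)
    {
        this.record = record;
        this.ordinal = ordinal;
        this.length = length;
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        // GetBytes copies from the current position within the BLOB
        long read = record.GetBytes(ordinal, position, buffer, offset, count);
        position += read;
        return (int)read;
    }

    public override bool CanRead { get { return true; } }
    public override bool CanSeek { get { return false; } }
    public override bool CanWrite { get { return false; } }
    public override long Length { get { return length; } }

    public override long Position
    {
        get { return position; }
        set { throw new NotSupportedException(); }
    }

    public override void Flush() { }
    public override long Seek(long offset, SeekOrigin origin)
    { throw new NotSupportedException(); }
    public override void SetLength(long value)
    { throw new NotSupportedException(); }
    public override void Write(byte[] buffer, int offset, int count)
    { throw new NotSupportedException(); }
}

Because the stream is forward-only, SharePoint can pull it through in chunks without the full file ever being materialized in managed memory.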

Thank you all for the valuable info!

Upvotes: 1

Brett Coburn

Reputation: 414

As mentioned previously, storing large files in SharePoint is generally a bad idea. See this article for more information: http://blogs.msdn.com/joelo/archive/2007/11/08/what-not-to-store-in-sharepoint.aspx

With that said, it is possible to use external storage for BLOBs, which may or may not help your performance issues -- Microsoft released a half-complete external BLOB storage provider that does the trick, but it unfortunately works at the farm level and affects all uploads. Ick.

Fortunately, since you can implement your own external BLOB provider, you may be able to write something to better handle these specific files. See this article for details: http://207.46.16.252/en-us/magazine/2009.06.insidesharepoint.aspx

Whether or not this would be worth the overhead depends on how much of a problem you're having. :)

Upvotes: 0

Alex Angas

Reputation: 60058

The first thing I would say is that SharePoint is really, really not designed for this. It stores all files in its own database so that's where these large files are going. This is not a good idea for lots of reasons: scalability, cost, backup/restore, performance, etc... So I strongly recommend using file shares instead.

You can increase the timeout of the web request by changing the executionTimeout attribute of the httpRuntime element in web.config.
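
For example (a sketch; executionTimeout is in seconds and the values are illustrative; for large uploads you'll likely also need to raise maxRequestLength, which is in KB):

<!-- web.config of the SharePoint web application; values are illustrative -->
<system.web>
  <httpRuntime executionTimeout="3600" maxRequestLength="512000" />
</system.web>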

Apart from that, I'm not sure what else to suggest. I haven't heard of such large files being stored in SharePoint. If you absolutely must do this, try also asking on Server Fault.

Upvotes: 0
