Reputation: 9580
I have a web service written in ASP.NET MVC with a SQL Server backend. Clients can send large data in calls to the web service, in the order of 10's of megabytes each.
On the server side I want to read the HTTP request stream and write it to VARBINARY field in a SQL table. How can I do this without reading the entire stream into an in-memory array?
The reason I am concerned about memory is that this system needs to scale to 1000's of concurrent clients (at least).
I'm currently using Linq to SQL and it doesn't look like it supports streaming of data, because the Linq Binary type can only be initialized using a byte array.
Is there some other way to do this using Linq to SQL?
If this is not possible using Linq to SQL, how can I do this using ADO.NET or some other approach?
Upvotes: 3
Views: 2086
Reputation: 1320
I believe you can use the .WRITE clause of the UPDATE statement to insert the data in chunks. If you're using LINQ to SQL, you'd create two commands from the data context object. The first creates (INSERTs) the row that will contain the data, initializing the column to a non-NULL value such as 0x, since .WRITE can't be applied to a NULL column. The second is an UPDATE statement that uses the .WRITE clause; parameterize the data, offset, and length values, then loop, executing the UPDATE statement as many times as you have chunks.
INSERT INTO XrayImages (HeaderId, ImageBytes) VALUES(@headerId, @imageValue)
UPDATE XrayImages SET ImageBytes.WRITE(@imageChunk, NULL, @chunkLength) WHERE ImageId = @imageId;
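Here's a minimal ADO.NET sketch of that loop; the connection string, 64 KB chunk size, and the XrayImages/ImageBytes schema above are illustrative assumptions, not a fixed API:

using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

static class ImageChunkWriter
{
    private const int ChunkSize = 64 * 1024;

    public static void AppendInChunks(Stream requestStream, string connectionString, int imageId)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var update = new SqlCommand(
            "UPDATE XrayImages SET ImageBytes.WRITE(@imageChunk, NULL, @chunkLength) " +
            "WHERE ImageId = @imageId", connection))
        {
            connection.Open();

            SqlParameter chunk = update.Parameters.Add("@imageChunk", SqlDbType.VarBinary, ChunkSize);
            SqlParameter length = update.Parameters.Add("@chunkLength", SqlDbType.BigInt);
            update.Parameters.AddWithValue("@imageId", imageId);

            var buffer = new byte[ChunkSize];
            int bytesRead;
            while ((bytesRead = requestStream.Read(buffer, 0, buffer.Length)) > 0)
            {
                // A NULL offset makes .WRITE append, so each execution adds one
                // chunk to the end of the ImageBytes column.
                byte[] payload = buffer;
                if (bytesRead < buffer.Length)
                {
                    // Trim the final, partially filled buffer before sending it.
                    payload = new byte[bytesRead];
                    Array.Copy(buffer, payload, bytesRead);
                }
                chunk.Value = payload;
                length.Value = (long)bytesRead;
                update.ExecuteNonQuery();
            }
        }
    }
}

Only one chunk-sized buffer is alive per upload, so memory stays flat no matter how large the request body is.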
For more information, please check out this MSDN Article: http://msdn.microsoft.com/en-us/library/bb399384.aspx
I'm pretty sure that by the time your controller code is called, the entire request has already been received (and is in memory). I'm basing this on the fact that the Request.Files collection already knows how many files there are and their lengths; with multipart forms, the only way I know to determine that is by reading the entire request. I don't know of any way to change this in MVC, so you may have to write a handler that does the actual file upload itself.
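If you do go the custom-handler route, a rough sketch could look like the following, assuming ASP.NET 4.0+ where HttpRequest.GetBufferlessInputStream() is available (it exposes the body as it arrives instead of buffering the whole upload); the handler name, query-string parameter, connection-string key, and the ImageChunkWriter helper from the sketch above are all hypothetical:

using System.Configuration;
using System.Web;

public class XrayUploadHandler : IHttpHandler
{
    public bool IsReusable { get { return false; } }

    public void ProcessRequest(HttpContext context)
    {
        // Hypothetical wiring: the image id comes from the query string and the
        // request body is streamed straight into the chunked UPDATE loop above.
        int imageId = int.Parse(context.Request.QueryString["imageId"]);
        string connectionString =
            ConfigurationManager.ConnectionStrings["XrayDb"].ConnectionString;

        using (var body = context.Request.GetBufferlessInputStream())
        {
            ImageChunkWriter.AppendInChunks(body, connectionString, imageId);
        }

        context.Response.StatusCode = 200;
    }
}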
There are a lot of variables to consider here. Depending on how many of those thousands of concurrent clients are uploading at once, you'd need multiple gigabit NICs between the web server and the database server, and I/O capacity on the database server beyond anything I've worked with.
I would seriously consider saving the files to a file-system location and keeping only the metadata in the database, especially if you can keep the files on the web server. Otherwise, I think SQL Server's FILESTREAM feature would scale better than a pure table-based solution.
Hope that helps!
Upvotes: 1
Reputation: 131706
You can stream the data to disk, and then use OPENROWSET to perform a bulk import of the data. For example:
INSERT INTO YOURTABLE(binaryColumnName)
SELECT * FROM
OPENROWSET(BULK N'C:\data.bin', SINGLE_BLOB) AS blob
You can use temp file APIs to avoid having to manage the lifetime of the data on disk ... it will be deleted once no longer in use.
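A minimal sketch of that approach is below, assuming the YOURTABLE/binaryColumnName names from the example and that the temp path is readable by the SQL Server service account (if the database is on a separate machine you'd spool to a share both boxes can reach):

using System;
using System.Data.SqlClient;
using System.IO;

static class BulkBlobImporter
{
    public static void Import(Stream requestStream, string connectionString)
    {
        string tempPath = Path.GetTempFileName();
        try
        {
            // Spool the request body to disk without buffering it all in memory
            // (Stream.CopyTo is .NET 4+; a small read/write loop works the same way).
            using (var file = File.Create(tempPath))
            {
                requestStream.CopyTo(file);
            }

            // OPENROWSET(BULK ...) needs a literal file name, hence the concatenation;
            // the path comes from Path.GetTempFileName, not from user input.
            using (var connection = new SqlConnection(connectionString))
            using (var insert = new SqlCommand(
                "INSERT INTO YOURTABLE(binaryColumnName) " +
                "SELECT * FROM OPENROWSET(BULK N'" + tempPath + "', SINGLE_BLOB) AS blob",
                connection))
            {
                connection.Open();
                insert.ExecuteNonQuery();
            }
        }
        finally
        {
            // Remove the spooled file whether or not the import succeeded.
            File.Delete(tempPath);
        }
    }
}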
Upvotes: 2