pj_a

Reputation: 21

File lock for interrupted upload doesn't seem to release

I am getting a "The process cannot access the file '' because it is being used by another process." error in our upload handler. We are testing an uploader, built in Silverlight, that sends data in chunks to a handler on the server. Everything works fine until we disrupt the internet connection and then re-enable it (the uploader is meant to resume automatically when the connection comes back). When the handler tries to re-open the file after the connection is restored, the "The process cannot access the file" error appears.

The code that is having the issue is below:

using (FileStream fs = File.Open(context.Server.MapPath("~/Uploads/") + uploadGuidAsString,
                                 FileMode.CreateNew, FileAccess.Write, FileShare.None))
{
    SaveFile(context.Request.InputStream, fs);
    fs.Flush();
}

Upvotes: 2

Views: 302

Answers (2)

Martin James

Reputation: 24897

You need an ID that uniquely identifies the upload session (a GUID?) and a protocol that allows an upload to resume. On reconnection, the uploader sends the ID and the offset in the file at which it wishes to resume. The server can then use the ID to look up the class instance handling that upload and, as Dark Falcon suggests, close the 'old' connection, move the file pointer as requested, and resume streaming/chunking on the new connection. You will probably also need a timeout to remove stale upload-session objects on the server, and your protocol may have to deal with that: what happens if a client requests a resume after the upload session instance has been timed out and freed?
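A minimal sketch of that server-side session table, assuming hypothetical names (`UploadSession`, `UploadSessions`, `Sweep`) and that each session owns the open `FileStream`:

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;

// Hypothetical session object: owns the file handle for one upload GUID.
public class UploadSession
{
    public FileStream Stream { get; private set; }
    public DateTime LastActivityUtc { get; private set; }

    public UploadSession(string path)
    {
        // OpenOrCreate rather than CreateNew so a resume can re-open a partial file.
        Stream = new FileStream(path, FileMode.OpenOrCreate,
                                FileAccess.Write, FileShare.None);
        Touch();
    }

    public void Touch() { LastActivityUtc = DateTime.UtcNow; }

    public void Resume(long offset)
    {
        Touch();
        Stream.Seek(offset, SeekOrigin.Begin);   // move the file pointer as requested
    }

    public void Close() { Stream.Dispose(); }
}

// Hypothetical registry keyed by the upload GUID sent in each request.
public static class UploadSessions
{
    static readonly ConcurrentDictionary<Guid, UploadSession> sessions =
        new ConcurrentDictionary<Guid, UploadSession>();

    public static UploadSession GetOrCreate(Guid id, string path)
    {
        return sessions.GetOrAdd(id, _ => new UploadSession(path));
    }

    // Call periodically (e.g. from a timer) to free stale sessions,
    // which is the timeout Martin mentions above.
    public static void Sweep(TimeSpan maxIdle)
    {
        foreach (var kv in sessions)
        {
            if (DateTime.UtcNow - kv.Value.LastActivityUtc > maxIdle)
            {
                UploadSession s;
                if (sessions.TryRemove(kv.Key, out s)) s.Close();
            }
        }
    }
}
```

Because the registry hands back the same session (and the same open stream) for a given GUID, the resumed request never tries to re-open a file that the dead request still holds locked.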

Rgds, Martin

Upvotes: 0

Dark Falcon

Reputation: 44201

TCP sockets can only detect a broken connection when data is actually being sent over the connection. When your uploader's connection goes down, no more data is sent from the far end. Since your server is not sending any data to the uploader, the server simply waits, expecting more data to eventually arrive from the uploader client.

I would recommend that you give each uploader a unique ID. If you see a second connection attempt from an uploader, manually terminate the first connection.
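A sketch of that idea, assuming a hypothetical helper (`UploadFiles.OpenForUpload`) and that disposing the stale `FileStream` is enough to release the lock for the new request:

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;

// Hypothetical helper: remember the open FileStream per upload ID so a
// reconnecting client can force the stale handle closed before re-opening.
public static class UploadFiles
{
    static readonly ConcurrentDictionary<string, FileStream> openFiles =
        new ConcurrentDictionary<string, FileStream>();

    public static FileStream OpenForUpload(string id, string path)
    {
        FileStream stale;
        if (openFiles.TryRemove(id, out stale))
            stale.Dispose();   // release the lock left behind by the dropped request

        // FileMode.Append instead of CreateNew: a resumed upload re-opens the
        // existing partial file rather than failing because it already exists.
        var fs = new FileStream(path, FileMode.Append, FileAccess.Write, FileShare.None);
        openFiles[id] = fs;
        return fs;
    }
}
```

With this, the second connection attempt with the same ID disposes the first handle instead of colliding with it, which is exactly the "process cannot access the file" failure in the question.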

Upvotes: 1
