Reputation: 2964
I'm trying to do some tests of copying speed on our WAN. As I'd somewhat suspected, the .NET File.Copy(source, dest) call seems to get faster on the second and subsequent runs. I suspect either my corporate network or Windows is doing some crafty caching.
What's the best way to avoid the risk of this happening? Would renaming the source file to a random string each time be sensible, or is there a cleverer way to circumvent it?
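For reference, the kind of timing test I mean looks roughly like this (a minimal sketch; the local path and UNC share path are placeholders):

```csharp
using System;
using System.Diagnostics;
using System.IO;

class CopyTimingTest
{
    static void Main()
    {
        // Placeholder paths - substitute a real file and a real WAN share.
        const string source = @"C:\temp\payload.bin";
        const string dest = @"\\remote-server\share\payload.bin";

        // Copy the same file twice; if caching is in play, the second
        // run typically comes back noticeably faster.
        for (int run = 1; run <= 2; run++)
        {
            if (File.Exists(dest))
                File.Delete(dest);

            var sw = Stopwatch.StartNew();
            File.Copy(source, dest);
            sw.Stop();

            Console.WriteLine("Run {0}: {1} ms", run, sw.ElapsedMilliseconds);
        }
    }
}
```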
Upvotes: 2
Views: 410
Reputation: 2964
I think I'll close this. The best approach is probably to generate a random file (doing something like Creating a Random File in C#) and transfer that.
I also found the caching mainly affected local copying, which was less of a concern than the network copies I was trying to measure.
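A minimal sketch of what I have in mind (the paths and file size are placeholders, and the generation step is just the simplest thing that works):

```csharp
using System;
using System.Diagnostics;
using System.IO;

class RandomFileCopyTest
{
    static void Main()
    {
        // Placeholder paths and size - adjust for the share being tested.
        const string sourceDir = @"C:\temp";
        const string destDir = @"\\remote-server\share";
        const int sizeInBytes = 100 * 1024 * 1024; // 100 MB

        // Generate a file of random bytes with a unique name so nothing
        // along the way can have seen (and cached) it before.
        string fileName = Path.GetRandomFileName();
        string source = Path.Combine(sourceDir, fileName);
        string dest = Path.Combine(destDir, fileName);

        var data = new byte[sizeInBytes];
        new Random().NextBytes(data);
        File.WriteAllBytes(source, data);

        var sw = Stopwatch.StartNew();
        File.Copy(source, dest);
        sw.Stop();

        double mbPerSecond = (sizeInBytes / (1024.0 * 1024.0)) / sw.Elapsed.TotalSeconds;
        Console.WriteLine("Copied {0} in {1} ms ({2:F1} MB/s)",
            fileName, sw.ElapsedMilliseconds, mbPerSecond);

        // Clean up the test files.
        File.Delete(source);
        File.Delete(dest);
    }
}
```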
Upvotes: 2
Reputation:
I believe it is the file caching on the remote system. When a file is requested for the first time, the caching mechanism keeps that file in RAM in anticipation of future requests and serves subsequent requests from RAM. This only reduces the time taken to read the file from local storage and begin serving it; it does not speed up the transfer between the two systems.
Corporations normally deploy cache boxes for serving web-based resources (for intranet and internet); I am not aware of any cache-box mechanism that does the same for file shares.
Upvotes: 0