Reputation: 3673
I have to sync large files across several machines. The files can be up to 6 GB in size. The sync will be done manually every few weeks. I can't take the filename into consideration because it can change at any time.
My plan is to create checksums on the destination PC and on the source PC, and then copy every file whose checksum is not already present on the destination. My first attempt was something like this:
using System;
using System.IO;
using System.Security.Cryptography;

private static string GetChecksum(string file)
{
    using (FileStream stream = File.OpenRead(file))
    {
        SHA256Managed sha = new SHA256Managed();
        byte[] checksum = sha.ComputeHash(stream);
        return BitConverter.ToString(checksum).Replace("-", String.Empty);
    }
}
The problem was the runtime:
- SHA256 with a 1.6 GB file -> 20 minutes
- MD5 with a 1.6 GB file -> 6.15 minutes
Is there a better - faster - way to get the checksum (maybe with a better hash function)?
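For context, the surrounding sync step is essentially a set difference over the checksums. A rough sketch of what I mean (the flat directory layout and copy policy are simplified):

using System.Collections.Generic;
using System.Linq;

private static void SyncMissingFiles(string sourceDir, string destDir)
{
    // Checksums of everything already on the destination.
    var destChecksums = new HashSet<string>(
        Directory.EnumerateFiles(destDir).Select(GetChecksum));

    // Copy each source file whose checksum is not present yet.
    foreach (string sourceFile in Directory.EnumerateFiles(sourceDir))
    {
        if (!destChecksums.Contains(GetChecksum(sourceFile)))
            File.Copy(sourceFile, Path.Combine(destDir, Path.GetFileName(sourceFile)));
    }
}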
Upvotes: 146
Views: 142975
Reputation: 2126
You can have a look at xxHash.NET (https://github.com/wilhelmliao/xxHash.NET).
The xxHash algorithm seems to be faster than all the others.
Some benchmarks are on the xxHash site: https://github.com/Cyan4973/xxHash
PS: I've not used it yet.
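As an aside, here is a minimal sketch using Microsoft's System.IO.Hashing NuGet package instead of the library linked above (an alternative implementation of the same algorithm; I have not benchmarked the two against each other):

using System;
using System.IO;
using System.IO.Hashing; // dotnet add package System.IO.Hashing

private static string GetXxHash64Checksum(string file)
{
    using (FileStream stream = File.OpenRead(file))
    {
        var hasher = new XxHash64();
        hasher.Append(stream); // streams the whole file through the hasher
        return BitConverter.ToString(hasher.GetCurrentHash()).Replace("-", String.Empty);
    }
}

Note that xxHash is a non-cryptographic hash: fine for change detection, not for tamper-proofing.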
Upvotes: 4
Reputation: 1519
As Anton Gogolev noted, FileStream reads 4096 bytes at a time by default, but you can specify any other value using the FileStream constructor:
new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.ReadWrite, 16 * 1024 * 1024)
Note that Brad Abrams from Microsoft wrote in 2004:
there is zero benefit from wrapping a BufferedStream around a FileStream. We copied BufferedStream’s buffering logic into FileStream about 4 years ago to encourage better default performance
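So the constructor buffer alone should do it. A minimal sketch plugging the 16 MB buffer into the question's method (SHA256.Create() stands in for the now-obsolete SHA256Managed; tune the buffer size to your disk):

private static string GetChecksumBuffered(string file)
{
    // 16 MB reads keep the disk streaming instead of issuing 4096-byte requests.
    using (var stream = new FileStream(file, FileMode.Open, FileAccess.Read,
                                       FileShare.ReadWrite, 16 * 1024 * 1024))
    using (var sha = SHA256.Create())
    {
        byte[] checksum = sha.ComputeHash(stream);
        return BitConverter.ToString(checksum).Replace("-", String.Empty);
    }
}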
Upvotes: 58
Reputation: 20775
I know that I am late to the party, but I ran a test before actually implementing the solution.
I tested the built-in MD5 class against md5sum.exe. In my case the built-in class took 13 seconds, while md5sum.exe took around 16-18 seconds on every run.
DateTime current = DateTime.Now;
string file = @"C:\text.iso"; // a 2.5 GB file
string output;
using (var md5 = MD5.Create())
{
    using (var stream = File.OpenRead(file))
    {
        byte[] checksum = md5.ComputeHash(stream);
        output = BitConverter.ToString(checksum).Replace("-", String.Empty).ToLower();
        Console.WriteLine("Total seconds : " + (DateTime.Now - current).TotalSeconds + " " + output);
    }
}
Upvotes: 4
Reputation: 115867
The problem here is that SHA256Managed reads 4096 bytes at a time (inherit from FileStream and override Read(byte[], int, int) to see how much it reads from the FileStream), which is too small a buffer for disk IO.
To speed things up (2 minutes for hashing a 2 GB file on my machine with SHA256, 1 minute for MD5), wrap the FileStream in a BufferedStream and set a reasonably sized buffer (I tried a ~1 MB buffer):
// Not sure if BufferedStream should be wrapped in using block
using (var stream = new BufferedStream(File.OpenRead(filePath), 1200000))
{
    // The rest remains the same
}
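Spelled out against the question's method, the whole thing would look roughly like this (a sketch; disposing the BufferedStream also disposes the underlying FileStream, so the using block is safe):

private static string GetChecksum(string file)
{
    // BufferedStream batches the hash's 4096-byte reads into ~1 MB disk requests.
    using (var stream = new BufferedStream(File.OpenRead(file), 1200000))
    using (var sha = SHA256.Create())
    {
        byte[] checksum = sha.ComputeHash(stream);
        return BitConverter.ToString(checksum).Replace("-", String.Empty);
    }
}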
Upvotes: 135
Reputation: 567
I ran tests with different buffer sizes, using this code:
using (var stream = new BufferedStream(File.OpenRead(file), bufferSize))
{
    SHA256Managed sha = new SHA256Managed();
    byte[] checksum = sha.ComputeHash(stream);
    return BitConverter.ToString(checksum).Replace("-", String.Empty).ToLower();
}
I tested with a file of 29½ GB in size; the results were:
I am running an i5 2500K CPU, 12 GB of RAM and an OCZ Vertex 4 256 GB SSD.
So I thought, what about a standard 2 TB hard drive? The results were:
So I would recommend either no buffer or a buffer of at most 1 million bytes.
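For anyone wanting to repeat the measurement, a harness along these lines would do it (a sketch; the file path and buffer sizes are illustrative, not the values behind my results):

using System.Diagnostics;

string file = @"D:\bigfile.bin"; // hypothetical test file
foreach (int bufferSize in new[] { 10 * 1000, 100 * 1000, 1000 * 1000, 10 * 1000 * 1000 })
{
    var timer = Stopwatch.StartNew();
    using (var stream = new BufferedStream(File.OpenRead(file), bufferSize))
    {
        SHA256Managed sha = new SHA256Managed();
        sha.ComputeHash(stream); // hash discarded; we only time the run
    }
    Console.WriteLine(bufferSize + " bytes: " + timer.Elapsed);
}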
Upvotes: 10
Reputation: 3673
OK, thanks to all of you. Let me wrap this up:
Upvotes: 19
Reputation: 2500
You're doing something wrong (probably a too-small read buffer). On a machine of indecent age (Athlon 2x1800MP from 2002) with DMA on the disk probably out of whack (6.6 MB/s is damn slow when doing sequential reads):
Create a 1G file with "random" data:
# dd if=/dev/sdb of=temp.dat bs=1M count=1024
1073741824 bytes (1.1 GB) copied, 161.698 s, 6.6 MB/s
# time sha1sum -b temp.dat
abb88a0081f5db999d0701de2117d2cb21d192a2 *temp.dat
1m5.299s
# time md5sum -b temp.dat
9995e1c1a704f9c1eb6ca11e7ecb7276 *temp.dat
1m58.832s
This is also weird: md5 is consistently slower than sha1 for me (re-ran it several times).
Upvotes: 2
Reputation: 526
Invoke the Windows port of md5sum.exe. It's about twice as fast as the .NET implementation (at least on my machine, using a 1.2 GB file).
using System.Diagnostics;

public static string Md5SumByProcess(string file)
{
    var p = new Process();
    p.StartInfo.FileName = "md5sum.exe";
    p.StartInfo.Arguments = file;
    p.StartInfo.UseShellExecute = false;
    p.StartInfo.RedirectStandardOutput = true;
    p.Start();
    p.WaitForExit();
    // Output looks like "\<hash> *<path>"; Substring(1) drops the leading
    // backslash md5sum emits when the path contains backslashes.
    string output = p.StandardOutput.ReadToEnd();
    return output.Split(' ')[0].Substring(1).ToUpper();
}
Upvotes: 24
Reputation: 51719
Don't checksum the entire file; create a checksum every 100 MB or so, so each file has a collection of checksums.
Then when comparing checksums, you can stop at the first different checksum, getting out early and saving you from processing the entire file (see the sketch below).
It will still take the full time for identical files.
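A minimal sketch of the chunked hashing (the 100 MB chunk size and all names are illustrative; MD5 is used since collision resistance doesn't matter for change detection):

using System;
using System.Collections.Generic;
using System.IO;
using System.Security.Cryptography;

private static List<string> GetChunkChecksums(string file)
{
    const int chunkSize = 100 * 1024 * 1024; // 100 MB per chunk
    var checksums = new List<string>();
    var buffer = new byte[chunkSize];
    using (var stream = File.OpenRead(file))
    using (var md5 = MD5.Create())
    {
        while (true)
        {
            // Fill the buffer completely so chunk boundaries line up on
            // both machines (Read may return fewer bytes than requested).
            int filled = 0, read;
            while (filled < chunkSize &&
                   (read = stream.Read(buffer, filled, chunkSize - filled)) > 0)
                filled += read;
            if (filled == 0) break; // end of file
            checksums.Add(BitConverter.ToString(md5.ComputeHash(buffer, 0, filled))
                                      .Replace("-", String.Empty));
        }
    }
    return checksums;
}

Comparing two files is then list comparison with an early exit: different lengths or a mismatching chunk checksum means the files differ.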
Upvotes: 77