Reputation: 3072
I have a strange problem. In my code I create a SHA512 hash algorithm and sign some data with it. Now I would expect the hash to be 512 bits long, but it's 128. What mistake have I made?
My Code:
var hashAlgorithm = HashAlgorithm.Create("SHA512");
var signedhHash = rsaCryptoServiceProvider.SignData( plainData, hashAlgorithm );
PS. I’ve loaded the RSA keys from a file, which I’ve created with the following script:
makecert -r -n "CN=myCert" -sky exchange -sy 24 -sv myCert.pvk myCert.cer
cert2spc myCert.cer myCert.spc
pvk2pfx -pvk myCert.pvk -spc myCert.spc -pfx myCert.pfx -f
Edit: I got the length from signedhHash.Length, which is equal to 128.
Upvotes: 4
Views: 1278
Reputation: 46947
The hash value is 512 bits in size, but the signed value returned by SignData is 1024 bits: the size of the RSA key, not of the hash.
var alg = HashAlgorithm.Create("SHA512");
var hashArr = alg.ComputeHash(Encoding.UTF8.GetBytes("test"));
var size = hashArr.Length * 8; // 512 bits: the SHA-512 digest size
var rsaCryptoServiceProvider = new RSACryptoServiceProvider();
var signedValue = rsaCryptoServiceProvider.SignData(Encoding.UTF8.GetBytes("test"), alg);
size = signedValue.Length * 8; // 1024 bits: the default RSA key size
(1 byte = 8 bits)
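To see that the signature length tracks the key size rather than the hash size, here is a minimal sketch (assuming .NET Framework's RSACryptoServiceProvider, whose constructor accepts an explicit key size in bits):

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

class SignatureSizeDemo
{
    static void Main()
    {
        byte[] data = Encoding.UTF8.GetBytes("test");
        foreach (int keySize in new[] { 1024, 2048 })
        {
            using (var rsa = new RSACryptoServiceProvider(keySize))
            using (var sha512 = SHA512.Create())
            {
                byte[] signature = rsa.SignData(data, sha512);
                // The signature length equals the RSA modulus size,
                // regardless of which hash algorithm was used.
                Console.WriteLine("{0}-bit key -> {1}-byte signature",
                    keySize, signature.Length);
            }
        }
    }
}
```

A 1024-bit key gives a 128-byte signature (your case); a 2048-bit key gives 256 bytes.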
Upvotes: 3
Reputation: 8402
SignData does not return the hash computed with the SHA512 algorithm; it returns the signature for that hash. (See also the provider's SignatureAlgorithm property, http://www.w3.org/2000/09/xmldsig#rsa-sha1.)
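Because the result is a signature rather than a hash, the way to check it is VerifyData with the same hash algorithm, not comparing lengths. A short sketch, assuming the same RSACryptoServiceProvider setup as in the question:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

class VerifyDemo
{
    static void Main()
    {
        byte[] data = Encoding.UTF8.GetBytes("test");
        using (var rsa = new RSACryptoServiceProvider())
        using (var sha512 = SHA512.Create())
        {
            // Sign the data, then verify the signature over the same data.
            byte[] signature = rsa.SignData(data, sha512);
            bool valid = rsa.VerifyData(data, sha512, signature);
            Console.WriteLine(valid);
        }
    }
}
```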
Upvotes: 0