pascalbros

Reputation: 16442

Problem with SHA-256 in Objective-C

When I use this code to generate a SHA-256 hash in my iPhone app:

unsigned char hashedChars[32];
NSString *inputString = [NSString stringWithFormat:@"hello"];
CC_SHA256([inputString UTF8String],
          [inputString lengthOfBytesUsingEncoding:NSASCIIStringEncoding],
          hashedChars);
NSData *hashedData = [NSData dataWithBytes:hashedChars length:32];

The SHA-256 hash of inputString is created correctly, but if I use a string like @"\x00\x25\x53\b4", the hash is different from the one computed over the actual "\x" bytes. I think the problem is the "UTF8" encoding instead of ASCII. Thanks!

Upvotes: 0

Views: 840

Answers (2)

tc.

Reputation: 33592

You're getting the bytes with [inputString UTF8String] but the length with [inputString lengthOfBytesUsingEncoding:NSASCIIStringEncoding], so the two encodings don't match. Moreover (assuming you mean "\xB4"), that byte is outside ASCII, so the ASCII conversion fails entirely. The docs for NSString say

Returns 0 if the specified encoding cannot be used to convert the receiver

So lengthOfBytesUsingEncoding: returns 0, and you're calculating the hash of the empty string. Of course it's wrong.
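
You can check this directly; a minimal sketch, where the @"\u00B4" literal stands in for the question's "\xB4" byte:

NSString *s = @"hello \u00B4"; // U+00B4 is not representable in ASCII
// The conversion fails, so the method returns 0 per the docs quoted above.
NSLog(@"%lu", (unsigned long)[s lengthOfBytesUsingEncoding:NSASCIIStringEncoding]);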

You're less likely to have problems if you only generate the data once:

NSData *inputData = [inputString dataUsingEncoding:NSUTF8StringEncoding];
// Cast to CC_LONG (uint32_t): NSUInteger is 64-bit on modern targets.
CC_SHA256(inputData.bytes, (CC_LONG)inputData.length, hashedChars);
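
And if the input is really a sequence of raw bytes like 0x00 0x25 0x53 0xB4, you can skip NSString entirely and build the NSData from the bytes themselves; a minimal sketch (the hex-printing loop is just a convenience I've added for inspecting the digest):

#import <CommonCrypto/CommonDigest.h>

// Hash the raw bytes directly; no string encoding is involved, so
// embedded NULs and non-ASCII bytes survive intact.
const unsigned char rawBytes[] = {0x00, 0x25, 0x53, 0xB4};
NSData *inputData = [NSData dataWithBytes:rawBytes length:sizeof(rawBytes)];

unsigned char hashedChars[CC_SHA256_DIGEST_LENGTH];
CC_SHA256(inputData.bytes, (CC_LONG)inputData.length, hashedChars);

// Print the digest as hex for inspection.
NSMutableString *hex = [NSMutableString stringWithCapacity:2 * CC_SHA256_DIGEST_LENGTH];
for (int i = 0; i < CC_SHA256_DIGEST_LENGTH; i++) {
    [hex appendFormat:@"%02x", hashedChars[i]];
}
NSLog(@"%@", hex);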

Upvotes: 1

Jeff

Reputation: 334

I would be suspicious of the first character, "\x00" - that's going to terminate anything that thinks it's dealing with "regular C strings".

Not sure whether lengthOfBytesUsingEncoding: takes that stuff into account, but it's something I'd experiment with.
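
A quick sketch to experiment with: lengthOfBytesUsingEncoding: operates on the NSString's characters rather than a C string, so it should count an embedded NUL, while anything strlen-based stops at it (expected results are in the comments):

unichar chars[] = {0x0000, 0x0041}; // a two-character string, "\0" then "A"
NSString *s = [NSString stringWithCharacters:chars length:2];

NSLog(@"%lu", (unsigned long)s.length);            // 2
NSLog(@"%zu", strlen([s UTF8String]));             // 0 - strlen stops at the embedded NUL
NSLog(@"%lu", (unsigned long)[s lengthOfBytesUsingEncoding:NSUTF8StringEncoding]); // 2 - the NUL is counted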

Upvotes: 1
