nik

Reputation: 2329

XOR Encryption in Swift iOS

Trying to convert the Objective-C XOR encryption method below to Swift, but getting an error like "Could not find an overload for 'subscript' that accepts the supplied arguments". Any help would be appreciated.

Objective-C

+(NSString *) encryptDecrypt:(NSString *)input staticKey:(NSString *) staticKey
{
    const char *key = [staticKey UTF8String]; // Can be any chars, and any size array
    NSMutableString *output = [[NSMutableString alloc] init];

    for(int i = 0; i < input.length; i++) {
        char c = [input characterAtIndex:i];
        c ^= key[i % sizeof(key)/sizeof(char)];
        [output appendString:[NSString stringWithFormat:@"%c", c]];
    }
    return output;
} 

Swift

func encryptDecrypt(input: String, staticKey: String) -> String {
    let cstr = staticKey.cStringUsingEncoding(NSUTF8StringEncoding)
    var output: NSMutableString = NSMutableString()
    for (index, element) in enumerate(input) {
        // for var i = 0; i < input.length; i++ {
        var c: Character = element
        let char = c ^ cstr[index % sizeof(cstr) / sizeof(Character)]
        output.appendString("\(c)")
    }
    return output as String
}

Upvotes: 0

Views: 2545

Answers (1)

hennes

Reputation: 9342

Disclaimer: As explained in the comments, using this kind of bit manipulation on UTF-8 strings is unsafe and will not work as expected for arbitrary inputs.

I'm actually not sure whether the original Objective-C code does what you want. sizeof(key) is the size of a char pointer (8 on my platform), not the length of the UTF8 array, and sizeof(char) is always 1. You probably want to use strlen instead, i.e. key[i % strlen(key)].

Anyway, the equivalent of the (corrected) Objective-C code in Swift 2 could look like this

func encryptDecrypt(input: String, staticKey: String) -> String? {
    let key = staticKey.utf8
    // XOR each UTF-8 byte of the input with a key byte,
    // cycling through the key via the modulo operator.
    let bytes = input.utf8.enumerate().map({
        $1 ^ key[key.startIndex.advancedBy($0 % key.count)]
    })
    // Returns nil if the XORed bytes are not valid UTF-8.
    return String(bytes: bytes, encoding: NSUTF8StringEncoding)
}

The test snippet

let key = "12345"
let string = "abcdefghijklmnopqrstuvwxyz"

let encrypted = encryptDecrypt(string, staticKey: key)!
let decrypted = encryptDecrypt(encrypted, staticKey: key)!

print(string)
print(encrypted)
print(decrypted)

will print out

abcdefghijklmnopqrstuvwxyz
PPPPPWU[]_Z^^ZZACAGADDDLLK
abcdefghijklmnopqrstuvwxyz
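
Note that, as the disclaimer above says, this only round-trips because the XORed bytes of this particular ASCII string and key happen to remain valid UTF-8. A single accented character is already enough to make the String initializer fail; for example, with the Swift 2 version above:

// "é" is 0xC3 0xA9 in UTF-8. XORed with "1" and "2" those bytes are no
// longer a valid UTF-8 sequence, so String(bytes:encoding:) returns nil.
let accented = encryptDecrypt("é", staticKey: "12345")   // nil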

For Swift 1.2 you'll have to make a couple of small adaptations:

func encryptDecrypt(input: String, staticKey: String) -> String? {
    let key = staticKey.utf8
    let keyLength = distance(key.startIndex, key.endIndex)
    let bytes = map(enumerate(input.utf8)) {
        $1 ^ key[advance(key.startIndex, $0 % keyLength)]
    }
    return String(bytes: bytes, encoding: NSUTF8StringEncoding)
}

Update: The following snippet is closer to the original Objective-C code and works for arbitrary strings:

func encryptDecrypt(input: NSString, staticKey: NSString) -> NSString? {
    let chars = (0..<input.length).map({
        input.characterAtIndex($0) ^ staticKey.characterAtIndex($0 % staticKey.length)
    })
    return NSString(characters: chars, length: chars.count)
}
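
For illustration, a quick round-trip check with a non-ASCII input (analogous to the test snippet above; the strings are arbitrary examples) might look like this:

let key = "12345"
let original = "héllo wörld"

let encrypted = encryptDecrypt(original, staticKey: key)!
let decrypted = encryptDecrypt(encrypted, staticKey: key)!
// decrypted is equal to original again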

Upvotes: 2
