Megan Spencer

Reputation: 339

How to make golang standardize unicode strings that have multiple ways to be encoded?

The same Unicode character can often be represented by more than one code-point sequence, which is annoying when writing software. For example, the following string can be encoded with two different rune sequences:

νῦν: 957 965 834 957 
νῦν: 957 8166 957 

Is there a function in golang that can standardize strings into one canonical encoding? I assume something like composing 965 834 into 8166.

Sample code for anyone interested in this:

package main

import "fmt"

func main() {
    //r1 := "νῦν"
    //r2 := "νῦν"
    r1 := []rune{957, 965, 834, 957}
    r2 := []rune{957, 8166, 957}


    // Print each string followed by its rune count and the individual rune values.
    fmt.Printf("%s %d: ", string(r1), len(r1))
    for _, r := range r1 {
        fmt.Printf("%d ", r)
    }
    fmt.Printf("\n")

    fmt.Printf("%s %d: ", string(r2), len(r2))
    for _, r := range r2 {
        fmt.Printf("%d ", r)
    }
    fmt.Printf("\n")
}

Upvotes: 3

Views: 2511

Answers (1)

Megan Spencer

Reputation: 339

The golang.org/x/text/unicode/norm package can be used to convert a string to a canonical normalization form, e.g. NFC (composed):

import "golang.org/x/text/unicode/norm"

func fixUnicode(in string) string {
    return norm.NFC.String(in) // composes sequences like U+03C5 U+0342 into U+1FE6
}

Upvotes: 6
