daisy

Reputation: 23561

golang HTML charset decoding

I'm trying to decode HTML pages that are NOT utf-8 encoded.

<meta http-equiv="Content-Type" content="text/html; charset=gb2312">

Is there any library that can do that? I couldn't find one online.

P.S. Of course, I can extract the charset and decode the HTML page with goquery and iconv-go, but I'm trying not to reinvent the wheel.

Upvotes: 3

Views: 4203

Answers (2)

zhengchun

Reputation: 1291

Go officially provides the extension packages golang.org/x/net/html/charset and golang.org/x/text/encoding for this.
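
In many cases charset.NewReader from golang.org/x/net/html/charset is enough on its own: it sniffs the encoding from the Content-Type header, a <meta> tag, or the initial bytes of the body, and returns a reader that yields UTF-8. A minimal sketch (fetchAndParse is my own helper around net/http, not part of the packages):

import (
    "net/http"

    "golang.org/x/net/html"
    "golang.org/x/net/html/charset"
)

// fetchAndParse fetches url and parses it into an HTML tree,
// letting charset.NewReader handle any non-UTF-8 encoding.
func fetchAndParse(url string) (*html.Node, error) {
    resp, err := http.Get(url)
    if err != nil {
        return nil, err
    }
    defer resp.Body.Close()

    // NewReader detects the charset and wraps the body so that
    // reads return UTF-8.
    r, err := charset.NewReader(resp.Body, resp.Header.Get("Content-Type"))
    if err != nil {
        return nil, err
    }
    return html.Parse(r)
}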

If you want to control the detection yourself, the code below makes sure the document can be parsed correctly by the html package:

import (
    "bufio"
    "io"

    "golang.org/x/net/html"
    "golang.org/x/net/html/charset"
    "golang.org/x/text/encoding/htmlindex"
)

// detectContentCharset sniffs the encoding from the first bytes of
// the document. Peek does not consume input, so the caller can
// still parse the full document from the same *bufio.Reader.
func detectContentCharset(r *bufio.Reader) string {
    // Peek returns an error (e.g. io.EOF) for documents shorter
    // than 1024 bytes, but the bytes it did read are still usable.
    data, _ := r.Peek(1024)
    if _, name, certain := charset.DetermineEncoding(data, ""); certain {
        return name
    }
    return "utf-8"
}

// Decode parses the HTML body with the specified encoding and
// returns the document tree. Pass "" to auto-detect the charset.
func Decode(body io.Reader, name string) (*html.Node, error) {
    if name == "" {
        r := bufio.NewReader(body)
        name = detectContentCharset(r)
        body = r // keep the peeked bytes for parsing
    }
    e, err := htmlindex.Get(name)
    if err != nil {
        return nil, err
    }
    if n, _ := htmlindex.Name(e); n != "utf-8" {
        body = e.NewDecoder().Reader(body)
    }
    return html.Parse(body)
}
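
A caller then just hands Decode the HTTP response body; a minimal sketch (the URL is a placeholder, and net/http and log are assumed to be imported):

resp, err := http.Get("http://example.com/")
if err != nil {
    log.Fatal(err)
}
defer resp.Body.Close()

// Pass "" so Decode sniffs the charset itself.
node, err := Decode(resp.Body, "")
if err != nil {
    log.Fatal(err)
}
_ = node // walk the *html.Node tree from here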

Upvotes: 3

yee

Reputation: 1985

goquery may meet your needs, for example:

import "https://github.com/PuerkitoBio/goquery"

func main() {
    d, err := goquery.NewDocument("http://www.google.com")
    dh := d.Find("head")
    dc := dh.Find("meta[http-equiv]")
    c, err := dc.Attr("content") // get charset
    // ...
}

More operations can be found on the Document type.
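
Note that this only extracts the charset string; to actually decode the body you still need an encoding package, for example htmlindex from golang.org/x/text/encoding/htmlindex as in the other answer. A sketch (decodeBody is my own helper; parsing the charset name out of the content attribute is omitted):

import (
    "io"

    "golang.org/x/text/encoding/htmlindex"
)

// decodeBody wraps body so that reads yield UTF-8, given a
// charset name such as "gb2312" extracted from the meta tag.
func decodeBody(body io.Reader, name string) (io.Reader, error) {
    e, err := htmlindex.Get(name)
    if err != nil {
        return nil, err
    }
    return e.NewDecoder().Reader(body), nil
}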

Upvotes: 0
