Wingzero

Reputation: 9754

Python: how to encode 0x90 (\x90) as macOS Roman to \xc3\xaa

In Python, given the byte 0x90 (or \x90), how do I interpret it as macOS Roman and turn it into the string \xc3\xaa, aka ê?

I tried bytes('\x90').encode('mac-roman'), but it just throws an error.

chr(0x90).encode('mac-roman') fails in the same way.

Please help, thanks!

Upvotes: 0

Views: 2504

Answers (2)

Wingzero

Reputation: 9754

Actually, I found the answer after checking two pages. http://unicode.org/Public/MAPPINGS/VENDORS/APPLE/ROMAN.TXT shows:

0x90 0x00EA # LATIN SMALL LETTER E WITH CIRCUMFLEX

and http://www.fileformat.info/info/unicode/char/ea/index.htm shows:

UTF-8 (hex) 0xC3 0xAA (c3aa)

UTF-8 (binary) 11000011:10101010

UTF-16 (hex) 0x00EA (00ea)

After trying a few decode/encode combinations, I found the answer is:

'\x90'.decode('mac-roman').encode('utf-8')
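That one-liner is Python 2, where str is a byte string. In Python 3 the bytes/str split makes the same two steps explicit; a minimal sketch:

```python
# Python 3 equivalent of '\x90'.decode('mac-roman').encode('utf-8'):
# start from a bytes literal, decode with the Mac OS Roman codec,
# then re-encode the resulting str as UTF-8.
raw = b'\x90'                     # the Mac OS Roman byte
text = raw.decode('mac_roman')    # 'ê' (U+00EA)
utf8 = text.encode('utf-8')       # b'\xc3\xaa'
print(text, utf8)
```

('mac-roman' and 'mac_roman' are aliases for the same codec; Python normalizes hyphens and underscores in codec names.)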

Upvotes: 1

keepAlive

Reputation: 6665

It depends on what your final goal is, but the byte is already encoded, so you first need to decode it, e.g.

chr(0x90).decode('mac-roman')

You were almost there.
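Note that this form only works in Python 2, where str is a byte string with a .decode() method. In Python 3, chr(0x90) produces the one-character text string U+0090, which cannot be decoded; a hedged Python 3 sketch of the same idea:

```python
# In Python 3, str has no .decode(); build a bytes object from the
# code point first, then decode it with the Mac OS Roman codec.
b = bytes([0x90])               # b'\x90'
text = b.decode('mac_roman')    # 'ê'
print(text)
```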

Upvotes: 1
