Michael Lynch

Reputation: 3139

How can I render Unicode in JavaScript using a variable?

It works when I just use a string directly, but not with a variable.

Works

console.log(`\u{1F436}`);

Doesn't Work

const unicode = '1F436';

console.log(`\u{ ${unicode} }`);

I get this error:

Parsing error: Invalid escape sequence in template

What am I missing? How can I properly escape this?

Upvotes: 2

Views: 1767

Answers (2)

user149341


\u can only be used as part of a Unicode escape sequence written literally in the source code. It can't be built up through interpolation.
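
For example, the parser resolves the escape before any interpolation runs, so only literal hex digits can appear inside the braces:

console.log('\u{1F436}');         // 🐶 works: the escape is resolved at parse time
// console.log(`\u{${'1F436'}}`); // SyntaxError: the parser rejects the escape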

If you want to reference a Unicode codepoint by value, use String.fromCodePoint():

const c = 0x1f436; // numeric codepoint for U+1F436 DOG FACE
// or, if you're starting from a string: const c = parseInt("1F436", 16);

console.log(String.fromCodePoint(c)); // 🐶
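
Putting those two lines together with the question's variable, a minimal sketch of the same approach:

const unicode = '1F436';                                   // hex string, as in the question
const emoji = String.fromCodePoint(parseInt(unicode, 16)); // convert, then build the character
console.log(emoji); // 🐶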

Upvotes: 4

I'm Joe Too

Reputation: 5840

The reason it doesn't work is that \u must be followed by literal hexadecimal digits (exactly four, or a codepoint in braces like \u{1F436}), and those digits have to appear in the source text itself. The error comes from the JavaScript parser rather than React, which is why I'm getting it as well. But there is a solution:

// Option 1 - use the braced escape directly in a template literal:
<p>{`\u{1F436}`}</p>

// Option 2 - put the full escape in the variable and render that variable:
const unicode = "\u{1F436}";
<p>{unicode}</p>
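
If the codepoint actually arrives as a hex string, as in the question, the String.fromCodePoint() approach from the other answer works inside JSX too. A minimal sketch, assuming a React function component (the name DogFace is hypothetical):

function DogFace() {
  const hex = '1F436';                                     // codepoint as a plain string
  return <p>{String.fromCodePoint(parseInt(hex, 16))}</p>; // renders 🐶
}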

Upvotes: 0
