Reputation: 3139
It works when I just use a string directly, but not with a variable.
console.log(`\u{1F436}`);
const unicode = '1F436';
console.log(`\u{ ${unicode} }`);
I get this error:
Parsing error: Invalid escape sequence in template
What am I missing? How can I properly escape this?
Upvotes: 2
Views: 1767
Reputation:
\u can only be used as part of a Unicode escape sequence written literally in the source code; it can't be built up through interpolation.
If you want to reference a Unicode code point by value, use String.fromCodePoint():
var c = 0x1f436;
// or if you wanted to start with a string: c = parseInt("1F436", 16)
console.log(String.fromCodePoint(c));
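For the original example, where the code point starts out as the hex string '1F436', a minimal sketch of combining this with a template literal:
const unicode = '1F436';
// Parse the hex string into a number, then turn it into the character
const dog = String.fromCodePoint(parseInt(unicode, 16));
console.log(`Here is the emoji: ${dog}`); // "Here is the emoji: 🐶"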
Upvotes: 4
Reputation: 5840
The reason it doesn't work is that \u
expects hexadecimal digits immediately after it in the source, but you're trying to supply them through interpolation. I'm guessing the exact error message has to do with how React is parsing it, because I'm getting an error as well. But there is a solution:
// Option 1 - write the escape directly inside a template literal
// (U+1F436 is above U+FFFF, so it needs the braced \u{...} form, not the four-digit \uXXXX form):
<p>{`\u{1F436}`}</p>
// Option 2 - put the full escaped character in the variable and output that variable:
const unicode = "\u{1F436}";
<p>{unicode}</p>
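Put together, both options could look like this in a small component (a sketch; the Dog component name is just for illustration, and the fragment syntax assumes React 16.2+):
// Minimal component showing both ways of rendering the emoji
function Dog() {
  const unicode = "\u{1F436}";
  return (
    <>
      <p>{`\u{1F436}`}</p> {/* Option 1: escape written directly */}
      <p>{unicode}</p>     {/* Option 2: escape stored in a variable */}
    </>
  );
}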
Upvotes: 0