Reputation: 2281
Is it possible to create an invalid UTF-8 string using JavaScript?
Every solution I've found relies on String.fromCharCode, which generates undefined rather than an invalid string. I've seen mention of errors being generated by ill-formed UTF-8 strings (e.g. https://developer.mozilla.org/en-US/docs/Web/API/WebSocket#send()), but I can't figure out how you would actually create one.
Upvotes: 5
Views: 3614
Reputation: 90736
One way to generate an invalid UTF-8 string with JavaScript is to take an emoji and chop off its last UTF-16 code unit, leaving an unpaired surrogate at the end.
For example, this will be an invalid UTF-8 string (each 🐶 is two code units, so keeping five splits the last one in half):
const invalidUtf8 = '🐶🐶🐶'.substr(0, 5);
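To see that the result really is ill-formed, here is a quick check of my own (not part of the original answer), assuming an environment that provides encodeURIComponent and TextEncoder:

const invalidUtf8 = '🐶🐶🐶'.substr(0, 5); // ends with an unpaired high surrogate

// encodeURIComponent has to emit percent-encoded UTF-8, so a lone surrogate throws.
try {
  encodeURIComponent(invalidUtf8);
} catch (e) {
  console.log(e.name); // "URIError"
}

// TextEncoder does not throw; it substitutes U+FFFD (bytes EF BF BD) for the lone surrogate.
console.log(new TextEncoder().encode(invalidUtf8));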
Upvotes: 5
Reputation: 20772
A string in JavaScript is a counted sequence of UTF-16 code units. There is an implicit contract that the code units represent Unicode codepoints. Even so, it is possible to represent any sequence of UTF-16 code units—even unpaired surrogates.
I find that String.fromCharCode(0xd801) displays as the replacement character, which seems quite reasonable (rather than returning undefined). Any text function might do that, but for efficiency reasons I'm sure that many text manipulations just pass invalid sequences through unless the manipulation requires interpreting them as codepoints.
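A quick check of my own (not part of the original answer) shows what the engine actually stores in that case; whether it is shown as the replacement glyph is up to the console and font:

const lone = String.fromCharCode(0xD801);
console.log(lone.length);                      // 1; the unpaired surrogate is kept as-is
console.log(lone.charCodeAt(0).toString(16));  // "d801"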
The easiest way to create such a string is with a string literal, for example "\uD83D \uDEB2", "\uD83D", or "\uDEB2", instead of the valid "\uD83D\uDEB2".
"\uD83D \uDEB2".replace(" ","")
actually does return "\uD83D\uDEB2"
("🚲"
) but I don't think you should count on anything good coming from a string that isn't a valid UTF-16 encoding of Unicode codepoints.
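A small sketch of my own illustrating those literals (the variable names are just for this example):

// Each of these contains at least one unpaired surrogate, so none of them
// can be encoded as well-formed UTF-8.
const loneHigh = "\uD83D";        // high surrogate with no low surrogate after it
const loneLow  = "\uDEB2";        // low surrogate with no high surrogate before it
const split    = "\uD83D \uDEB2"; // the pair separated by a space

const valid = "\uD83D\uDEB2";     // a proper surrogate pair
console.log(valid === "🚲");                    // true
console.log(split.replace(" ", "") === valid);  // true; rejoining the pair repairs it
// encodeURIComponent(loneHigh); // would throw URIError, since the string has no UTF-8 encoding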
Upvotes: 3