Escape character encoding

If I have a string with escape characters on the server side and I want to push it to the client, I noticed that those characters are converted to 8-hex-digit (32-bit) Unicode escapes before being pushed. For example, "\uFEFFHello World" is converted to "\u0000FEFFHello World" and then pushed to the client. When the client receives the JSON, say {"testString":"\u0000FEFFHello World"}, it parses it in JavaScript by calling JSON.parse(), and the parsed strings are stored in a param ValueMap.
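
Described in code, the conversion I am seeing behaves roughly like this (just a JavaScript sketch of the observed behavior, not the actual server-side code; expandEscape is a hypothetical name):

function expandEscape(jsonText) {
    // Widen every 4-digit \uXXXX escape in the serialized JSON to the
    // 8-digit form that shows up on the wire.
    return jsonText.replace(/\\u([0-9A-Fa-f]{4})/g, function (match, hex) {
        return '\\u0000' + hex.toUpperCase();
    });
}

// expandEscape('{"testString":"\\uFEFFHello World"}')
//   returns '{"testString":"\\u0000FEFFHello World"}'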

But I don't quite understand the purpose of converting a single Unicode escape into an 8-hex-digit literal. Without the conversion, JSON.parse() recognizes the escape and parses it correctly. With the conversion, the second half of the escape gets parsed as regular characters. Below is my test code and output:

var jsonStr = '{"testString":"\\u0000feffHello World"}'; // double backslash so the \u escape reaches JSON.parse, as it does in the payload received from the server
var json = JSON.parse(jsonStr);
alert('json=' + json['testString']);
OUTPUT: json= feffHello World

var jsonStr = '{"testString":"\\ufeffHello World"}'; // standard 4-digit escape, also kept at the JSON level
var json = JSON.parse(jsonStr);
alert('json=' + json['testString']);
OUTPUT: json= Hello World
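
To make the difference visible beyond the alert, the character codes of the parsed value can be inspected (a minimal sketch, again with the escape kept at the JSON level):

var parsed = JSON.parse('{"testString":"\\u0000FEFFHello World"}');
var s = parsed['testString'];
// The leading character is U+0000, and "FEFF" survives as four ordinary
// characters instead of being consumed as part of the escape.
alert(s.charCodeAt(0));   // 0
alert(s.slice(1, 5));     // FEFF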

Is this the intended behavior? Or am I missing some setting to display the text correctly on the client side? I did some further research, and it looks like JavaScript can handle Unicode code point escapes like this: \u{0000feff}. Shouldn't JsonUtil.escapeStringAsUnicode() convert the escape character into that format instead of \u0000feff?
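
For comparison, here is a minimal sketch of an escaping step that emits standard 4-digit JSON escapes (an illustration only, not the actual JsonUtil implementation; escapeNonAscii is a hypothetical name):

function escapeNonAscii(str) {
    // Replace every character outside printable ASCII with a 4-digit
    // \uXXXX escape. Characters above U+FFFF are already stored as
    // surrogate pairs in JavaScript strings, so each surrogate gets
    // its own escape.
    return str.replace(/[^\x20-\x7E]/g, function (ch) {
        return '\\u' + ('0000' + ch.charCodeAt(0).toString(16).toUpperCase()).slice(-4);
    });
}

// escapeNonAscii('\uFEFFHello World') returns the literal characters
// \uFEFFHello World (backslash, u, F, E, F, F, then "Hello World"),
// which JSON.parse turns back into the original BOM-prefixed string.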

Thanks.