Characters with code > 128 are not read correctly in JavaScript

I have an HTML page that includes a JavaScript file. The script contains a special character with code 152. When I try to display its charCodeAt value, I get different results depending on the encoding, but never the correct one. Could you give me some advice? Thanks.

test.html

<script type="text/javascript" charset=SEE BELOW src="test.js">
</script>


TEST.JS file with ANSI encoding

function d(a) {
  a = (a + "").split("");
  alert(a[1].charCodeAt(0));
}
d("i˜g"); // Note that ˜ is character code 152


  • TEST.HTML with x-user-defined charset: the alert shows 63384. It works with %63232, since every char > 128 is displayed as 63232 + char.
  • TEST.HTML with utf-8 charset: the alert shows 65533. All characters > 128 are displayed as 65533.
  • TEST.HTML with Windows-1252 charset: the alert shows 752. I cannot find the relationship between the character code and what is displayed.

TEST.JS file with UTF-8 encoding

function d(a) {
  a = (a + "").split("");
  alert(a[1].charCodeAt(0));
}
d("i[x98]g"); // Note that x98 is 152


  • TEST.HTML with x-user-defined charset: the alert shows 65533. All characters > 128 are displayed as 65533.
  • TEST.HTML with utf-8 charset: the alert shows 65533. All characters > 128 are displayed as 65533.
  • TEST.HTML with Windows-1252 charset: the alert shows 65533. All characters > 128 are displayed as 65533.


2 answers


In UTF-8 there are no single-byte characters in the range 128–255; plain ASCII ends at 127. Also, the character at position 1 in "i[x98]g" is "[", and "[x98]" does not make sense as an escape sequence.

Your function can be replaced with str.charCodeAt(1).



The character ˜ is Unicode Character 'SMALL TILDE' (U+02DC) and can be written as "\u02DC" or String.fromCharCode(732).
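A small sketch confirming that both spellings produce the same character:

```javascript
console.log("\u02DC".charCodeAt(0));                // 732 (0x02DC)
console.log(String.fromCharCode(732) === "\u02DC"); // true
// Comparing against a literal ˜ also works, but only if the source
// file itself is read with the encoding it was saved in.
```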



ASCII only has 128 characters (codes 0–127), so character 152 does not exist in ASCII.




