Why do I get a length of 5 logged to the console when the number has more than 21 digits?
var metricUnits = function(num) {
  console.log(num.toString().length);
};
// This works, gives me 21
metricUnits(900000000000000000000);
// But this fails, gives me 5
metricUnits(9000000000000000000000);
When I call this function with the first argument, 21 is written to the console. However, when I append one more zero to the argument, 5 is printed instead. Why is this?
If you logged the number itself to the console, the reason would be obvious: above a certain limit, numbers are rendered in scientific notation when converted to their string representation. In your second case, num.toString() yields 9e+21, which is 5 characters long.

Detailed information about the limit can be found in this question.
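To see the cutoff in action: per the ECMAScript number-to-string algorithm, values of magnitude 1e21 and above are stringified in exponential form, while smaller values get their full decimal expansion.

```javascript
// Below 1e21, toString() produces the full decimal expansion;
// at 1e21 and above, it switches to exponential notation.
console.log((9e20).toString());        // "900000000000000000000" (21 digits)
console.log((9e21).toString());        // "9e+21"
console.log((9e21).toString().length); // 5
```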
When the number is too large, its string representation looks like 9e+21, so .toString().length returns 5.
An alternative way to count the digits that avoids the problem is to compute the count arithmetically instead of from the string representation:

var metricUnits = function(num) {
  // log10 gives the order of magnitude; floor + 1 yields the digit count
  log(Math.floor(Math.log10(num)) + 1);
};

log("900000000000000000000 =>");
metricUnits(900000000000000000000); // 21
log("9000000000000000000000 =>");
metricUnits(9000000000000000000000); // 22

// Helper that appends output to the page instead of the console
function log(msg) {
  document.body.insertAdjacentHTML(
    "beforeend",
    "<pre>" + msg + "</pre>"
  );
}