Differences between random integer functions
While checking some code, I found the following random number generator function:
function randomInt(min, max) {
  return min ? min : min = 0,
    max ? max : max = 1,
    0 | Math.random() * (max - min + 1) + min
}
Comparing it to the equivalent function on MDN:
// Returns a random integer between min (included) and max (excluded)
// Using Math.round() will give you a non-uniform distribution!
function getRandomInt(min, max) {
  return Math.floor(Math.random() * (max - min)) + min;
}
I understand that the first one produces an integer with max included, and that it checks min and max or assigns default values to them, but I don't understand how it returns an integer rather than a float without using Math.floor().
Is this achieved by the expression 0 | Math.random() * (max - min + 1) + min? If so, how?
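For reference, here is a quick sketch of the difference in the ranges the two functions produce, as I understand it (the choice of min = 1, max = 3 and the loop count are arbitrary, just for illustration):

function randomInt(min, max) {
  return min ? min : min = 0,
    max ? max : max = 1,
    0 | Math.random() * (max - min + 1) + min
}

function getRandomInt(min, max) {
  return Math.floor(Math.random() * (max - min)) + min;
}

// Collect the distinct values each function returns for min = 1, max = 3.
const seenFirst = new Set();
const seenSecond = new Set();
for (let i = 0; i < 10000; i++) {
  seenFirst.add(randomInt(1, 3));
  seenSecond.add(getRandomInt(1, 3));
}
console.log([...seenFirst].sort());  // [1, 2, 3] -> max is included
console.log([...seenSecond].sort()); // [1, 2]    -> max is excluded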
The result is converted to an integer by the | operator, which is bitwise OR. From MDN, the first step in computing the result is:
The operands are converted to thirty-two-bit integers and expressed by a series of bits (zeros and ones).
And since you are ORing with 0, this operation does not change the value of the result (other than the conversion just described).
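A minimal sketch of that truncation (my own illustration, not from MDN), showing that ORing with 0 simply drops the fractional part after the conversion to a 32-bit integer:

// Each operand of | is first converted to a 32-bit integer,
// so ORing a positive float with 0 truncates the fractional part.
console.log(0 | 4.7);                   // 4
console.log(0 | Math.random() * 3 + 1); // 1, 2 or 3

// For these positive values the result matches Math.floor:
console.log(Math.floor(4.7));           // 4

Since the conversion is to a thirty-two-bit integer, this trick only works for values that fit in that range.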