Math.random() works a bit differently than “0 to 1” suggests

This is more of a “why is Math.random() like this?” question than an “I don’t understand Math.random()” question.

So, according to the definition, it spits out a number between 0 and 1. But it never gives me something like 0.093656 or 0.003637676. Technically speaking, 0.009… is between 0 and 1: it is larger than 0 and less than 1. So I wonder: can Math.random() ever give me a float with one or more zeros right after the decimal point?

I thought it was a head scratcher. Do you know why it does this?
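Here is a quick sketch of the kind of check I mean, just counting samples below 0.1 (roughly, the values that display as 0.0…):

// Count how many of a million samples fall below 0.1,
// i.e. values that show a zero right after the decimal point.
let count = 0
const trials = 1000000

for (let i = 0; i < trials; i++) {
  if (Math.random() < 0.1) count++
}

console.log(count / trials) // should come out near 0.1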

Thanks Folx!

Let’s try an experiment.

const y = []  // collected samples
let x = 1     // start above 0 so the loop runs at least once

// Keep sampling until Math.random() returns exactly 0.
while (x > 0) {
  x = Math.random()
  y.push(x)
}

When the code stops,

console.log(y.toString())

Examine the results, assuming there are any. In practice you are likely to run out of memory first: Math.random() returns a value in the range [0, 1), so 0 is a possible result (and 1 is not), but the chance of hitting exactly 0 is vanishingly small.

Change the 0 in the while condition to 0.05 for a definite result.
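That is, the loop becomes:

let x = 1
const y = []

// Same experiment, but stop at the first value below 0.05
// instead of waiting for an exact 0. With a 1-in-20 chance per
// sample, this finishes after about 20 iterations on average.
while (x > 0.05) {
  x = Math.random()
  y.push(x)
}

console.log(y.toString())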

E.g. the last value pushed (the first one below 0.05) might be:

0.019817715916521816