The code below (the solution for the Fix The Broken Code exercise) seems to be valid and working; however, when I try to console.log(die1), it throws "ReferenceError: die1 is not defined at Object.". Any ideas why?
In mathematical terms, think of random as a linear equation…
r = ax + b
where x = { x | 0 <= x < 1; x is Real }; and,
where a = { a | a != 0; a is Real or Integer }; and,
where b = { b | b is Real or Integer }.
That expresses the full range of possibilities of a random number, whether positive or negative. We can narrow it down by using only integers for a and b, and making a positive.
b can be negative or positive since it is only a shift along the y axis and has no bearing on a * x. This works great when we have a set midpoint and want numbers above or below that midpoint.
Relating this to the question: x is the Math.random() expression; a is the number we multiply by, namely the number of sides of the die; and b is the offset. Since we don't want zero and we do want 6, an offset of b = 1 shifts the numbers from [0..5] to [1..6].
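The mapping above can be sketched directly in JavaScript; the function name and parameters below are illustrative, with a = 6 (sides) and b = 1 (offset) as in the explanation:

```javascript
// Sketch of r = a*x + b for a six-sided die: a = sides = 6, b = offset = 1.
// Math.random() supplies x in [0, 1); flooring a*x gives an integer 0..5,
// and adding the offset shifts that range to 1..6.
function rollDie(sides = 6, offset = 1) {
  return Math.floor(Math.random() * sides) + offset;
}

for (let i = 0; i < 10; i++) {
  console.log(rollDie()); // an integer from 1 to 6
}
```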
I want to ask why this code always renders the variables die1 and die2 as "1"… why do I have to put the whole Math.random() * 6 + 1 expression in parentheses?
this will give you a random float between 1 (inclusive) and 7 (exclusive) (so for example: 5.4432453898093)
then you could floor this:
Math.floor(temp)
The problem with your code is that first a random number between 0 (inclusive) and 1 (exclusive) is generated; that value is floored (rounded down to 0), then multiplied by 6 (still 0), and finally 1 is added, which always gives 1.
I didn't like the * 6 + 1 at the end, so I was wondering if this code is OK:

const rollTheDice = () => {
  let die1 = Math.ceil(Math.random() * 6);
  let die2 = Math.ceil(Math.random() * 6);
  return die1 + die2;
};
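One caveat worth knowing about the Math.ceil version: Math.random() can return exactly 0, and Math.ceil(0 * 6) is 0, so a die could (astronomically rarely) show 0. The Math.floor form doesn't have that edge case. A quick sketch of the boundary values:

```javascript
// Math.ceil maps (0, 6) to 1..6, but Math.random() can return exactly 0,
// and Math.ceil(0 * 6) is 0 -- an (extremely rare) invalid roll.
console.log(Math.ceil(0 * 6));        // 0, the edge case
console.log(Math.ceil(0.0001 * 6));   // 1
console.log(Math.ceil(0.9999 * 6));   // 6

// The floor form has no such edge case: Math.floor(0 * 6) + 1 is 1.
console.log(Math.floor(0 * 6) + 1);   // 1
```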
This is mine. I am having the same issue. I understand there is an error because die1 is a local variable inside the function… but why tell us to log it then? I don't see the point.
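The ReferenceError everyone is hitting comes from scope: die1 is declared with let inside the function, so it only exists there. A minimal sketch of what works and what doesn't (the logging placement here is just one approach):

```javascript
const rollTheDice = () => {
  let die1 = Math.floor(Math.random() * 6) + 1;
  let die2 = Math.floor(Math.random() * 6) + 1;
  // die1 and die2 are visible here, inside the function body...
  console.log(die1, die2);
  return die1 + die2;
};

const total = rollTheDice();
console.log(total);     // works: total is declared in the outer scope
// console.log(die1);   // ReferenceError: die1 is not defined
```

So to inspect die1, either log it inside the function (as above) or return it from the function and log the returned value.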