This relates to Intro to JavaScript - Exercise 9 - Built-in Objects
As I was messing around with the Number.isInteger() method, I discovered that if you write the fractional part out to 16 decimal places, Number.isInteger() can incorrectly return 'true'. See below:
let x = 1.0000000000000001;
console.log(Number.isInteger(x));
//prints true
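A follow-up check I ran in the console suggests the literal itself has already been rounded to exactly 1 by the time isInteger() sees it (this is just my own experiment, not part of the exercise):
let x = 1.0000000000000001;
console.log(x === 1);
//prints true (the literal is rounded to 1 when it is parsed)
console.log(x);
//prints 1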
Interestingly, the following code returns 'false':
let x = 1.0000000000000002;
console.log(Number.isInteger(x));
//prints false
Strangely enough, this returns 'true':
let x = 2.0000000000000002;
console.log(Number.isInteger(x));
//prints true
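Running the same equality check on those two literals shows the same rounding at work (again, just my own console experiments):
console.log(1.0000000000000002 === 1);
//prints false
console.log(2.0000000000000002 === 2);
//prints true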
So I found that as long as the whole-number part is at least as large as the digit in the 16th decimal place, isInteger() will incorrectly return 'true'. See below:
let x = 4.0000000000000003;
console.log(Number.isInteger(x));
//prints true
let y = 5.0000000000000004;
console.log(Number.isInteger(y));
//prints true
…and so on, and so forth.
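In case it helps anyone reproduce this, here is a little sketch using Number.EPSILON, which as I understand it is the gap between 1 and the next representable number. The gap seems to double each time the whole-number part doubles, which would match the pattern above:
console.log(Number.EPSILON);
//prints 2.220446049250313e-16
console.log(1 + Number.EPSILON === 1);
//prints false (the gap just above 1 is exactly representable)
console.log(2 + Number.EPSILON === 2);
//prints true (near 2 the gap is twice as wide, so this rounds back down to 2)
console.log(2 + 2 * Number.EPSILON === 2);
//prints false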
I think it is interesting that this happens at the 16th decimal place, since 16 seems like it could be a significant number in binary, as I understand it. I checked the MDN Web Docs to see if there is an explanation for why this happens, but they don't give one. Any information would be helpful.
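For what it's worth, I also tried converting the 53 bits of precision that a double supposedly carries into decimal digits (assuming I have the format right), and it lands just under 16, which would line up with where the cutoff shows up:
console.log(53 * Math.log10(2));
//prints approximately 15.95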
Thanks!