Why does the Number.isInteger() method return true on a decimal taken out to 16 places?

This relates to Intro to JavaScript - Exercise 9 - Built-in Objects

As I was messing around with the Number.isInteger() method, I discovered that if you take the decimal out to 16 places, the method will incorrectly return ‘true’. See below:

let x = 1.0000000000000001;
console.log(Number.isInteger(x));
//prints true
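
Out of curiosity I also compared x directly against plain 1, and it looks like the value is already exactly 1 by the time isInteger() sees it (just what I ran in the console):

let x = 1.0000000000000001;
console.log(x === 1);
//prints true, so the literal seems to be stored as exactly 1
console.log(x);
//prints 1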

Interestingly, the following code returns ‘false’:

let x = 1.0000000000000002;
console.log(Number.isInteger(x));
//prints false

Strangely enough, this returns ‘true’:

let x = 2.0000000000000002;
console.log(Number.isInteger(x));
//prints true

So I found that as long as the whole number is larger than the digit in the 16th decimal place, isInteger() will incorrectly return ‘true’. See below:

let x = 4.0000000000000003;
console.log(Number.isInteger(x));
//prints true
x = 5.0000000000000004;
console.log(Number.isInteger(x));
//prints true

…and so on, and so forth.
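
To see the pattern another way, I also compared each literal directly against its whole-number part (again, just poking around in the console):

console.log(1.0000000000000002 === 1);
//prints false
console.log(2.0000000000000002 === 2);
//prints true
console.log(4.0000000000000003 === 4);
//prints true
console.log(5.0000000000000004 === 5);
//prints true

So the fractional part seems to be getting lost before isInteger() is even called, which would explain the ‘true’ results.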

I think it is interesting that this happens at the 16th decimal place, since 16 is a significant number in binary as I understand it. I checked the MDN Web Docs to see if there is an explanation for why this happens, but they don't explain it. Any information would be helpful.

Thanks!

Gotta go to the holy grail for that:

ECMA manual itself *cue holy music*:

https://262.ecma-international.org/11.0/#sec-ecmascript-language-types-number-type
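
The short version, as far as I understand it: a JavaScript number is a 64-bit IEEE 754 double with a 53-bit significand, which only gives you about 15-17 significant decimal digits, and the gap between representable values doubles every time the magnitude crosses a power of two. So 1.0000000000000001 gets rounded to exactly 1 before isInteger() ever sees it, and the bigger the whole-number part, the more of the fraction gets rounded away. You can see the spacing with Number.EPSILON (just a quick illustration in the console):

console.log(Number.EPSILON);
//prints 2.220446049250313e-16, the gap between 1 and the next representable number
console.log(1 + Number.EPSILON);
//prints 1.0000000000000002
console.log(2 + Number.EPSILON);
//prints 2, because above 2 the gap is twice as wide and the addition rounds away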

JavaScript isn't really designed for that type of precision (usually academic or scientific), but I think there are a few solutions out there, for example this one: sinful.js/sinful.js at master · guipn/sinful.js · GitHub
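
If the goal is to tell whether a written-out decimal like 1.0000000000000001 really is a whole number, that information is already lost by the time the literal becomes a Number, so another option is to test the text before converting it. A rough sketch (isWholeNumberString is just a made-up helper name here, not something from sinful.js or any other library):

//hypothetical helper: inspects the text of a plain decimal numeral,
//never converting it to a float, so no precision is lost
function isWholeNumberString(str) {
  const match = /^[+-]?\d+(?:\.(\d+))?$/.exec(str.trim());
  if (!match) return false;       //not a simple decimal like "4.0000000000000003"
  const fraction = match[1] || '';
  return /^0*$/.test(fraction);   //whole number only if every digit after the point is 0
}

console.log(isWholeNumberString('1.0000000000000001'));
//prints false
console.log(isWholeNumberString('4.0'));
//prints true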


Thanks, toastedpitabread! I will check it out!