Getting a strange result when adding decimal places

This isn’t really a problem, but it is a very odd thing, and I’m curious as to why it’s happening.

I’m not sure if I’ve made a mistake or found some kind of bug, but I’m trying to add a series of numbers with decimal places (I’m doing the sleep debt calculator task), and the result is sometimes incorrect if I include 7.7 as one of 7 numbers to be summed:

console.log(7.1 + 7.2 + 7.3 + 7.4 + 7.5 + 7.6 + 7.7);
console.log(7.1 + 7.2 + 7.3 + 7.4 + 7.5 + 7.6 + 7.8);
console.log(7.1 + 7.2 + 7.3 + 7.4 + 7.5 + 7.6 + 7.6);
console.log(7.1 + 7.2 + 7.3 + 7.4 + 7.5 + 7.7);
console.log(7.2 + 7.2 + 7.3 + 7.4 + 7.5 + 7.6 + 7.7);
console.log(7.7 + 7.2 + 7.3 + 7.4 + 7.5 + 7.6 + 7.1);
console.log(6.1 + 6.2 + 6.3 + 6.4 + 6.5 + 6.6);
console.log(6.1 + 6.2 + 6.3 + 6.4 + 6.5 + 6.6 + 7.7);

returns:

51.800000000000004
51.9
51.7
44.2
51.900000000000006
51.800000000000004
38.1
45.800000000000004

I can’t for the life of me understand why I get the extra fraction. The position of the numbers doesn’t matter, and it doesn’t seem to come down to how many additions there are. It also seems specific to summing seven numbers when 7.7 is one of them; six numbers including 6.6 doesn’t do it.

It’s easily fixed by just using 7.8 instead of 7.7, so it’s not really a problem; I’m just very curious as to why this might be!
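For the record, the result really isn’t equal to 51.8; it’s just off by an almost invisible amount:

console.log(7.1 + 7.2 + 7.3 + 7.4 + 7.5 + 7.6 + 7.7 === 51.8); // false
console.log(7.1 + 7.2 + 7.3 + 7.4 + 7.5 + 7.6 + 7.7 - 51.8);   // roughly 7e-15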

Hi @javajumper23544 and welcome to the forum!
This comes up fairly often and was discussed here recently, but I can’t find that thread. In the meantime, this is a pretty good explanation of the phenomenon:
https://medium.com/@DominicCarmel/understanding-javascripts-weird-decimal-calculations-e65f0e1adefb
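
The short version: JavaScript numbers are IEEE 754 binary floating point, and decimals like 7.1 and 7.7 have no exact binary representation, so every literal you type is silently rounded to the nearest representable double. Those tiny rounding errors sometimes cancel out across a sum and sometimes accumulate, which is why 7.8 happens to “work” here while 7.7 doesn’t. Here’s a rough sketch you can paste into the console (toPrecision reveals the value actually stored):

// The stored values are not exactly 7.7 or 7.1:
console.log((7.7).toPrecision(20)); // 7.7000000000000001776
console.log((7.1).toPrecision(20)); // 7.0999999999999996447

// A common workaround is to round only the final result for display:
const total = 7.1 + 7.2 + 7.3 + 7.4 + 7.5 + 7.6 + 7.7;
console.log(total);                       // 51.800000000000004
console.log(total.toFixed(1));            // "51.8" (note: a string)
console.log(Math.round(total * 10) / 10); // 51.8 (a number)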


oooh thank you! very interesting!
I did some googling but all the results wanted to teach me basic maths haha
