For loops increment by .1


#1

This is a bit of an out-of-the-box question. I'm learning about for loops and was just playing around when something interesting came up that I'd love to understand. Maybe I'm getting ahead of myself, but it still seems worth knowing.

I was playing with for loops in the 'introduction to for loops in JS' section of the JavaScript course, at activity #4.

I changed the code to increase the counter by .1.

see code:

for (var i = 4; i < 11; i = i + .1) {
  console.log(i);
}

My question is: why does it start to add the extra digit at the end? For example (see the whole results below):

1.2000000000000002
1.3000000000000003
1.4000000000000004

and so on.

Shouldn't this just increase by .1 each time? Is this a bug?

I'm just super curious how this works, and why there is that extra digit at the end, which eventually produces different numbers than I'd expect.

I would have expected the output to be
1.1
1.2
1.3
1.4
etc
9.9
10.0
10.1

etc

Results in the console below:

1
1.1
1.2000000000000002
1.3000000000000003
1.4000000000000004
1.5000000000000004
1.6000000000000005
1.7000000000000006
1.8000000000000007
1.9000000000000008
2.000000000000001
2.100000000000001
2.200000000000001
2.300000000000001
2.4000000000000012
2.5000000000000013
2.6000000000000014
2.7000000000000015
2.8000000000000016
2.9000000000000017
3.0000000000000018
3.100000000000002
3.200000000000002
3.300000000000002
3.400000000000002
3.500000000000002
3.6000000000000023
3.7000000000000024
3.8000000000000025
3.9000000000000026
4.000000000000003
4.100000000000002
4.200000000000002
4.300000000000002
4.400000000000001
4.500000000000001
4.6000000000000005
4.7
4.8
4.8999999999999995
4.999999999999999
5.099999999999999
5.199999999999998
5.299999999999998
5.399999999999998
5.499999999999997
5.599999999999997
5.699999999999997
5.799999999999996
5.899999999999996
5.999999999999996
6.099999999999995
6.199999999999995
6.2999999999999945
6.399999999999994
6.499999999999994
6.599999999999993
6.699999999999993
6.799999999999993
6.899999999999992
6.999999999999992
7.099999999999992
7.199999999999991
7.299999999999991
7.399999999999991
7.49999999999999
7.59999999999999
7.6999999999999895
7.799999999999989
7.899999999999989
7.9999999999999885
8.099999999999989
8.199999999999989
8.299999999999988
8.399999999999988
8.499999999999988
8.599999999999987
8.699999999999987
8.799999999999986
8.899999999999986
8.999999999999986
9.099999999999985
9.199999999999985
9.299999999999985
9.399999999999984
9.499999999999984
9.599999999999984
9.699999999999983
9.799999999999983
9.899999999999983
9.999999999999982
10.099999999999982
10.199999999999982
10.299999999999981
10.39999999999998
10.49999999999998
10.59999999999998
10.69999999999998
10.79999999999998
10.899999999999979
10.999999999999979


#2

Oops, sorry. The code should be:

for (var i = 1; i < 11; i = i + .1) {
  console.log(i);
}

I had gone back to the assignment and copied that code by mistake. The code above is what I was actually using for this example.


#3

To learn more about the cause of the behavior you are seeing, read up on floating-point arithmetic.
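The classic demonstration, which you can paste into any JavaScript console:

```javascript
// Neither 0.1 nor 0.2 has an exact binary representation, so even
// a single addition picks up rounding error.
console.log(0.1 + 0.2);          // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);  // false

// The error is tiny; it only looks dramatic because the console
// prints enough digits to round-trip the stored value exactly.
console.log(Math.abs(0.1 + 0.2 - 0.3) < 1e-15); // true
```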


#4

0.1 can't be represented exactly by floats, just like 1/3 can't be written exactly as a base-10 decimal: you run out of space for all those 3's eventually.

In general you should consider floats to be approximations. If you rely on a value being represented exactly, use only integers.

JavaScript doesn't have a separate integer type, but its floats can represent integers exactly up to 2^53 - 1. You still have to watch out for division and other inexact operations (sqrt, sin, etc.).
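You can see that safe-integer limit directly; JavaScript exposes it as Number.MAX_SAFE_INTEGER:

```javascript
// Integers are exact up to 2^53 - 1.
console.log(Number.MAX_SAFE_INTEGER);                  // 9007199254740991
console.log(Number.MAX_SAFE_INTEGER === 2 ** 53 - 1);  // true

// Beyond that limit, adjacent integers collapse into the same float:
console.log(Number.MAX_SAFE_INTEGER + 1 === Number.MAX_SAFE_INTEGER + 2); // true
```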

I'd write your loop as counting from 10 up to 110 and treat the counter as a number of tenths.
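A sketch of that integer-counting approach, assuming the same 1-to-11 range as the corrected loop:

```javascript
// Count in whole tenths (exact integers) and divide only for display.
// Dividing by 10 gives the nearest float to each decimal, which
// prints cleanly: 1, 1.1, 1.2, ... 10.9, with no stray digits.
for (var tenths = 10; tenths < 110; tenths++) {
  console.log(tenths / 10);
}
```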
You can also use Number.prototype.toFixed:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number/toFixed
It returns a string representation with the number of decimal places you specify (rounding if necessary). But if something can be kept exact, I prefer not to touch inexact operations at all.
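For example, toFixed cleans up the display without changing the underlying values (note that it returns a string, not a number):

```javascript
var x = 1.1 + 0.1;
console.log(x);            // 1.2000000000000002
console.log(x.toFixed(1)); // "1.2" (a string)

// Applied to the original loop, the output reads cleanly even though
// the values themselves still carry the accumulated rounding error:
for (var i = 1; i < 11; i = i + 0.1) {
  console.log(i.toFixed(1));
}
```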