# Reggie Linear Regression Problem

Off-Platform Project: Reggie’s Linear Regression

I have a problem getting the same outcome as the solution in part 2, where it asks you to “Try a bunch of slopes and intercepts!”

My code is as follows:

```python
possible_ms = [m/10 for m in range(-100, 101, 1)]
possible_bs = [b/10 for m in range(-200, 201, 1)]
```

This gives me a line of best fit in part 3 of `m = 0.4`, `b = 1.6`, and `error = 5.0`.

The solution code is:

```python
possible_ms = [m * 0.1 for m in range(-100, 101)]
possible_bs = [b * 0.1 for b in range(-200, 201)]
```

This gives a line of best fit in part 3 of `m = 0.3`, `b = 1.7`, and `error = 5.0`.

I have absolutely no idea why this is occurring when mathematically they are the same; hoping someone can tell me why.

It’s a mixture of minor floating-point errors and a bad dataset.

The biggest issue I recall from that project is that the dataset is so small and scattered that you can fit a wide range of best-fit lines, all of which have almost identical errors (if, for example, you did your fitting and error calculation in the reverse order, you might find a different answer again). In the long term you may want to consider whether this is the best way to fit this data.
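You can see the tie directly. Using the datapoints as I remember them from that project (check them against your own copy; `total_error` below is just a stand-in helper, not the project's exact function), both reported lines come out with the same total error:

```python
# Datapoints as I recall them from the project -- substitute your own.
datapoints = [(1, 2), (2, 0), (3, 4), (4, 4), (5, 3)]

def total_error(m, b, points):
    # Sum of vertical distances from each point to the line y = m*x + b.
    return sum(abs((m * x + b) - y) for x, y in points)

print(total_error(0.3, 1.7, datapoints))  # ≈ 5.0 (the solution's line)
print(total_error(0.4, 1.6, datapoints))  # ≈ 5.0 (your line)
```

Both lines are genuinely tied at an error of 5.0, so which one the loop reports as "best" depends entirely on which candidate it happens to evaluate first with a strictly smaller error, and tiny float differences can flip that.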

As for why it’s different: because you’re performing a slightly different operation (multiplication by `0.1` vs. division by `10`), you get ever so slightly different values for your fitting data (your `m` and `b` values are floats and are subject to floating-point error). This can easily result in a different fit, but it is only as pronounced as it is because the dataset is less than ideal. It is worth briefly reading up on floating-point errors, as they can come back to bite you.
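A minimal demonstration of the difference, using nothing beyond the ranges from the posted code. `0.1` has no exact binary representation, so multiplying by it is not always the same as dividing by `10`:

```python
m = 3
print(m / 10)             # 0.3
print(m * 0.1)            # 0.30000000000000004
print(m / 10 == m * 0.1)  # False

# Over the project's whole slope range, several candidates disagree
# between the two constructions -- so your grid of candidate slopes
# is not bit-for-bit identical to the solution's grid.
mismatches = [m for m in range(-100, 101) if m / 10 != m * 0.1]
print(len(mismatches) > 0)  # True
```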
