Hi @scriptmaster83175 ,

That's a good question to ask. When doing math, one should go beyond merely following the rules or the formula in order to consider *why* a particular technique is used.

The instruction to which you refer states:

`04. for each score in scores: Compute its squared difference: (average - score) ** 2 and add that to variance.`

Variance is a measure of the degree to which a set of numbers is spread out. Note that there are numerous other ways of measuring the variation among data values. When we use variance, we square the difference between each value and the mean in order to place special emphasis on the values that differ the most from the mean. It is a way of ensuring that if a few values differ markedly from the mean, that fact is reflected strongly in the result.
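Here's a minimal sketch of the computation that instruction describes, assuming `scores` is a list of numbers (the sample values here are just for illustration):

```python
scores = [70, 80, 90, 100]

average = sum(scores) / len(scores)  # mean of the scores (85.0 here)

variance = 0
for score in scores:
    # Square the difference so that large deviations count extra
    variance += (average - score) ** 2
variance /= len(scores)  # divide by the count to get the (population) variance

print(variance)  # 125.0
```

The two largest deviations (15 each) contribute 225 apiece after squaring, while the two small ones (5 each) contribute only 25 apiece, which is exactly the emphasis on outlying values described above.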

As an alternative method for representing the degree to which data is spread out, we could compute a sum of absolute differences instead of summing the squares of the differences. This would place less emphasis on the values that differ markedly from the mean. If, instead, we wished to place even greater emphasis on the outlying values, we could use the sum of the absolute values of the cubes of the differences. But variance is used far more often than either of these alternatives.

Note that it is important to use absolute values of the differences, rather than merely adding up the differences, because we don't want the positive and negative differences to cancel each other out. Since squares of real numbers are never negative, we don't have to worry about this when we use variance.
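You can verify the cancellation for yourself. The raw differences from the mean always sum to exactly zero, which is why a plain sum would be useless as a spread measure (same illustrative scores as before):

```python
scores = [70, 80, 90, 100]
average = sum(scores) / len(scores)  # 85.0

# Differences are 15, 5, -5, -15: the positives and negatives cancel
raw_sum = sum(average - s for s in scores)
print(raw_sum)  # 0.0
```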