A math fix for the JavaScript Dog Years lesson

I noticed that the way the math is performed is only correct if your age is at least 2 years. To remedy this, I performed the math in two ways: the way the lesson plan asked me to, and my own way.

Let’s say my name is Baby Boy Doe and I am one day old. That makes me 0.0027397260273972603 human years old, which is 0.028767123287671233 dog years. Codecademy wanted me to do the math differently and would have placed me at 13.01095890410959 dog years, which is incorrect. In other words, I, Baby Boy Doe, was born yesterday and am possibly smarter than a professional coder (note: not actually true). Had my age in dog years been calculated correctly, I still would have been only about 10.5 days old, yet a professional coder thinks I am 13 years and 4 days old (note: only the math dictated this). Even if they were correct about my age in dog years, it’s still extremely embarrassing to be wrong compared to a teenager (note: this is a joke, please laugh now).

This should be the correct way to perform the math:

```js
let anyonesAge;

function anyonesAgeInDogYears() {
  if (anyonesAge <= 2) {
    return anyonesAge * 10.5;
  } else {
    return (anyonesAge - 2) * 4 + 21;
  }
}

anyonesAge = 15;
console.log(anyonesAgeInDogYears()); // Output: 73

anyonesAge = 0.5;
console.log(anyonesAgeInDogYears()); // Output: 5.25

anyonesAge = 409;
console.log(anyonesAgeInDogYears()); // Output: 1649
```

To anyone who actually wanted something that could calculate any age in dog years, I hope this has helped.
