Why does counting start from 0?


In short, just remember that almost all programming languages start counting from zero because it keeps the arithmetic simple.
The famous Dutch computer scientist Edsger Dijkstra gives a great mathematical explanation for this! It basically comes down to simplicity, a common reason behind computer system design choices like this. In his note “Why numbering should start at zero,” Dijkstra observes that for a sequence of N elements, “when starting with subscript 1, the subscript range 1 ≤ i < N+1; starting with 0, however, gives the nicer range 0 ≤ i < N.”
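To see Dijkstra’s two ranges side by side, here is a small sketch (in Python, chosen just for illustration) of indexing a sequence of N elements both ways:

```python
items = ["a", "b", "c", "d"]
N = len(items)

# Starting at 0 gives the range 0 <= i < N: both bounds are
# "natural" numbers already in play (0 and N), with no +1 terms.
zero_based = list(range(N))        # [0, 1, 2, 3]

# Starting at 1 gives the range 1 <= i < N+1: the upper bound
# needs an extra +1.
one_based = list(range(1, N + 1))  # [1, 2, 3, 4]

# A bonus of zero-based subscripts: index i equals the number of
# elements that come before position i.
for i in zero_based:
    print(i, items[i])
```

Either convention addresses the same four elements; the zero-based range is simply the one whose bounds fall out of the length directly.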



Dijkstra’s algorithm solves the single-source shortest-path problem for graphs with non-negative edge weights.
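For anyone curious, here is a minimal sketch of that algorithm in Python. The adjacency-list format (`graph` mapping each node to `(neighbor, weight)` pairs) is an assumption made for this example, not anything from the thread:

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from `source`; edge weights must be non-negative."""
    dist = {source: 0}
    heap = [(0, source)]  # (distance-so-far, node)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry; a shorter path was found earlier
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 1, 'c': 3}
```

The non-negativity requirement is what lets the algorithm finalize a node’s distance the first time it is popped from the heap.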


AppleScript is one language I know of that doesn’t start at 0, which trips up a lot of scripters. Then again, AppleScript is a scripting language, not a general-purpose programming language.


For reference:


The paper mentioned in the OP is discussed in that article, along with explorations into other fields. A good read.


R isn’t zero-indexed either.