I'm belatedly warming up to the idea that the digit length of the number zero is actually zero in various senses. (A footnote in the article points out that we could define it this way, and then some students' simpler solutions would actually be considered correct.)

Yes, we obviously normally use a single digit to write zero, but we could logically write it as the empty string "" if we had a way to indicate that a specific number was meant. (On that view, writing 0 is just as unnecessary as writing 00 or 000, since we can adopt a rule that leading zeroes need not be written.)

As another example of this intuition, suppose we wrote all integers with a trailing decimal point. Then the numbers from 1 to 10 would be "1.", "2.", "3.", "4.", "5.", "6.", "7.", "8.", "9.", and "10.", while zero could quite logically be just "." with no digits at all (it has only leading zeroes, or we could say "nothing in the ones place").

Quite a lot of arithmetic algorithms would probably work just fine with these conventions! In fact, they might have fewer exceptions or special cases than they do when they expect an explicit 0.

For instance, using the "zero is written as the empty string" convention, Python int() (here simplified to assume input strings of zero or more decimal digits) and str() (for non-negative integers) could look like:

    def int(s):
        # Accumulate digits left to right; the empty string falls
        # straight through the loop and comes out as 0, no special case.
        n = 0
        for c in s:
            n *= 10
            n += "0123456789".find(c)
        return n

    def str(n):
        # Peel digits off the right; zero never enters the loop,
        # so it comes out as the empty string.
        s = ""
        while n:
            n, d = divmod(n, 10)
            s = "0123456789"[d] + s
        return s
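
As a quick sanity check of my own (not from the article), the two round-trip cleanly, with zero mapping to and from the empty string and zero digits inside larger numbers still written normally:

    assert int("") == 0            # the empty string reads back as zero
    assert str(0) == ""            # and zero prints as the empty string
    assert str(105) == "105"       # zero digits inside a number are unaffected
    for n in range(1000):
        assert int(str(n)) == n    # round trip holds, no special case for 0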
Seems pretty general! Yes, it's annoying or confusing if the output is going to be viewed by a human user in a larger string context without delimiters, as we probably don't want to see things like "I ate tomatoes" instead of "I ate 0 tomatoes".
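
To make that concrete (my own two-line illustration, using the str() defined above): without delimiters the zero simply vanishes into the surrounding sentence.

    print("I ate " + str(0) + " tomatoes")    # -> I ate  tomatoes   (the zero silently disappears)
    print("I ate [" + str(0) + "] tomatoes")  # -> I ate [] tomatoes (delimiters at least make the empty numeral visible)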