Division by Zero

What is a nonzero number divided by 0? Undefined. What is 0 divided by 0? Indeterminate.

How are the two different? Here’s a math forum page that informally explains “undefined” as something where the answer does not exist, and “indeterminate” as something that “if it pops up somewhere, you don’t know what its value will be in your case.”
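A rough way to see the distinction (my own sketch, not taken from the forum thread) is through limits:

\[
\lim_{x \to 0^{+}} \frac{1}{x} = +\infty, \qquad \lim_{x \to 0^{-}} \frac{1}{x} = -\infty,
\]

so no single value can be assigned to 1/0 at all, while

\[
\lim_{x \to 0} \frac{cx}{x} = c \quad \text{for any constant } c,
\]

so a 0/0 form could come out to be anything, which is exactly the “you don’t know what its value will be in your case” part.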

Either take that, or go read what Prof. Arsham has to say about the ordeal that is division by zero. The article presents cultural, historical, mathematical and psychological perspectives on the number representing nothing. In the piece, aptly entitled the Zero Saga, he summarizes the whole point of going at length about the common misconceptions involving dividing by zero:

If one does allow oneself dividing by zero, then one ends up in a hell. That is all.

And he holds his case against others well, such as when a reader writes that dividing 1 apple into 0 equal parts still yields an apple, and he retorts:

Have you really attempted in doing so? I am sure you failed, Right? So do not conclude anything.

And in a similar vein, someone asks:

If you interpret 20/5 =4 means that if you take 5 oranges from a total of 20 oranges in your fridge you can do it 4 times. Then if you take 0 oranges from 2 oranges (2/0) you can take it infinite number of times (that is, it does not end but surely exists/continues).

He then points out the absurdity of the operation in his response:

If you “take 0 oranges from 2 oranges”, it means 2-0=2. Repeating this operation again and again is nonsense. Once is enough, right? Otherwise eventually you get tired of counting this repetition, beyond that is infinity which you have never reached.
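Restating the exchange in symbols (my own paraphrase of the oranges argument, not Prof. Arsham’s wording): division by repeated subtraction says 20/5 = 4 because

\[
20 - 5 - 5 - 5 - 5 = 0,
\]

that is, four subtractions exhaust the total. With zero, each step changes nothing:

\[
2 - 0 - 0 - \cdots = 2,
\]

so the total is never exhausted and no count of subtractions can serve as the value of 2/0.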

Before encountering that read, I never realized how much more difficult zero made things for us after its introduction in the 13th century. Sure, I still remember the additive inverse and the other difficult consequences of 0’s existence from back in college. But it took that piece, touching on a lot of higher math topics, to get me pondering how much trouble a seemingly insignificant number caused in the number system.

Time to Unlearn

I don’t recall who, but I remember someone pointing out that a lot of the things learned later in life are really just about unlearning things taught earlier. The vein of truth there was evident, but it was only recently, when my wife and I were looking at the contents of my daughter’s Math book for the incoming school year, that the impact of the truism struck me.

When I was in college, I remember a written communications instructor mentioning how she found it hard to interact with high school teachers given their fondness for what she referred to as common errors. From time to time she made it a point to correct every error we made, both written and spoken. Maybe it was her way of promulgating correctness the way she deemed apt.

Back then I didn’t really care because I was there only for the credits. With the following, though, I think I now understand better what she had already realized then.

Preposition or Conjunction?

If you add 1 to 19, the result is an even number. (Yes/No)

I might be the only one to notice, but adding 1 to 19 is not the same as adding 1 and 19. It might be plausible that the latter is school kid material, but I’d have a hard time believing n(n+1)/2 has been rendered grade 1 stuff that fast within my lifetime.
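To spell out the two readings (my own arithmetic, not the book’s): taken as a conjunction, “1 and 19” gives

\[
1 + 19 = 20,
\]

while reading “1 to 19” as a range gives

\[
1 + 2 + \cdots + 19 = \frac{19 \cdot 20}{2} = 190.
\]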

Theories

If you put together two even numbers, the answer is also an even number. (Yes/No)

Unless they actually expect grade 1 pupils to prove that assertion from a given definition (which would require at least basic algebra), I see no point in asking this theoretical question here. Sure, the kid could get away with thinking of two even numbers, say 0 and 2, and adding them to check whether the statement holds. In that example it expectedly turns out true, and in fact it comes out true for any two even numbers.
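For reference, the general argument that such a claim needs is one line of algebra: writing the two even numbers as 2m and 2n,

\[
2m + 2n = 2(m + n),
\]

which is again a multiple of 2, hence even.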

I take issue, however, with how this inculcates and effectively encourages a common mistake among students: proof by example. I have seen first-hand how even a lot of graduates resort to this kind of reasoning. That doesn’t make it right, though.

First, there’s actually no such thing (maybe they had proof by counterexample in mind, I don’t know). Second, and more to the point, is what my Math professor in college told a student who asked why his proof for a proving question in an exam was marked wrong. Upon learning that an example had been presented as the proof, he quickly pointed out and emphasized in class that:

An example is not a proof. What may be true for some members of a given domain may not be true for its other members.

Well, at least that’s how I remember him wording the point. For me, it would be best not to let kids start off with something they will eventually have to unlearn. After all, it shouldn’t be long before they run into the ambiguity of that mindset, say when they try to resolve a claim that all prime numbers are odd.

What if someone happens to think of 2?