My friend Dr24hours sent me a link via Twitter to a new piece of mathematical crackpottery. It’s the sort of thing that’s so trivial that I might just ignore it – but it’s also a good example of something that someone commented on in my previous post.

This comes from, of all places, Rolling Stone magazine, in a puff-piece about an actor named Terrence Howard. When he’s not acting, Mr. Howard believes that he’s a mathematical genius who’s caught on to the greatest mathematical error of all time. According to Mr. Howard, the product of one times one is *not* one, it’s *two*.

After high school, he attended Pratt Institute in Brooklyn, studying chemical engineering, until he got into an argument with a professor about what one times one equals. “How can it equal one?” he said. “If one times one equals one that means that two is of no value because one times itself has no effect. One times one equals two because the square root of four is two, so what’s the square root of two? Should be one, but we’re told it’s two, and that cannot be.” This did not go over well, he says, and he soon left school. “I mean, you can’t conform when you know innately that something is wrong.”

I don’t want to harp on Mr. Howard too much. He’s clueless, but sadly, he’s not an atypical product of American schools. I’ll take a couple of minutes to talk about what’s wrong with his claims, but in the context of a discussion of where I think this kind of stuff comes from.

In American schools, math is taught largely by rote. When I was a kid, set theory came into vogue, but by and large math teachers didn’t understand it – so they’d draw a few meaningless Venn diagrams, and then switch into pure procedure.

An example of this from my own life involves my older brother. My brother is not a dummy – he’s a *very* smart guy. He’s at least as smart as I am, but he’s interested in very different things, and math was never one of his interests.

I barely ever learned math in school. My father noticed pretty early on that I really enjoyed math, and so he did math with me for fun. He taught me stuff – not as any kind of “they’re not going to teach it right in school” thing, but just purely as something *fun* to do with a kid who was interested. So I learned a lot of math – almost everything up through calculus – from him, not from school. My brother didn’t – because he didn’t enjoy math, and so my dad did other things with him.

When we were in high school, my brother got a job at a local fast food joint. At the end of the year, he had to do his taxes, and my dad insisted that he do it himself. When he needed to figure out how much tax he owed on his income, he needed to compute a percentage. I don’t know the numbers, but for the sake of the discussion, let’s say that he made $5482 that summer, and the tax rate on that was 18%. He wrote down a pair of ratios:

x/5482 = 18/100

And then he cross-multiplied, getting:

100x = 18 × 5482 = 98676

and so x = 986.76.

My dad was shocked by this – it’s such a laborious way of doing it. So he started pressing my brother. He asked him: if you went to a store, and they told you there was a 20% off sale on a pair of jeans that cost $18, how much of a discount would you get? He didn’t know. The only way he knew to figure it out was to write out the ratios, cross-multiply, and solve. If you told him that 20% off of $18 was $5, he would have believed you. Because percentages just didn’t *mean* anything to him.
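To make the contrast concrete, here’s a quick sketch of the two approaches, using the made-up numbers from the story above:

```python
# Made-up numbers from the story: $5482 of income, an 18% tax rate.
income = 5482.00
rate = 18  # percent

# The laborious ratio method: set up x/income = rate/100,
# cross-multiply, and solve for x.
x = (rate * income) / 100

# The method you use when percentages *mean* something to you:
# "18 percent" just means 18 per hundred, so multiply by 0.18.
tax = income * 0.18

print(x, tax)  # both come out to 986.76 (up to float rounding)
```

Both computations are the same arithmetic, of course – the difference is that the second one only occurs to you if you know what “percent” means.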

Now, as I said: my brother isn’t a dummy. But none of his math teachers had ever taught him what percentages *meant*. He had no concept of their meaning: he knew a *procedure* for getting the value, but it was a completely blind procedure, devoid of meaning. And that’s what everything he’d learned about math was like: meaningless procedures performed by rote, without any comprehension.

That’s where nonsense like Terrence Howard’s stuff comes from: math education that never bothered to teach students what anything *means*. If anyone had attempted to teach any form of *meaning* for arithmetic, the ridiculousness of Mr. Howard’s supposed mathematics would be obvious.

For understanding basic arithmetic, I like to look at a geometric model of numbers.

Put a dot on a piece of paper. Label it “0”. Draw a line starting at zero, and put tick-marks on the line separated by equal distances. Starting at the first mark after 0, label the tick-marks 1, 2, 3, 4, 5, ….

In this model, the number one is the distance from 0 (the start of the line) to 1. The number two is the distance from 0 to 2. And so on.

What does addition mean?

Addition is just stacking lines, one after the other. Suppose you wanted to add 3 + 2. You draw a line that’s 3 tick-marks long. Then, starting from the end of that line, you draw a second line that’s 2 tick-marks long. 3 + 2 is the length of the resulting line: by putting it next to the original number-line, we can see that it’s five tick-marks long, so 3 + 2 = 5.
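The line-stacking model above can be sketched in a few lines of code (the helper names here are just my own, for illustration):

```python
# Number-line model of addition: a number is a segment of unit
# steps starting at 0, and addition lays one segment end-to-end
# after the other.

def segment(n):
    """Represent a number as a list of unit steps along the line."""
    return [1] * n

def add(a, b):
    # Stack the second segment after the end of the first, then
    # measure the length of the combined line.
    return len(segment(a) + segment(b))

print(add(3, 2))  # five tick-marks long, so 3 + 2 = 5
```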

Multiplication is a different process. In multiplication, you’re not putting lines tip-to-tail: you’re building rectangles. If you want to multiply 3 * 2, what you do is draw a rectangle whose width is 3 tick-marks long, and whose height is 2 tick-marks long. Now divide that into squares that are one tick-mark by one tick-mark. How many squares can you fit into that rectangle? 6. So 3*2 = 6.

Why does 1 times 1 equal 1? Because if you draw a rectangle that’s one tick-mark wide, and one tick-mark high, it forms exactly one 1×1 square. 1 times 1 can’t be two: it forms one square, not two.
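The rectangle model is just as easy to sketch in code (again, the names are just mine, for illustration):

```python
# Rectangle model of multiplication: build a width-by-height grid
# of unit squares and count how many squares fit.

def multiply(width, height):
    # One entry per unit square in the rectangle.
    squares = [(x, y) for x in range(width) for y in range(height)]
    return len(squares)

print(multiply(3, 2))  # 6 unit squares, so 3 * 2 = 6
print(multiply(1, 1))  # exactly one square: 1 * 1 = 1, not 2
```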

If you think about the repercussions of the idea that 1*1=2, as long as you’re clear about meanings, it’s pretty obvious that it has a disastrous impact on math: it turns all of math into a pile of gibberish.

What’s 1*2? 2. So 1*1=2 and 1*2=2, therefore 1=2. If 1=2, then adding one to both sides gives 2=3, then 3=4, 4=5: all integers are equal. If that’s true, then… well, numbers are, quite literally, meaningless. Which is quite a serious problem, unless you already believe that numbers are meaningless anyway.
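Spelled out as a chain of equations, the collapse looks like this:

```latex
\begin{align*}
1 \times 1 &= 2 && \text{(Mr.\ Howard's claim)}\\
1 \times 2 &= 2 && \text{(ordinary multiplication)}\\
\Rightarrow\quad 1 \times 1 &= 1 \times 2 \quad\Rightarrow\quad 1 = 2\\
\Rightarrow\quad 1 + 1 &= 2 + 1 \quad\Rightarrow\quad 2 = 3,\ \text{and so on for every integer.}
\end{align*}
```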

In my last post, someone asked why I was so upset about the error in a math textbook. This is a good example of why. The new Common Core math curriculum, for all its flaws, does a better job of teaching understanding of math. But when a textbook teaches “facts” that are wrong, it accomplishes the opposite: the material doesn’t make sense, and if you actually try to understand it, you just get more confused.

That teaches you one of two things. Either it teaches you that understanding this stuff is futile: that all you can do is just learn to blindly reproduce the procedures that you were taught, without understanding why. Or it teaches you that *no one* really understands any of it, and that therefore nothing that anyone tells you can possibly be trusted.