This is something that came up in some of the comments on the recent “nimbers” post, and I thought it was worth promoting to the front, and getting up under an easy-to-find title in the “basics” series.
In a lot of discussions in all different areas of math, you encounter talk about sets and classes, and you’ll find people worried about whether they’re talking about sets or classes. What’s the difference? I mentioned this once before, but it’s buried in a discussion of the concept of “meta”, which is why I thought it was worth moving it to its own top-level post: if you don’t know the difference, you’re not going to look in the body of a discussion about the concept of going meta to find the explanation!
I’ll start with just the definitions, and then I’ll dive into the discussion of why we make the distinction.
- A class is any collection of things which have some common property that defines them: the class of logical statements, the class of numbers.
- A set is a class which is a member of a class.
- A proper class is a class which is not a set.
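The classic illustration of why proper classes are needed (my addition, not part of the definitions above) is Russell's collection of all classes that are not members of themselves:

```latex
R = \{\, x \mid x \notin x \,\}
```

If $R$ were a set, it could be a member of a class, and in particular we could ask whether $R \in R$; either answer yields $R \in R \iff R \notin R$, a contradiction. So $R$ can be a perfectly good class, but it cannot be a set: it's a proper class.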
I’ve used the term innumeracy fairly often on this blog, and I’ve had a few people write to ask me what it means. It’s also, I think, a very important idea.
Innumeracy is to math what illiteracy is to reading. It's the fundamental lack of ability to understand or use numbers or math. And like illiteracy, true innumeracy is relatively rare, but there are huge numbers of people who, while having some minimal understanding of numbers and arithmetic, are functionally innumerate: they are not capable of anything but the most trivial arithmetic, and how anything more complicated than basic arithmetic actually works is a total mystery to them.
For the basics, I wrote a bunch of stuff about sorting. It seems worth taking a moment
to talk about something related: binary search. Binary search is one of the most important
and fundamental algorithms, and it shows up in all sorts of places.
It also has the amazing property that despite being simple and ubiquitous, it’s virtually
always written wrong. There’s a bit of subtlety in implementing it correctly, and virtually
everyone manages to introduce off-by-one indexing errors into their implementations. (Including me; the last time I implemented a binary search, the first version included one of the classic errors.) The errors are so ubiquitous that even in a textbook that discusses the fact that most programmers get it wrong, the authors got it wrong in their example code!
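To show the kind of care required, here's a sketch of a correct version in Python (my own illustration, not the textbook code referred to above), with comments marking the spots where the classic off-by-one errors creep in:

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    lo, hi = 0, len(items) - 1       # inclusive bounds: the usual source of off-by-one bugs
    while lo <= hi:                  # <=, not <: a one-element range still needs checking
        mid = lo + (hi - lo) // 2    # written this way to avoid overflow in fixed-width languages
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1             # mid is already ruled out; failing to move past it loops forever
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))   # found at index 3
print(binary_search([1, 3, 5, 7, 9], 4))   # absent: -1
```

Every one of the commented lines is a place where a plausible-looking variation (exclusive bound, `<` instead of `<=`, `lo = mid`) produces a search that fails or loops on some inputs.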
This came up in a question in the post where I started to talk about π-calculus, but I thought it was an interesting enough topic to promote it up to a top-level post. If you listen to anyone talking about computers or software, there are three words you'll constantly hear: parallel, concurrent, and distributed. At first glance, it sounds like they mean the same thing, but in fact, they're three different things, and the differences are important.
Today’s bit of basics is inspired by that bastion of shitheaded ignorance, Dr. Michael Egnor. In part of his latest screed (a podcast with Casey Luskin of the Discovery Institute), Egnor discusses antibiotic resistance, and along the way, asserts that the theory of evolution has no relevance to antibiotic resistance, because what evolution says about the subject is just
a tautology. (I'm deliberately not linking to the podcast; I will not help increase the hit-count that DI will use to promote its agenda of willful ignorance.)
So what is a tautology?
A tautology is a logical statement which is universally true by virtue of its fundamental structure. That is, even without knowing anything about what the statement means, you can infer that it must be true.
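As a concrete illustration (my example, not from the podcast): "P or not-P" is true under every possible assignment of truth values to P, which a brute-force truth-table check confirms:

```python
from itertools import product

def is_tautology(formula, num_vars):
    """True iff the boolean formula holds under every assignment to its variables."""
    return all(formula(*values)
               for values in product([True, False], repeat=num_vars))

# "P or not P": true no matter what P actually says -- a tautology.
print(is_tautology(lambda p: p or not p, 1))
# Modus ponens as a formula: ((P implies Q) and P) implies Q.
print(is_tautology(lambda p, q: not ((not p or q) and p) or q, 2))
# "P and not P" fails for both values of P -- not a tautology (a contradiction).
print(is_tautology(lambda p: p and not p, 1))
```

The first two print `True`, the last `False`: the tautologies hold without our knowing what P and Q mean.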
I’ve been getting so many requests for “basics” posts that I’m having trouble keeping up! There are so many basic things in math that non-mathematicians are confused about. I’m doing my best to keep up: if you’ve requested a “basics” topic and I haven’t gotten around to it, rest assured, I’m doing my best, and I will get to it eventually!
One of the things that multiple people have written to me about is confusion over what a mathematician means by a theory, and what the difference is between a theory and a theorem.
I’ve received a request from a long-time reader to write a basics post on modal logics. In particular, what is a modal logic, and why did Gödel believe that a proof for the existence of God was more compelling in modal logic than in standard predicate logic.
The first part is the easy one. Modal logics are logics that assign values to statements that go beyond “This statement is true” or “This statement is false”. Modal logics add the concepts of possibility and necessity. Modal logic allows statements like “It is necessary for X to be true”, “It is possible for X to be true”, etc.
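In the standard notation (my addition here), necessity and possibility are written with the box and diamond operators, and each is definable from the other:

```latex
\Box P \quad \text{(``it is necessary that $P$'')} \qquad
\Diamond P \quad \text{(``it is possible that $P$'')} \qquad
\Diamond P \;\equiv\; \lnot \Box \lnot P
```

The duality on the right just says: "P is possible" means "it is not necessary that P is false".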
In math and computer science, we have a tendency to talk about “going meta”. It’s actually a
pretty simple idea, which tends to crop up in other places, as well. It’s also one of my favorite concepts – the idea of going meta is just plain cool. (Not to mention useful. There’s a running joke among computer scientists that the solution to any problem is to add a level of indirection – which is programmer-speak for going meta on constructs inside of a programming language. Object-orientation is, in some sense, just an example of how to go meta on procedures. Haskell type-classes are an example of going meta on types.)
Going meta basically means taking a step back, and instead of talking about some subject X, you talk about talking about X.
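A minimal sketch of that idea in Python (my own illustration): an ordinary function talks about values, while a higher-order function goes one level up and talks about functions themselves.

```python
def add_tax(price):
    """Ordinary level: a function that talks about prices."""
    return price * 1.08

def logged(fn):
    """Meta level: a function that talks about functions --
    it takes a procedure and returns a new, wrapped procedure."""
    def wrapper(*args):
        result = fn(*args)
        print(f"{fn.__name__}{args} -> {result}")
        return result
    return wrapper

# Going meta: we manipulate the procedure add_tax itself,
# producing a new procedure, without touching any prices.
add_tax_logged = logged(add_tax)
total = add_tax_logged(100)
```

`logged` never mentions prices at all; it's one level of indirection up, talking about talking about them.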
Today’s basics topic was suggested to me by reading a crackpot rant sent to me by a reader. I’ll deal with said crackpot in a different post when I have time. But in the meantime, let’s take a look at axioms.
One thing that I frequently touch on casually as I’m writing this blog is the distinction between continuous mathematics, and discrete mathematics. As people who’ve been watching some of my mistakes in the topology posts can attest, I’m much more comfortable with discrete math than continuous.