# Basics: Real Numbers

What are the real numbers?

Before I go into detail, I need to say up front that I hate the term
"real number". It implies that other kinds of numbers are not real,
which is silly, annoying, and frustrating. But we're pretty much stuck with it.

There are several ways of describing the real numbers. I'm going to take you through three of them: first, an informal, intuitive description; then an axiomatic definition; and finally, a constructive definition.

# Basics: The Turing Machine (with an interpreter!)

As long as I’m doing all of these basics posts, I thought it would be worth
explaining just what a Turing machine is. I frequently talk about things
being Turing equivalent, and about effective computing systems, and similar things, which all assume you have some clue of what a Turing machine is. And as a bonus, I’m also going to give you a nifty little piece of Haskell source code that’s a very basic Turing machine interpreter. (It’s for a future entry in the Haskell posts, and it’s not entirely finished, but it does work!)

The Turing machine is a very simple kind of theoretical computing device. In
fact, it’s almost downright trivial. But according to everything that we know and understand about computation, this trivial device is capable of any computation that can be performed by any other computing device.

The basic idea of the Turing machine is very simple. It’s a machine that runs on
top of a tape, which is made up of a long series of little cells, each of which has a single character written on it. The machine is a read/write head that moves over the tape, and which can store a little bit of information. Each step, the
machine looks at the symbol on the cell under the tape head, and based on what
it sees there, and whatever little bit of information it has stored, it decides what to do. The things that it can do are: change the information it has stored, write a new symbol onto the current tape cell, and move one cell left or right.
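The Haskell interpreter is coming in a later post; in the meantime, here's a minimal sketch of the same idea in Python. The "little bit of information" is the machine's state, and the transition table maps (state, symbol) pairs to (new state, symbol to write, direction to move). The state names, the blank symbol `_`, and the bit-flipping example machine are my own illustrative choices.

```python
# A minimal Turing machine interpreter. The tape is stored sparsely
# as a dict mapping position -> symbol; unwritten cells are blank ('_').

def run(transitions, tape, state, halt_states, max_steps=10_000):
    tape = dict(enumerate(tape))
    pos = 0
    for _ in range(max_steps):
        if state in halt_states:
            break
        symbol = tape.get(pos, '_')
        # One step: look at the current cell, consult the table,
        # update state, write a symbol, and move one cell.
        state, write, move = transitions[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == 'R' else -1
    return ''.join(tape[i] for i in sorted(tape))

# Example machine: flip every bit of a binary string, halting at the
# first blank cell.
flip = {
    ('scan', '0'): ('scan', '1', 'R'),
    ('scan', '1'): ('scan', '0', 'R'),
    ('scan', '_'): ('halt', '_', 'R'),
}
print(run(flip, '1011', 'scan', {'halt'}))  # prints "0100_"
```

Everything the machine "knows" is in that one table: there's no program counter, no memory beyond the tape and the current state.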

That’s really it. People who like to make computing sound impressive often have
very complicated explanations of it – but really, that’s all there is to it. The point of it was to be simple – and simple it certainly is. And yet, it can do
anything that’s computable.

# Basics: Sets

Sets are truly amazing things. In the history of mathematics, they’re
a remarkably recent invention – and yet, they’re now considered to be the
fundamental basis on which virtually all of mathematics is built. From simple things (like the natural numbers), to the most abstract and esoteric things (like algebras, or topologies, or categories), in modern math, they’re pretty much all understood
in terms of sets.
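To make "the naturals are understood in terms of sets" concrete, here's a small sketch of the standard von Neumann construction, where each natural number *is* the set of all smaller naturals: 0 is the empty set, and n+1 is n ∪ {n}. The function name is my own; Python's `frozenset` is used because ordinary sets can't contain sets.

```python
# The von Neumann naturals: 0 = {}, n+1 = n ∪ {n}.

def von_neumann(n):
    result = frozenset()               # 0 is the empty set
    for _ in range(n):
        result = result | {result}    # successor: n ∪ {n}
    return result

zero, one, two = von_neumann(0), von_neumann(1), von_neumann(2)
print(len(two))     # 2 -- the set representing n has exactly n elements
print(one in two)   # True -- m < n exactly when m ∈ n
```

Two pleasant side effects of this encoding: the size of the set *is* the number, and set membership *is* the less-than relation.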

# Basics: Syntax and Semantics

Another great basics topic, which came up in the comments from last Friday's "logic" post, is the
difference between syntax and semantics. This is an important distinction, made in logic, math, and
computer science.

The short version of it is: syntax is what a language looks like; semantics is what
a language means. It’s basically the distinction between numerals (syntax) and
numbers (semantics).
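The numerals/numbers distinction can be made concrete in a few lines. A numeral is just a string of symbols; its meaning is the number it denotes, and two very different strings can denote the same number. The function name here is my own illustration.

```python
# Syntax vs. semantics in miniature: map a numeral (a string) to the
# number it denotes, in a given base.

def meaning(numeral, base=10):
    value = 0
    for digit in numeral:
        value = value * base + int(digit, base)
    return value

print(meaning("12"))            # 12
print(meaning("1100", base=2))  # 12 -- different syntax, same semantics
```

"12" and "1100" are distinct pieces of syntax, but under the right interpretation they mean the same thing; that interpretation function is exactly what a semantics is.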

# Basics: Logic, aka "It's illogical to call Mr. Spock logical"

This is another great basics topic, and it's also one of my pet peeves. In general, I'm a big science fiction fan, and I grew up in a house where every Saturday at 6pm, we all gathered in front of the TV to watch Star Trek. But one thing which Star Trek contributed to our vocabulary, for which I will never forgive Gene Roddenberry, is "Logic". As in, Mr. Spock saying "But that would not be logical."

The reason that this bugs me so much is because it's taught a huge number of people that "logical" means the same thing as "reasonable". Almost every time I hear anyone say that something is logical, they don't mean that it's logical – in fact, they mean something almost exactly the opposite: that it seems correct based on intuition and common sense.

If you're being strict about the definition, then saying that something is logical, by itself, is an almost meaningless statement. Because what it means for a statement to be "logical" is really that the statement is inferable from a set of axioms in some formal reasoning system. If you don't know which formal system, and you don't know which axioms, then the claim that something is logical is absolutely meaningless. And even if you do know what system and what axioms you're talking about, the things that people often call "logical" are not things that are actually inferable from the axioms.
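To show what "inferable from a set of axioms in some formal reasoning system" looks like in practice, here's a toy sketch: propositional forward chaining with modus ponens as the only inference rule. The axioms, rules, and proposition names are all invented for illustration; the point is that "logical" only means something *relative to* this particular system.

```python
# A toy inference engine: repeatedly apply modus ponens
# (if we know P, and we have the rule P -> Q, conclude Q)
# until nothing new can be derived.

def inferable(axioms, rules, goal):
    known = set(axioms)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            if premise in known and conclusion not in known:
                known.add(conclusion)
                changed = True
    return goal in known

axioms = {"it_rains"}
rules = [("it_rains", "ground_wet"), ("ground_wet", "slippery")]
print(inferable(axioms, rules, "slippery"))  # True -- derivable
print(inferable(axioms, rules, "sunny"))     # False -- not derivable
```

"slippery" is logical *in this system*; change the axioms or the rules and it stops being so. That relativity is exactly what casual uses of "logical" leave out.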

# Basics: Correlation

Correlation and Causation

Yet another of the most abused mathematical concepts is the concept of correlation, along with the related (but different) concept of causation.

Correlation is actually a remarkably simple concept, which makes it all the more frustrating
to see the nonsense constantly spewed in talking about it. Correlation is a measure of the linear relationship between two random variables.
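The usual way of quantifying that linear relationship is Pearson's correlation coefficient, which ranges from -1 (perfect negative linear relationship) through 0 (no linear relationship) to +1 (perfect positive). Here's a small sketch computed from first principles; the sample data is made up.

```python
from math import sqrt

# Pearson's r: the covariance of the two samples, normalized by the
# product of their spreads, so the result always lands in [-1, 1].

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # 1.0 -- perfectly linear
print(pearson_r([1, 2, 3, 4], [8, 6, 4, 2]))  # -1.0
```

Note what r does *not* tell you: nothing in that computation says anything about one variable causing the other.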

# Basics: Recursion and Induction

Time for another sort-of advanced basic. I used some recursive definitions in my explanation
of natural numbers and integers. Recursion is a very fundamental concept, but one which many people have a very hard time wrapping their head around. So it’s worth taking the time to look at it, and see what it means and how it works.

The cleverest definition that I've seen of recursion comes from the Hacker's Dictionary. Its entry reads:

```
recursion
  n. See {recursion}.
```
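The joke works because the dictionary entry never bottoms out. A *useful* recursive definition differs in exactly one way: each recursive step works on a strictly smaller input, and a base case stops the descent. A minimal sketch, using the recursive definition of addition I used for the naturals:

```python
# Addition defined recursively: m + 0 = m, and m + (n+1) = (m + n) + 1.
# The recursive call always gets a smaller n, so it must eventually
# hit the base case n == 0.

def add(m, n):
    if n == 0:
        return m               # base case: stops the recursion
    return add(m, n - 1) + 1   # recursive case: strictly smaller n

print(add(3, 4))  # 7
```

Induction is the mirror image of this: to prove something about all naturals, you prove it for the base case, and prove that it's preserved by the step.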

# Basics: Natural Numbers and Integers

One of the interestingly odd things about how people understand math is numbers. It’s
astonishing to see how many people don’t really understand what numbers are, or what different kinds of numbers there are. It’s particularly amazing to listen to people arguing
vehemently about whether certain kinds of numbers are really “real” or not.

Today I'm going to talk about two of the most basic kinds of numbers: the naturals and the integers. This is sort of an advanced basics article; to explain things like natural numbers and integers, you can either write two boring sentences, or you can go a bit more formal. The formal
stuff is more fun. If you don't want to bother with that, here are the two boring sentences:

1. The natural numbers (written N) are zero and the numbers greater than zero
that can be written without fractions.
2. The integers (written Z) are all of the numbers – positive, negative, or
zero – that can be written without fractions.
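For a taste of the more formal version: the naturals can be built from nothing but a zero and a successor operation, with addition defined by recursion – the Peano-style construction. This sketch is my own encoding (nested tuples standing in for applications of the successor); the defining equations are the standard ones.

```python
# Peano-flavored naturals: zero is a primitive, every other natural
# is the successor S(n) of some natural n.

def zero():
    return ()

def succ(n):
    return (n,)                # S(n), encoded by wrapping in a tuple

def to_int(n):
    """Translate back to an ordinary Python int, for display."""
    return 0 if n == () else 1 + to_int(n[0])

def add(m, n):
    if n == ():                    # m + 0 = m
        return m
    return succ(add(m, n[0]))      # m + S(k) = S(m + k)

two = succ(succ(zero()))
three = succ(two)
print(to_int(add(two, three)))  # 5
```

Everything about arithmetic on the naturals can be derived from just those two equations for addition, plus the corresponding pair for multiplication.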

# Basics: Margin of Error

The margin of error is the most widely misunderstood and misleading concept in statistics. It’s positively frightening to people who actually understand what it means to see how it’s commonly used in the media, in conversation, sometimes even by other scientists!

The basic idea of it is very simple. Most of the time when we’re doing statistics, we’re doing statistics based on a sample – that is, the entire population we’re interested in is difficult to study; so what we try to do is pick a representative subset called a sample. If the subset is truly representative, then the statistics you generate using information gathered from the sample will be the same as information gathered from the population as a whole.

But life is never simple. We never have perfectly representative samples; in fact, it's impossible to select a perfectly representative sample. So we do our best to pick good samples, and we use probability theory to work out a prediction of how confident we can be that the statistics from our sample are representative of the entire population. That's basically what the margin of error represents: how well we think that the selected sample will allow us to predict things about the entire population.
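For the most common case – a sampled proportion, like a poll result – the textbook 95% margin of error comes from the normal approximation. A minimal sketch, with made-up poll numbers:

```python
from math import sqrt

# 95% margin of error for a sampled proportion p with sample size n,
# using the normal approximation: z * sqrt(p(1-p)/n), with z = 1.96.

def margin_of_error(p, n, z=1.96):
    return z * sqrt(p * (1 - p) / n)

# Hypothetical poll: 52% of 1000 respondents favor a candidate.
moe = margin_of_error(0.52, 1000)
print(round(moe, 3))  # about 0.031, i.e. roughly ±3 percentage points
```

So "52%, ±3 points" doesn't mean the true value is certainly between 49% and 55% – it means we're 95% confident the sampling procedure put us within that window, and nothing at all is promised about the other 5% of the time.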

# Basics: Standard Deviation

When we look at the data for a population, often the first thing we do
is look at the mean. But even if we know that the distribution
is perfectly normal, the mean alone isn't enough to understand what it's telling us about the population. We also need
to know something about how the data is spread out around the mean – that is, how wide the bell curve is around the mean.

There’s a basic measure that tells us that: it’s called the standard deviation. The standard deviation describes the spread of the data,
and is the basis for how we compute things like the degree of certainty,
the margin of error, etc.
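Computing it is straightforward: the standard deviation is the square root of the average squared distance from the mean. A minimal sketch (this is the *population* standard deviation; the sample version divides by n−1 instead of n):

```python
from math import sqrt

# Population standard deviation: sqrt of the mean squared deviation
# from the mean.

def std_dev(data):
    mean = sum(data) / len(data)
    variance = sum((x - mean) ** 2 for x in data) / len(data)
    return sqrt(variance)

print(std_dev([2, 4, 4, 4, 5, 5, 7, 9]))  # 2.0
```

For a normal distribution, that one number tells you a lot: about 68% of the population falls within one standard deviation of the mean, and about 95% within two – which is where the margin of error's magic multipliers come from.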