Category Archives: Basics

Not Quite Basics: Sorting Algorithms

Multiple people have written to me, after seeing yesterday’s algorithms basics post, asking me to say more about sorting algorithms. I have to say that it’s not my favorite topic – sorting is one of those old bugaboos that you can’t avoid, but which gets really dull after a while. But there is a kernel of interest to it – sorting can be used to demonstrate a lot of interesting ideas about computational complexity.
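
To give a rough taste of what I mean, here’s a toy Python sketch – the instrumented insertion sort and the comparison counting are purely my own illustration – showing how the number of comparisons a simple quadratic sort makes grows with the size of the input.

```python
# Count how many comparisons a simple quadratic sort (insertion sort) makes
# as the input grows. The counting wrapper is just for this sketch.
import random

def insertion_sort(items):
    """Sort a list in place, returning the number of comparisons made."""
    comparisons = 0
    for i in range(1, len(items)):
        j = i
        while j > 0:
            comparisons += 1
            if items[j - 1] > items[j]:
                items[j - 1], items[j] = items[j], items[j - 1]
                j -= 1
            else:
                break
    return comparisons

for n in (100, 200, 400, 800):
    data = [random.random() for _ in range(n)]
    print(n, insertion_sort(data))   # comparisons grow roughly as n^2
```

Doubling the input size roughly quadruples the comparison count – which is the kind of observation that leads straight into talking about computational complexity.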

Continue reading

Basics: Algorithm

A kind reader pointed out that I frequently mention algorithms, but that I haven’t defined them in the basics posts. To me, they’re so fundamental to the stuff I do for a living that I completely forgot that they’re an interesting idea.

It’s really pretty simple: an algorithm is a description of a mechanical procedure for performing a mathematical task. But that’s actually a fairly sophisticated idea – the general concept of things that describe a procedure, and which can be discussed, described, and reasoned about as entities in their own right, is something quite different from nearly anything that came before it.
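
For a concrete picture, Euclid’s procedure for finding the greatest common divisor is probably the oldest famous example of an algorithm; here’s a minimal Python sketch of it (my own illustration).

```python
# Euclid's algorithm: a purely mechanical procedure that, followed step by
# step, always produces the greatest common divisor of two numbers.
def gcd(a, b):
    """Return the greatest common divisor of two non-negative integers."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(1071, 462))   # 21
```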

Continue reading

Basics: Optimization

Yet another term that we frequently hear, but which is often not properly understood, is the concept of optimization. What is optimization? And how does it work?

The idea of optimization is quite simple. You have some complex situation, where some variable of interest (called the target) is based on a complex relationship with some other variables. Optimization is the process of trying to find an assignment of values to the other variables (called parameters) that produces a maximum or minimum value of the target variable, called the optimum or optimal value.

The practice of optimization is quite a bit harder. It depends greatly on the relationships between the other variables. The general process of finding an optimum is called programming – not like programming a computer; the term predates the invention of computers.
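
As a toy illustration of the idea – the target function and the brute-force search here are made up purely for the example, and aren’t any serious optimization method – here’s a Python sketch that searches a grid of parameter values for the one minimizing a simple target:

```python
# The target is a simple function of one parameter; we search a grid of
# candidate parameter values for the one that minimizes it.
def target(x):
    return (x - 3.0) ** 2 + 1.0   # minimized at x = 3, where the target is 1

candidates = [i * 0.01 for i in range(0, 1001)]   # parameter values 0.00 .. 10.00
best = min(candidates, key=target)
print(best, target(best))   # roughly x = 3.0, target = 1.0
```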

Continue reading

Not Quite Basics: The Logician’s Idea of Calculus

In yesterday’s basics post, I alluded to the second kind of calculus – the thing that computer scientists like me call a calculus. Multiple people have asked me to explain what our kind of calculus is.

In the worlds of computer science and logic, calculus isn’t a particular thing: it’s a kind of thing. A calculus is a sort of logician’s automaton: a purely symbolic system where there is a set of rules about how to perform transformations of any valid string of symbols. The classic example is lambda calculus, which I’ve written about before, but there are numerous other calculi.
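
To give a rough feel for it, here’s a deliberately naive Python sketch of a tiny piece of lambda calculus: terms as nested tuples, and a single rewrite rule (beta reduction). It ignores variable capture entirely, so it’s only a toy, not a real implementation.

```python
# Terms are nested tuples:
#   ('var', name)          -- a variable
#   ('lam', name, body)    -- a lambda abstraction
#   ('app', fn, arg)       -- an application

def substitute(term, name, value):
    """Replace free occurrences of `name` in `term` with `value` (naive: no capture handling)."""
    kind = term[0]
    if kind == 'var':
        return value if term[1] == name else term
    if kind == 'lam':
        if term[1] == name:      # the binder shadows `name`, so stop here
            return term
        return ('lam', term[1], substitute(term[2], name, value))
    return ('app', substitute(term[1], name, value), substitute(term[2], name, value))

def beta_step(term):
    """Apply the one rewrite rule -- beta reduction -- at the top of the term, if it applies."""
    if term[0] == 'app' and term[1][0] == 'lam':
        _, param, body = term[1]
        return substitute(body, param, term[2])
    return term

# (lambda x. x) applied to y reduces to y
identity = ('lam', 'x', ('var', 'x'))
print(beta_step(('app', identity, ('var', 'y'))))   # ('var', 'y')
```

The important point isn’t the code itself – it’s that the whole system is nothing but symbols and rules for rewriting them.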

Continue reading

Basics: Calculus

Calculus is one of the things that’s considered terrifying by most people. In fact, I’m sure a lot of people will consider me insane for trying to write a “basics” post about something like calculus. But I’m not going to try to teach you calculus – I’m just going to try to explain very roughly what it means and what it’s for.

There are actually two different things that we call calculus – but most people are only aware of one of them. There’s the standard pairing of differential and integral calculus; and then there’s what we computer science geeks call a calculus. In this post, I’m only going to talk about the standard one; the computer science kind of calculus I’ll write about some other time.

Continue reading

Basics: Limits

One of the fundamental branches of modern math – differential and integral calculus – is based on the concept of limits. In some ways, limits are a very intuitive concept – but the formalism of limits can be extremely confusing to many people.

Limits are basically a tool that allows us to get a handle on certain kinds of equations or series that involve some kind of infinity, or some kind of value that is almost defined. The informal idea is very simple; the formalism is also pretty simple, but it’s often obscured by so much jargon that it’s hard to relate it to the intuition.
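
As a quick numerical illustration (my own example, not a formal treatment): sin(x)/x is undefined at x = 0, but its value gets arbitrarily close to 1 as x gets close to 0 – which is exactly what we mean by saying that its limit at 0 is 1.

```python
# sin(x)/x has no value at x = 0, but as x approaches 0, the ratio
# approaches 1 -- the limit exists even though the value doesn't.
import math

for x in (0.1, 0.01, 0.001, 0.0001):
    print(x, math.sin(x) / x)   # values approach 1.0
```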

Continue reading

Basics: Algebra

While I was writing the vectors post, when I commented about how math geeks always build algebras around things, I realized that I hadn’t yet written a basics post explaining what we mean by algebra. And since it isn’t really what most people think it is, it’s definitely worth taking the time to look at.

Algebra is the mathematical study of a particular kind of structure: a structure created by taking a set of (usually numeric) values, and combining it with some operations that operate on values of the set.
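
As a tiny sketch of what “a set of values plus operations on that set” looks like, take the integers mod 5 with addition; the little Python check below (purely illustrative) brute-forces two of the usual algebraic properties, closure and associativity, over the whole set.

```python
# The integers mod 5, with addition as the operation on the set.
values = range(5)

def add(a, b):
    return (a + b) % 5

# Closure: combining any two values gives another value in the set.
closed = all(add(a, b) in values for a in values for b in values)
# Associativity: grouping doesn't matter.
associative = all(add(add(a, b), c) == add(a, add(b, c))
                  for a in values for b in values for c in values)
print(closed, associative)   # True True
```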

Continue reading

Basics: Vectors, the Other Dimensional Number

There’s another way of working with number-like things that have multiple dimensions in math, which is very different from the complex number family: vectors. Vectors are much more intuitive to most people than the complex numbers, which are built using the problematic number i.

A vector is a simple thing: it’s a number with a direction. A car can be going 20mph north – 20mph north is a vector quantity. A 1 kilogram object experiences a force of 9.8 newtons straight down – 9.8N down is a vector quantity.
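
Here’s a little Python sketch of the “number with a direction” idea, storing a 2D vector as (x, y) components; the helper functions are just for this illustration.

```python
# A 2D vector as (x, y) components: its length is the "number", and its
# direction can be recovered from the components.
import math

def magnitude(v):
    return math.hypot(v[0], v[1])

def direction_degrees(v):
    return math.degrees(math.atan2(v[1], v[0]))

def add(v, w):
    return (v[0] + w[0], v[1] + w[1])

north = (0.0, 20.0)    # 20 mph due north
east = (15.0, 0.0)     # 15 mph due east
combined = add(north, east)
print(magnitude(combined), direction_degrees(combined))
# 25.0 mph, about 53 degrees north of east
```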

Continue reading

Basics: Multidimensional Numbers

When we think of numbers, our intuitive sense is to think of them in terms of quantity: counting, measuring, or comparing quantities. And that’s a good intuition for real numbers. But when you start working with more advanced math, you find out that those numbers – the real numbers – are just a part of the picture. There’s more to numbers than just quantity.

As soon as you start doing things like algebra, you start to realize that there’s more to numbers than just the reals. The reals are limited – they exist in one dimension. And that just isn’t enough.
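
One quick way to see the limitation (just an illustrative example): the equation x² + 1 = 0 has no real solution at all, but as soon as you allow numbers with a second dimension, it does.

```python
# No real number squares to -1, but the complex numbers -- numbers with two
# components -- handle it without any trouble.
import cmath

root = cmath.sqrt(-1)        # the imaginary unit, 1j
print(root, root ** 2 + 1)   # 1j, 0j
```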

Continue reading

Basics: The Halting Problem

Many people would probably say that things like computability and the halting problem aren’t basics. But I disagree: many of our basic intuitions about numbers and the things that we can do with them are actually deeply connected with the limits of computation. This connection of intuition with computation is an extremely important one, and so I think people should have at least a passing familiarity with it.

In addition to that, one of the recent trends in crappy arguments from creationists is to try to invoke ideas about computation in misleading ways – but if you’re familiar with what the terms they’re using really mean, you can see right through their silly arguments.

And finally, it really isn’t that difficult to understand the basic idea. Really getting it in all of its details is a bit harder, but to grasp the basic idea that there are limits to computation, and to get a sense of just how amazingly common uncomputable things are, you don’t need to understand the depths of it.
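
To give a flavor of that basic idea, here’s a sketch of the classic contradiction, with a hypothetical halts function that nothing in Python (or anywhere else) actually provides:

```python
# Suppose, hypothetically, that halts(f) could always tell us whether
# calling f() would eventually stop. Then `contrary` breaks it: if halts says
# "contrary halts", contrary loops forever; if it says "contrary loops",
# contrary stops immediately. Either answer is wrong, so no such general
# halting checker can exist.
def halts(f):
    """Hypothetical -- no correct implementation of this can exist."""
    raise NotImplementedError

def contrary():
    if halts(contrary):
        while True:       # halts said we stop, so loop forever
            pass
    # halts said we loop forever, so stop immediately
```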

Continue reading