Monthly Archives: February 2009

Stepping Up Division By Zero to Perfect Encryption

An alert reader sent me a link to a really dreadful piece of drek. In some ways, it’s a
rehash of the “Nullity” nonsense from a couple of years ago, but with a new spin.

If you don’t remember nullity, it was the attempt of one idiot to define division by zero.
He claimed to have “solved” the great problem of dividing by zero, and by doing so, to be able
to do all manner of amazing things, such as to build better computers that would be less prone
to bugs.

Today’s garbage is in the
same vein: another guy, this one named Jeff Cook, who claims to have “solved” the problem of
division by zero. But this one claims that this gives him a way to prove the Riemann
hypothesis, to rapidly crack RSA public key encryption, and to devise a new “theoretically
unbreakable” encryption algorithm.

The grandiosity of this Mr. Cook is astonishing. He’s started a company (which is looking
for investors!); here’s a quote from his company’s homepage:

Great scientific discoveries mark the milestones of human history.

Such are the accomplishments achieved by the men and women of Singularics. Standing on the
shoulders of giants such as Albert Einstein and Bernhard Riemann, we have reached up through
nature’s veil and seen what lies hidden there more clearly than anyone else before us. Our
discoveries have yielded a new mathematical framework, one that provides a profound
understanding of nature’s basic mechanics. We have discovered The Science of
Singularics™
, the study of the singularity.

We have already found a variety of important applications of Singularic Technology™,
but perhaps the most immediately useful are Neutronic Encryption™, a new theoretically
unbreakable public key encryption algorithm and Singularic Power™, a new form of clean
power generation.

Neutronic Encryption, our next generation public key encryption algorithm, will play a
vital role in the digital age by ensuring that the electronic information of governments,
industry and individuals is kept secure and private in a world where cyber-terrorism is on the
rise.

We have also developed a new primary power generation system capable of delivering
abundant, clean and inexpensive energy that can satisfy power requirements on any scale.
Singularic Power production technology generates zero pollution and can therefore play an
instrumental role in promoting a harmonious coexistence between human civilization and the
Earth’s fragile ecosystem.

To date, our analysis of the mathematics and physics at the singularity has lead us to
eight important new inventions, most notably in the fields of information security and clean
energy. All eight inventions (patents pending), have significant and immediate application in
the global market.

It is our vision to use these advances to bring about great improvements for everyone
through new technology, intelligently applied.

Mr. Cook doesn’t have too high an opinion of himself, does he?

Continue reading

Gap Buffers, or, Don't Get Tied Up With Ropes?

Last week, I promised to continue my discussion of ropes. I’m going to
break that promise. But it’s in a good cause.

If you’re a decent engineer, one of the basic principles you should
follow is keeping things as simple as possible. One of the essential skills
for keeping things simple is realizing when you’re making something
overly complicated – even if it’s cool, and fun, and interesting.

Working out the rope code for more complex operations, I found myself
thinking that it really seemed complex for what I was doing. What I’m
doing is writing myself a text editor. I’m not writing a string
processing framework for managing gigabytes of data; I’m just writing a
text editor.

So I decided to try, for the purposes of testing, to look at one of
the simpler data structures. A classic structure for implementing editor
buffers is called a gap buffer. I’ll explain the details below.
The drawback of a gap buffer is that large cursor motions are relatively
expensive – the cost of moving the cursor is proportional to the distance
that you move it.
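To make the idea concrete, here's a minimal sketch of a gap buffer in Java. This is my own illustration, not the actual editor code: the class name, the doubling growth strategy, and the method names are all invented for the example. The text is stored in one array with a "gap" of unused cells at the cursor; inserting a character just fills one cell of the gap, while moving the cursor shifts characters across the gap one at a time, which is why cursor motion costs are proportional to the distance moved.

```java
// A minimal gap-buffer sketch (illustrative only; names and sizing
// strategy are hypothetical, not from the editor described above).
public class GapBuffer {
    private char[] buf;
    private int gapStart; // the cursor; the gap occupies [gapStart, gapEnd)
    private int gapEnd;

    public GapBuffer(int capacity) {
        buf = new char[capacity];
        gapStart = 0;
        gapEnd = capacity;
    }

    // Number of characters of actual text (total minus the gap).
    public int length() {
        return buf.length - (gapEnd - gapStart);
    }

    // Insert at the cursor: O(1) unless the gap is empty and we must grow.
    public void insert(char c) {
        if (gapStart == gapEnd) grow();
        buf[gapStart++] = c;
    }

    // Move the cursor to a text position by shifting characters across
    // the gap one at a time: cost proportional to the distance moved.
    public void moveCursor(int pos) {
        while (gapStart > pos) buf[--gapEnd] = buf[--gapStart]; // shift left
        while (gapStart < pos) buf[gapStart++] = buf[gapEnd++]; // shift right
    }

    // Delete the character before the cursor: just widen the gap, O(1).
    public void delete() {
        if (gapStart > 0) gapStart--;
    }

    // Deliberately simple doubling-and-copying growth strategy.
    private void grow() {
        char[] bigger = new char[Math.max(1, buf.length * 2)];
        System.arraycopy(buf, 0, bigger, 0, gapStart);
        int tailLen = buf.length - gapEnd;
        System.arraycopy(buf, gapEnd, bigger, bigger.length - tailLen, tailLen);
        gapEnd = bigger.length - tailLen;
        buf = bigger;
    }

    @Override
    public String toString() {
        return new String(buf, 0, gapStart)
             + new String(buf, gapEnd, buf.length - gapEnd);
    }
}
```

The appeal is obvious from the code: the common editing operations (type a character, delete a character) touch only the edges of the gap, so they're constant time, and only cursor motion pays the linear cost.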

So I wrote a test. I implemented a basic gap buffer. Then I built a
test that did 300,000 inserts of five characters; then slid the cursor
from the end, to the beginning, back to the end, back to the beginning,
and back to the end again. Total execution time? 150 milliseconds. And
that includes resizing the buffer 12 times, because of a deliberately
stupid buffer sizing and copying strategy.

And with that, out the window went the ropes. Because in Java, even with a
slapped-together, sloppy piece of test code that does things in a dumb
brute-force way, on a workload far more demanding than anything an editor
will encounter in routine use, the gap buffer achieves performance that is more than
adequately fast. (Based on some experiments I did back at IBM, delays of 1/10th
of a second are roughly when people start to notice that an editor is slow. If you can respond in less than 1/10th of a second, people don’t perceive a troublesome delay.)
If the gap buffer can achieve the needed performance that easily, then there’s
absolutely no reason to do anything more complicated.

Continue reading

Financial Morons, and Quadratics vs. Linears

I wasn’t going to write about this, because I really don’t have much to add. But people keep mailing it to me, so in order to shut you all up, I’ll chip in.

As everyone knows by now, we’re in the midst of a really horrible
financial disaster. I’ve argued in the past on this blog that the root cause of the entire disaster is pure, simple stupidity on the part of people in the financial business. People gave out mortgages that any
sane rational person would have considered ridiculous. And then they built huge, elaborate financial structures on top of those mortgages, pretending that by somehow piling layer upon layer, loan upon loan, that
they were somehow creating something that could be considered real wealth.

[Figure: bankcap6-1024x747.jpg, the JP Morgan chart discussed below]

They gave themselves bonuses that boggled the mind. Even after the whole ridiculous system came tumbling down, they continue to give themselves ridiculous bonuses. Insane bonuses. They’ve been writing themselves checks for millions of dollars to continue to operate their
businesses – even after taking billions of dollars in loans from the government to prevent them from going out of business. I consider
this to be downright criminal. But even if it’s not criminal, it’s
incredibly stupid. The very people who ran those firms right to the edge of bankruptcy, who nearly took down our entire financial system
are being rewarded. Not only are they being allowed to continue
to run the businesses that they pretty much destroyed, but they’ve
been paying themselves an astonishing amount of money to do it. And now they’re complaining bitterly about the fact that the government
wants to limit them to a paltry half-million dollars of salary per year.

They argue that they must be allowed to earn more than that. Because after all, the people who run those businesses are special. They’re “the best and the brightest”. They’re
extra-smart. No one else could possibly run those businesses. We can’t rely on anyone who’d accept a puny half-mil – they won’t be smart enough. They don’t have the special knowledge of the business that these people do.

There’s one minor problem with that argument: it doesn’t work. A couple of weeks ago, some idiot at JP Morgan circulated a chart that was supposed to summarize just how bad the financial disaster has been. The chart circulated for a couple of weeks – bounced from mailbox to mailbox, sent from one financial genius to another.

Only the chart was blatantly, obviously, trivially wrong, and anyone who had the slightest damned clue of the assets those businesses managed – i.e., the kind of thing that the idiot who drew the chart was supposed to know – should have been able to tell at a glance how wrong it was. But they didn’t. In fact, the damned thing didn’t stop circulating until (of all people) Bob Cringely
flamed it. Go look at the chart – it’s up at the top of this post.

Continue reading

It Never Stops: Another Silly Creationist Argument

As I’ve mentioned before, I get lots of email from all sorts of people. Lots of it is interesting, and lots of it is stupid. This morning, when I was checking my mail, I found an email from a creationist in my mailbox, which puts forth a “proof” against atheism that I hadn’t seen before. It’s about as idiotic as most creationist arguments are,
but it’s one that I’ve never seen before, and it’s interesting to shred it from the viewpoint of mathematical logic.

Here’s his argument:

Now to the main point, and somewhat more interesting stuff. I recently ran across a proof (perhaps not in the mathematical sense; I don’t know) against atheism. Atheism, both in the forms I’ve encountered and in logical necessity, requires that matter/energy is eternal, since the atheist would by necessity argue against any point at which matter/energy came into existence. Nothing does not create something. For anything to be eternal and be in the present, it must by necessity have come from eternity past. However, between eternity past and the present, there is an infinite amount of time. Therefore, traveling forward from eternity past, one could never reach the present day. Therefore, atheism cannot be true, and a definite point at which matter/energy came into existence is necessary.

What’s wrong with this? Where to begin?

Continue reading

Metric Abuse – aka Lying with Statistics

I’m behind the curve a bit here, but I’ve seen and heard a bunch of
people making really sleazy arguments about the current financial stimulus
package working its way through congress, and those arguments are a perfect
example of one of the classic ways of abusing statistics. I keep mentioning metric errors – this is another kind of metric error. The difference between this and some of the other examples that I’ve shown is that this is deliberately dishonest – that is, instead of accidentally using the wrong metric to get a wrong answer, in this case, we’ve got someone deliberately taking one metric, and pretending that it’s an entirely different metric in order to produce a desired result.

As I said, this case involves the current financial stimulus package that’s working its way through congress. I want to put politics aside here: when it comes to things like this financial stimulus, there’s plenty of room for disagreement.
Economic crises like the one we’re dealing with right now are really uncharted territory – they’re very rare, and the ones that we have records of have each had enough unique properties that we don’t have a very good collection of evidence
to use to draw solid conclusions about how recoveries from them work. This isn’t like
physics, where we tend to have tons and tons of data from repeatable experiments; we’re looking at a realm where there are a lot of reasonable theories, and there isn’t enough evidence to say, conclusively, which (if any) of them is correct. There are multiple good-faith arguments that propose vastly different ways of trying
to dig us out of this disastrous hole that we’re currently stuck in.

Of course, it’s also possible to argue in bad faith, by
creating phony arguments. And that’s the subject of this post: a bad-faith
argument that presents real statistics in misleading ways.

Continue reading

Sloppy Arguments about Mutation Rates

My friend Razib, who is one of my
fellow ScienceBloggers, sent me a
link to an interesting attempt
by a creationist at arguing why evolution can’t
possibly work.

I say interesting, because it’s at least a little bit unusual in
its approach. It’s not just the same old regurgitation of the same talking
points. It’s still not a great argument, but it’s at least not as boring as
staring at the same stupid arguments over and over again. Alas, it’s not entirely
new, either. It’s an argument that the mutation rates required for humans to have evolved from a primate ancestor would have to be impossibly high.

Continue reading