A History of Computing

by Hugh Bien — 10/01/2014

Less than two months from today, The Imitation Game is set to release in the US! I'm excited to see Benedict Cumberbatch — of Sherlock fame — play Alan Turing. This got me thinking about the history of computers and wondering about the first computer. This led to a fun evening of reading Wikipedia.

Here's a brief history of computing.

Charles Babbage's Analytical Engine

1822 — Charles Babbage proposed an idea to the Royal Astronomical Society of London: what if he could automate a computer's job?

During this time, a "computer" was a person who computed mathematical calculations. Teams were assembled for the tedious work, and the results were collected into huge data tables for later use. The earliest known computers calculated data about celestial objects — e.g. timing Halley's Comet. But it wasn't a prestigious job. It was the opposite.

The human computer is supposed to be following fixed rules; he has no authority to deviate from them in any detail. — Alan Turing

Babbage's idea intrigued the British government, which also employed human computers. Labor was expensive and error-prone, and an automated device would solve both problems. The government funded Babbage's idea, and he immediately began work on the Difference Engine.

Babbage designed the Difference Engine as a mechanical device that tabulated data for polynomial functions, e.g. y = x² + 4x + 2. Mechanical columns served as input; each could be rotated to represent any decimal number. A hand crank spun the gears to produce the output.
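
The engine's trick was the method of finite differences: work out a polynomial's starting value and differences by hand, and every further value can be produced by addition alone. Here's a rough sketch of that idea in modern Python (my illustration, not Babbage's mechanism), using y = x² + 4x + 2 with starting value 2, first difference 5, and constant second difference 2.

    # Tabulating y = x^2 + 4x + 2 with the method of differences:
    # repeated addition only, no multiplication, much like the engine's columns.
    def tabulate(initial_value, first_difference, second_difference, steps):
        """Yield successive polynomial values using addition alone."""
        y, d1 = initial_value, first_difference
        for _ in range(steps):
            yield y
            y += d1                   # next value = current value + first difference
            d1 += second_difference   # first difference grows by the constant second difference

    for x, y in enumerate(tabulate(2, 5, 2, 6)):
        print(x, y)   # 0 2, 1 7, 2 14, 3 23, 4 34, 5 47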

While designing his Difference Engine, he realized that a more general purpose machine could be made. Instead of inputting only numbers as data, he could input both instructions and data. The instructions could be delivered via punch cards — a technique taken from the mechanical looms of the time. He called this second machine the Analytical Engine, and it's widely regarded as the beginning of computers as we know them today.

Although both engines were fully designed, Babbage was never able to complete construction of either. The machinery was too complicated for its time, and funding was eventually withdrawn.

Ada Lovelace's Algorithm

1833 — Ada Lovelace is introduced to Charles Babbage through a mutual friend. He shows her his work and they quickly become friends.

Lovelace was a mathematician and logician. In 1842, she was hired to translate a French paper, a transcript of a seminar given by Babbage, into English. She included her own notes with the translation.

The Analytical Engine might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations... — Ada Lovelace

Included in her notes were instructions for the Analytical Engine. It was an algorithm for computing the Bernoulli numbers — the first algorithm specifically made for a computer, or the first computer program.
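
Her program was laid out as a table of Analytical Engine operations, so the sketch below is not her algorithm; it's just a modern Python illustration of the same target, generating Bernoulli numbers from the standard recurrence (using the convention where B_1 = -1/2).

    # Compute exact Bernoulli numbers from the recurrence
    #   sum_{j=0}^{m} C(m+1, j) * B_j = 0   (for m >= 1, with B_0 = 1)
    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        """Return B_0 .. B_n as exact fractions."""
        B = [Fraction(1)]                      # B_0 = 1
        for m in range(1, n + 1):
            acc = sum(Fraction(comb(m + 1, j)) * B[j] for j in range(m))
            B.append(-acc / (m + 1))           # solve the recurrence for B_m
        return B

    for i, b in enumerate(bernoulli(6)):
        print(f"B_{i} = {b}")   # B_0 = 1, B_1 = -1/2, B_2 = 1/6, ..., B_6 = 1/42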

Through her work as a mathematician, Ada Lovelace becomes the world's first computer programmer.

Claude Shannon's Digital Circuits

1937 — Claude Shannon finishes his master's thesis at MIT. His paper, A Symbolic Analysis of Relay and Switching Circuits, introduced the world to digital circuits.

As an undergraduate, Shannon took a course on boolean algebra — a branch of logic that deals with the values true/false and operations on those values. If you're a software programmer, you're all too familiar with boolean logic; we use it every day. If you're not a software programmer, you've probably heard that computers work with binary — ones and zeros. Those ones and zeros can also represent true and false.

Shannon showed how you could arrange electrical switches to perform boolean algebra, the underlying concept of digital circuits. It later turned out that using digital circuits to perform math was much more reliable than the mechanical/analog approach.
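
To see why that's powerful, here's a small sketch (my own, not Shannon's) that builds binary addition out of nothing but the boolean operations AND, OR, and XOR, which is essentially what a ripple-carry adder circuit does with switches.

    # Arithmetic from boolean logic alone: each "gate" is just a boolean function,
    # and chaining full adders gives a ripple-carry adder.
    def full_adder(a, b, carry_in):
        """Add three bits using only AND (&), OR (|), and XOR (^)."""
        s = a ^ b ^ carry_in
        carry_out = (a & b) | (carry_in & (a ^ b))
        return s, carry_out

    def add(x_bits, y_bits):
        """Add two equal-length, little-endian lists of bits."""
        result, carry = [], 0
        for a, b in zip(x_bits, y_bits):
            s, carry = full_adder(a, b, carry)
            result.append(s)
        result.append(carry)
        return result

    # 6 (110) + 3 (011), least significant bit first:
    print(add([0, 1, 1], [1, 1, 0]))   # [1, 0, 0, 1] -> binary 1001 = 9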

Konrad Zuse's Z3

1938 — Konrad Zuse finishes his Z1 machine in Berlin, after three years of designing and building it. As with Babbage's engines, input and output were specified as decimal numbers. Unlike Babbage's design, the input was translated to binary before calculations were performed.
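
As a tiny illustration of that decimal-to-binary step, here's a short Python sketch (my own example; it says nothing about how the Z1's mechanics actually did it).

    # Round-tripping a decimal input through a binary representation,
    # the kind of conversion the Z-machines performed before doing arithmetic.
    def to_binary(n):
        bits = []
        while n:
            bits.append(n % 2)   # least significant bit first
            n //= 2
        return bits or [0]

    def from_binary(bits):
        return sum(bit << i for i, bit in enumerate(bits))

    print(to_binary(19))                # [1, 1, 0, 0, 1], i.e. 10011 in binary
    print(from_binary(to_binary(19)))   # 19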

Unfortunately, the Z1 was unreliable in use. Its mechanical parts required precise synchronization, which in practice they couldn't maintain. Zuse used what he had learned to create successors to his machine.

In 1941, Zuse finishes construction of the Z3. Whereas the Analytical Engine was never completed, the Z3 was the world's first working programmable binary computer.

In 1943, Zuse's machines were destroyed during Allied air raids.

Alan Turing's Turing Machine

1939 — exactly one day after the United Kingdom entered World War II, Alan Turing sets foot in Bletchley Park. He begins his cryptography work to decode Germany's encrypted messages.

After World War I, Germany began encoding secret messages using a mechanical device called the Enigma machine. It was an electro-mechanical device with a keyboard for input, and in appearance it resembled a typewriter in a briefcase.

Turing and his team quickly developed a machine to decrypt German messages, which they called the Bombe. By the end of World War II, over 200 Bombes had been constructed. Winston Churchill credited Turing as the single largest contributor to the Allies' victory. But his work during the war wasn't his only contribution. Perhaps an even larger contribution was the Turing machine.

Let's rewind a few years to 1936. Turing publishes his paper, On Computable Numbers, with an Application to the Entscheidungsproblem, which introduces the world to the "a-machine" (it wasn't called a Turing machine until years later).

The Turing machine is a concept, not a physical machine. It consists of:

- an infinitely long tape divided into cells, each holding a single symbol
- a head that reads and writes the symbol in the current cell and can move one cell left or right
- a state register holding the machine's current state
- a finite table of rules that, given the current state and symbol, says what to write, which way to move, and which state to enter next

What Turing describes isn't actually a computer but the act of computation itself. The concept is so important in computer science that the term "Turing complete" was coined for it: a system is Turing complete if it can simulate a Turing machine. For example, a programming language is considered Turing complete if it can manipulate arbitrary amounts of memory and has conditional logic.
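
As a rough illustration (a toy of my own, not anything from Turing's paper), here's a tiny Python simulator for such a machine, running a three-rule program that flips every bit on its tape.

    # A toy Turing machine: a tape, a head, a state, and a rule table.
    # Rules map (state, symbol) -> (symbol to write, direction to move, next state).
    from collections import defaultdict

    def run(rules, tape, state="start", blank="_", max_steps=1000):
        cells = defaultdict(lambda: blank, enumerate(tape))
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            write, move, state = rules[(state, cells[head])]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells))

    # Program: walk right, flipping 0s and 1s, and halt at the first blank cell.
    flip = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }
    print(run(flip, "10110"))   # prints 01001_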

Before World War II, computers were constructed for a single purpose. But after the war, efforts switched to building Turing complete, general purpose computers. Turing himself began designing one in 1945, but it was never built as he designed it.

The United States Army's ENIAC

1947 — the ENIAC is switched on at its permanent home, the Aberdeen Proving Ground. Completed two years earlier, it was the first electronic, Turing complete computer. It took 3 years and over $500,000 to construct.

The ENIAC, or Electronic Numerical Integrator And Computer, was primarily designed to calculate mathematical tables used in military artillery. It was built using thousands of vacuum tubes and crystal diodes. Punch cards were used for input and output.

Programming the ENIAC was incredibly complex by today's standards. After creating an algorithm, a team of engineers had to translate the instructions onto the machine by flipping switches and re-routing cables. The process usually took days.

The ENIAC operated for 8 years until October 2, 1955.

So... Which Was the First Computer?

It depends on how you define a computer.

Charles Babbage's early work on the Analytical Engine could be considered the first computer, except it was never completed. If you're ever in Mountain View, I highly recommend visiting the Computer History Museum, where a completed Difference Engine built from Babbage's designs is on display.

Konrad Zuse's Z3 is often cited as the world's first computer. It was fully operational and programmable. However, it wasn't fully electronic and wasn't designed to be Turing complete; it wasn't until 1998 that it was shown to be Turing complete, and only with a few hacks.

The ENIAC could also be considered the first computer. It was functional, electronic, and Turing complete. What more could you ask for?

But if anyone asks which was the first computer, just tell them it was the abacus.
