The History of Computers


Computers became a truly revolutionary invention during the latter part of the 20th century. They have helped win wars, solve seemingly insoluble problems, save lives and launch us into space. Their history spans more than 2,500 years, back to the creation of the abacus. This resource provides a brief history of computers.


The difference between the abacus and a modern-day computer is massive: the abacus needs a human operator, whereas a computer needs no constant input because it counts using binary code. There are similarities between the two, however; both make repeated calculations. This resource traces the history of computers in depth and shows how far we’ve come since the earliest computing devices.

The Abacus

The abacus (plural: abaci), also called the counting frame, was the earliest aid for mathematical computation. It is often speculated to have originated in the Middle East around 500 BC, although its exact origin is still unknown. Nevertheless, it remained the fastest tool for performing calculations until the middle of the 17th century.

A modern abacus, still sometimes used in the Far East, consists of beads sliding on rods. The earliest counting devices, however, used pebbles. The Latin word for pebble is calculus, which is where the term calculator originated.

the abacus was the first computer
A modern-day abacus being used by a child.
Image source: Adobe Stock

John Napier & Napier's Bones

In the early 17th century, John Napier, a Scottish mathematician, invented another calculating tool called Napier’s Bones. It used marked strips of wood or bone, laid side by side, to multiply and divide.

The Pascaline

In 1642, an eighteen-year-old French scientist, writer and philosopher, Blaise Pascal (1623–1662), invented the first mechanical calculator. It was named the Pascaline.

It was created as an aid for Pascal’s father, who was a tax collector. Pascal built about fifty of these gear-driven calculating machines but managed to sell only a few: they were expensive, they could only add and subtract, and they weren’t entirely accurate.

the pascaline or pascal's computer
A Pascaline, signed by Pascal in 1652.
Image source: Wikipedia

Gottfried Wilhelm Leibniz and The Stepped Reckoner

A few decades later, a German philosopher and mathematician named Gottfried Wilhelm Leibniz (1646–1716) conceived a machine similar to Pascal’s, but much more advanced. It used a stepped drum instead of cogs, and beyond adding and subtracting it could also multiply, divide and even extract square roots.

In 1673, Leibniz presented his calculator, also referred to as The Stepped Reckoner. Bookkeepers and mathematicians used it.

Replica of Leibniz's stepped reckoner in the Deutsches Museum

Replica of Leibniz's stepped reckoner in the Deutsches Museum.
Image source: Wikipedia

The Binary Code

Leibniz is also remembered for his contribution to computing because he developed the binary number system. He never put binary to practical use himself, but his work prompted later scientists and inventors to consider the various ways in which it could be applied. Leibniz’s code of ones and zeros would eventually become the language that modern computers use to make their calculations.
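Leibniz’s insight was that any number can be written with only ones and zeros. A minimal Python sketch (illustrative only, not part of the original article) shows how a decimal number is converted to binary by repeated division by two:

```python
def to_binary(n):
    """Represent a non-negative integer as a string of ones and zeros."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # the remainder is the lowest binary digit
        n //= 2
    return "".join(reversed(bits))

print(to_binary(13))  # 1101, i.e. 8 + 4 + 0 + 1
```

Python’s built-in `bin()` performs the same conversion; the loop above simply makes the repeated-division idea explicit.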

Boolean Logic

In 1854, an Englishman named George Boole (1815–1864) made a revolutionary discovery: a radically simple system now known as Boolean logic. There is hardly a piece of digital technology around today that doesn’t rely on it.

Combined with binary code, Boolean logic lets computers make simple decisions by comparing ones and zeros. It is essentially an on/off switch: on is one, off is zero, and that’s all you need. Boole was also one of the first people to propose that the way we think operates by logic.
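The on/off switch idea can be made concrete. Here is a small illustrative Python sketch (the function names are my own, not standard terminology from the article) of the three basic Boolean operations on ones and zeros, and of how a more complex comparison such as exclusive-or is built out of them:

```python
def AND(a, b):
    """1 only if both inputs are 1 (both switches on)."""
    return a & b

def OR(a, b):
    """1 if at least one input is 1."""
    return a | b

def NOT(a):
    """Flip the switch: 1 becomes 0, 0 becomes 1."""
    return 1 - a

def XOR(a, b):
    """Exclusive or, the heart of binary addition, built from the gates above."""
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

print(XOR(1, 0))  # 1
print(XOR(1, 1))  # 0
```

Every circuit in a modern processor, however complex, reduces to combinations of gates like these.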

The Analytical Engine

The central figures of 19th-century computing were the mathematicians and computer pioneers Charles Babbage (1791–1871) and Ada Lovelace (1815–1852), the daughter of the British poet Lord Byron.

Charles Babbage is said to have pioneered the modern computer age with his Difference Engines, which mechanized arithmetic, and his Analytical Engine, which was designed for general-purpose automatic computing.

However, his designs had limited influence on succeeding generations. Babbage, together with his close collaborator Ada Lovelace, produced a clear commentary on the potential of the Analytical Engine. This was an introduction to what we now call programming.

The Beginning of What We Now Know As IBM

Toward the latter half of the 19th century, other inventors envisioned bigger and better calculating engines, and some had more success than Babbage. One such man was the American inventor Herman Hollerith (1860–1929), who developed an electromechanical machine called the Tabulator to help compile the United States census.

During the 1880s, the population of the United States had grown so much that tabulating the census data by hand was taking as long as seven and a half years. It was feared that if this growth continued, one census would not be compiled before the next one began.

The Hollerith Tabulator was a roaring success: the machine produced the initial count in only six weeks, and the full analysis was completed in under three years. The United States government saved five million dollars.

In 1896 Hollerith founded his own company, the Tabulating Machine Company. In 1911 it merged into the Computing-Tabulating-Recording Company, which in 1924 took its present name, IBM, standing for International Business Machines.

The Differential Analyzer

As IBM was taking off, a young scientist named Vannevar Bush (1890–1974), already the inventor of a new surveyor’s tool, earned his doctorate in electrical engineering from the Massachusetts Institute of Technology in a single year.

It was the start of an astonishing career. Bush played a role in the rise of radio, the building of the atomic bomb and the beginning of the digital age.

While teaching at MIT, he kept up his inventing. Although the S-tube was not his own invention, Bush refined it, helping turn the new technology of home radio into a simple plug-in device.

In the 1930s, Bush developed a room-filling machine called the Differential Analyzer, a mechanical computer that Bush said represented the ability to think straight. It was an outstanding machine, but it was not to be the only key player in the history of computing.

Woman using the differential analyzer
A differential analyzer at the NACA Lewis Flight Propulsion Laboratory, 1951
Image source: Wikipedia

The Theory of Digital Communication and Storage

A significant moment in the history of computers

In 1948 Claude Shannon (1916–2001) set a challenge for engineers and paved the way for the compact disc, the fax machine and MP3 files with his basic theory of digital communication and storage.

Shannon called it information theory. Over the following fifty years, engineers took up the challenge and built out the rest of digital technology. The whole idea of digitizing things, and of storing, downloading and uploading them, comes from Shannon’s landmark work.

Shannon also contributed to the early development of integrated circuits, computers, cryptography and genomics. Few people today know of Claude Shannon, yet he is one of the most influential figures of the 20th century.

The Imagined Turing Machine Model

Pure mathematics has solved many practical problems. One mathematician in particular, Alan Turing (1912–1954), was perhaps one of the greatest mathematical minds of the 20th century.

He was also a cryptographer and a pioneer of computer science. Turing is best known today for his part in breaking the German Enigma code during World War II, by which time he was already well established as a mathematician of extraordinary ability.

Turing’s groundbreaking thesis concerned a hypothetical machine that would read symbols on a strip of tape, rewriting or deleting them according to a finite set of rules. At the time, the word “computer” referred to a person performing such calculations by hand; Turing’s machine was an idealized model of that human computer.

At the age of 16, Alan Turing
Alan Turing at age 16.
Image source: Wikipedia

When this machine was given a problem to compute, it would either stop and give the answer, or carry on forever if no answer existed.

Turing proved mathematically that there is no general way to know in advance whether the machine will ever stop. He gave definitive examples of undecidable problems, showing that some areas of mathematics will forever remain beyond complete resolution. The imagined Turing machine model went on to become one of the most influential mathematical abstractions in computer science.
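To make the abstraction concrete, here is a minimal, illustrative Turing machine simulator in Python (a sketch of the model described above, not Turing’s own formulation): a tape of symbols, a read/write head, and a finite rule table mapping (state, symbol) to (symbol to write, direction to move, next state). The example machine flips every bit on the tape and halts when it reaches a blank:

```python
def run(tape, rules, state="start", halt="halt", max_steps=10_000):
    """Simulate a Turing machine; '_' is the blank symbol off the tape's end."""
    cells = list(tape)
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = cells[head] if 0 <= head < len(cells) else "_"
        write, move, state = rules[(state, symbol)]
        if 0 <= head < len(cells):
            cells[head] = write
        else:
            cells.append(write)  # grow the tape (this sketch only grows rightward)
        head += 1 if move == "R" else -1
    return "".join(cells).strip("_")

# Rule table: flip each bit, move right, halt on blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run("1011", flip))  # 0100
```

The `max_steps` cap is a practical concession to the halting problem: since we cannot decide in general whether a machine will stop, the simulator simply gives up after a fixed number of steps.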

The Harvard Mark I

In August 1944, IBM introduced the Automatic Sequence Controlled Calculator, also known as the ASCC, the largest electromechanical calculator in the world, designed by Howard Aiken (1900–1973).

It was a general-purpose computer used during the later stages of World War II. Aiken had presented the concept to IBM in November 1937, and Thomas Watson Sr. approved the project and its funding in February 1939.

It was officially presented to Harvard University on the 7th of August 1944, and the Automatic Sequence Controlled Calculator became known as the Harvard Mark I among the university’s staff.

The Harvard Mark I was an enormous machine: 51 feet long, it used 500 miles of wire with well over 3 million connections and thousands of switches.

The computer programmers of the Harvard Mark I were Richard Milton Bloch (1921–2000), Robert Campbell and Grace Murray Hopper (1906–1992). The Harvard Mark I worked at such speed that in 1944 it was put to work on mathematical computations for the Manhattan Project’s atomic research.

After 15 years in service, the Harvard Mark I was officially retired in 1959. Portions of it survive today and can be found at Harvard University.

input and output details of the Harvard Mark I
Input/output and control readers closeup. Image source: Wikipedia.
Left end of the Harvard Mark I
The left end consisted of electromechanical computing components.
Image source: Wikipedia.
Right end of the Harvard Mark I
The right end included data and program readers, and automatic typewriters.
Image source: Wikipedia.

Steve Wozniak & Steve Jobs: The Start of Apple

Today’s leading inventors, engineers and entrepreneurial designers are known all around the world for their contributions to the personal home computer.

They have launched some amazing products and, with businesses worth billions of dollars, have given us a technology that the likes of Charles Babbage could only have dreamt of. Among the leaders of the computing world are Steve Wozniak (born August 11, 1950), Steve Jobs (1955–2011) and Bill Gates (born October 28, 1955).

Steve Wozniak started computing from very humble beginnings. He had almost no money, and his friend Steve Jobs helped him along the way. Wozniak had the idea to build a PC board, and Jobs persuaded him to start a company.

That was the premise on which they co-founded Apple Computer. Then they got lucky: a local store wanted to buy their computers, fully assembled with all the parts.

The store offered $50,000 for an order of their computers, and after that they were in business. However, this was a short-term product, and they knew they needed something better if they were to make an impact on the world of computing.

The Apple I lacked basic features that we take for granted today, such as a keyboard and a monitor. The Apple II, complete with those features, was the product they knew would change the computing world, and they didn’t want to just give it away, so it became their next project.

Wozniak and Jobs’s success with the Apple computer shocked IBM, which absolutely dominated the industry at the time. In 1981, IBM answered with a rival product: the IBM Personal Computer, known for short as the PC and based on an Intel 8088 microprocessor.

Steve Jobs co-founder of Apple
Image source: Wikipedia
Steve Wozniak co-founder of Apple
Image source: Wikipedia

Meanwhile, a young Bill Gates had started his company Microsoft, which was to become one of the greatest computing companies in the world. Microsoft remained his primary focus until 2008, when he made the transition to full-time work at the Bill & Melinda Gates Foundation.

Gates’s eagerness and drive to be involved with computers began in high school. This was a special time because computers were extremely expensive to purchase, and Bill, along with his friend Paul Allen, used a university’s computers at night.

They were both fascinated by what the technology could do. They realized that the microprocessor, the computer on a chip that Intel was developing, would make computers much cheaper than the ones they had been using at the university.

This would make computers more powerful and available to more people. The big moment came when Gates set out to write software for other companies, and before long companies were coming to him for software instead.

Bill Gates and the history of computers
Image source: Future of Humanity Institute, University of Oxford.

Microsoft famously won the contract to produce IBM’s operating system and, crucially, Gates’s company licensed the software to IBM rather than selling it outright. This was before the graphical user interface, when there was still just text on the screen. That software, MS-DOS, proved critical.

However, IBM didn’t see the value in the software. It believed the hardware was the key and the software merely a necessity. Microsoft, on the other hand, realized that over time the software would matter far more than the hardware. Had IBM understood that from the beginning, Bill Gates would have been given an entirely different deal.

Now, as new companies appear in technology, we can be confident that the established ones will stick around, exploring ever more powerful ways to advance the computing world.