The First Computers and Their Evolution

Before the modern age, before the Information Age, even before the Industrial Age, human civilization made use of computing devices. These early machines were nothing like the computers of today, and by modern standards they could do very little. But they mattered all the same: they were the first steps toward everything that followed.

But what exactly is a computer? A computer is a tool that helps us make sense of information, and to make sense of information, we need to be able to organize it. Modern machines are called "digital" computers because they represent information as discrete digits; in nearly all of them, those digits are binary digits, or "bits."
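The idea of a bit can be made concrete with a short sketch (Python here, purely for illustration): any whole number can be broken down into binary digits by repeatedly dividing by two.

```python
# Represent a non-negative integer as binary digits ("bits").
# Each bit stands for a power of two: 13 = 8 + 4 + 0 + 1 -> 1101.

def to_bits(n):
    """Return the binary digits of n, most significant bit first."""
    if n == 0:
        return [0]
    bits = []
    while n > 0:
        bits.append(n % 2)   # the remainder modulo 2 is the lowest bit
        n //= 2
    return bits[::-1]        # reverse so the highest bit comes first

print(to_bits(13))  # [1, 1, 0, 1]
```

Eight such bits are enough to count from 0 to 255, which is why early machines grouped them into "bytes."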

These early computers took the form of improvised calculating devices, usually built into a table or a cabinet. They consisted of a series of levers and knobs, with sliding panels or drawers. They had very little memory and were used to perform sequences of mathematical calculations. Limited as they were, they laid the foundation for what we now take for granted in computers.

Here's a look at the early development of computers and how we got to where we are today.

The First Calculating Automata

Automata theory is a branch of mathematics that studies abstract machines that follow symbolic rules. The word comes from the Greek automatos, meaning "self-acting." So, in a nutshell, automata are machines that can perform symbolic operations on their own.

If you take an automaton and give it a rule, it can follow that rule to perform specific actions. For example, an automaton could move from left to right across a chessboard, taking one step at a time, in response to a set of knobs or dials that control its position.
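The chessboard example above can be sketched as a tiny state machine. This is an illustrative toy, not a historical device: the eight squares and the "L"/"R" commands are invented here to show what "following a rule" means.

```python
# A minimal finite automaton: a current state (the square the marker is on),
# an input alphabet ("L" and "R"), and a transition rule. The marker walks
# along an 8-square row and stops at the edges.

def run_automaton(commands, position=0, width=8):
    """Apply each command in turn and return the final square (0-indexed)."""
    for cmd in commands:
        if cmd == "R" and position < width - 1:
            position += 1    # step right unless at the right edge
        elif cmd == "L" and position > 0:
            position -= 1    # step left unless at the left edge
    return position

print(run_automaton("RRRL"))  # starts at square 0, ends at square 2
```

The machine has no understanding of chess or movement; it simply applies the same rule to each input symbol, which is all an automaton ever does.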

The first automata were simple mechanical devices, made of gears and levers. Elaborate mechanical automata were built in the medieval Islamic world, most famously by al-Jazari, and later in Renaissance Europe. The first true mechanical calculators followed in seventeenth-century Europe, with Wilhelm Schickard's calculating clock and Blaise Pascal's Pascaline.

The First Digital Computers

The first program-controlled digital computers were built in the late 1930s and early 1940s by Konrad Zuse, an engineer working in Germany.

Zuse's machines used electromechanical relays to carry signals representing numbers, logic gates to perform calculations, and memory to store data. The first fully functioning Zuse machine was the Z3, completed in 1941. Fully electronic computers, which replaced relays with much faster vacuum tubes, followed in the mid-1940s with Britain's Colossus and America's ENIAC.
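The role of logic gates in calculation can be shown in a few lines. Two gates, an XOR and an AND, are enough to add a pair of bits; this "half adder" is the basic building block of binary arithmetic, whether the gates are built from relays, tubes, or transistors. (Python here purely as illustration.)

```python
# A half adder: adds two bits using an XOR gate (sum) and an AND gate (carry).

def half_adder(a, b):
    s = a ^ b        # XOR: 1 when exactly one input is 1 -> the sum bit
    carry = a & b    # AND: 1 only when both inputs are 1 -> the carry bit
    return s, carry

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "->", half_adder(a, b))
# e.g. 1 + 1 -> (0, 1): sum bit 0, carry bit 1, i.e. binary 10 = decimal 2
```

Chaining such adders, so the carry from one column feeds the next, gives a circuit that can add numbers of any width.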

The First Programmable Computer

In the 1940s and 1950s, scientists and engineers in the United States, Britain, and Germany began to program computers for specific tasks.

These computers were custom-built for each installation. The first stored-program computers, machines that could be set to a new task simply by loading new instructions into memory, were Britain's Manchester "Baby," which ran in 1948, and Cambridge's EDSAC, which followed in 1949.

The first commercially available general-purpose computer was the Ferranti Mark 1, delivered in Britain in 1951. The first programmable computer to be mass-produced was the IBM 650, introduced in 1954; IBM eventually built nearly 2,000 of them.

The First Generation of Computers

The first generation of electronic computers was characterized by vacuum tubes: thousands of them, wired into panels with dozens or hundreds of components each, built into cabinets that filled entire rooms.

The size of these machines was limited mainly by the space, power, and cooling available for them. The largest, ENIAC, completed in 1945, contained roughly 18,000 vacuum tubes and occupied about 1,800 square feet.

The Second Generation of Computers

The second generation of computers began with the advent of the transistor, invented at Bell Labs in 1947 and adopted in computers from the mid-1950s. Transistors took over every role vacuum tubes had played, as amplifiers, as switches, and as oscillators, while being smaller, cooler, and far more reliable.

Early transistorized computers appeared in the mid-1950s, with MIT's TX-0 (1956) among the first; the IBM 7090, introduced in 1959, became one of the most widely used. These machines were still large and heavy, containing tens of thousands of transistors housed in floor-standing cabinets.

The architecture of these computers was similar to that of the earlier vacuum-tube and relay machines: stored-program designs whose calculations were performed by logic gates combined on two or more levels.

The Third Generation of Computers

The third generation of computers saw the advent of integrated circuits, electronic components that combine many individual transistors onto a single chip. The integrated circuit was invented independently by Jack Kilby and Robert Noyce in 1958 and 1959, and one of the first computers built around it was the Apollo Guidance Computer of the mid-1960s.

Integrated-circuit technology greatly reduced the size, weight, and cost of computers, making it feasible for the first time to put computers into spacecraft, aircraft, and eventually offices and homes.

A typical integrated-circuit computer of this period consisted of hundreds of interconnected chips, each containing anywhere from a few dozen to a few thousand transistors.

Programmable Logic Devices

Later hardware brought programmable logic devices, or PLDs. PLDs are integrated circuits whose logic is not fixed at the factory: the functions they perform are configured after manufacture, from a design usually written in a hardware description language.

For example, an FPGA (field-programmable gate array) is a chip that can be configured to implement a wide variety of digital circuits described in a hardware description language such as Verilog or VHDL.

FPGAs are great for creating custom logic without fabricating a new chip. The first field-programmable logic device was the Signetics 82S100 programmable logic array, introduced in 1975, and the first FPGA was the Xilinx XC2064, introduced in 1985.
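The core trick behind an FPGA can be illustrated with a lookup table (LUT): instead of wiring fixed gates, the chip stores the truth table of the desired function in a small memory indexed by the inputs, and "programming" the device just means loading a different table. A rough sketch of that idea (plain Python, not real FPGA tooling):

```python
# Simulate a 2-input lookup table, the basic logic cell of an FPGA.
# The truth table has 4 entries, one per input combination (00, 01, 10, 11).

def make_lut(truth_table):
    """Return a 2-input logic function backed by a 4-entry truth table."""
    def lut(a, b):
        return truth_table[(a << 1) | b]   # the inputs select a table entry
    return lut

xor_gate = make_lut([0, 1, 1, 0])    # table for XOR
nand_gate = make_lut([1, 1, 1, 0])   # same "hardware," different table: NAND

print(xor_gate(1, 1), nand_gate(1, 1))  # 0 0
```

A real FPGA contains thousands of such cells plus programmable wiring between them, which is why one chip can stand in for almost any digital circuit.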