Bunch-Hellemans, The first working computers

Defense needs in World War II were the driving force behind the development of the first large electronic computer built in the United States. The Electronic Numerical Integrator and Computer (ENIAC), although completed at the end of 1945, after World War II had ended, was initially designed to calculate trajectories of projectiles. Developed by the University of Pennsylvania’s Moore School of Electrical Engineering for the Ballistic Research Laboratory in Aberdeen, Maryland, it was a general-purpose decimal machine containing 18,000 vacuum tubes. Its design was related to the Differential Analyzer built by Vannevar Bush, except that Bush’s mechanical components, such as counters and adders, were replaced by electronic ones.

Logic was integrated into the machine’s hardware, and by changing the setup of the hardware, ENIAC could be adjusted to perform different tasks. Such “programming” consisted of plugging cables into plug boards and setting hundreds of switches.

Many of the vacuum tube circuits in ENIAC were derived from those used in subatomic particle counters. The computer had only a very small programmable memory, consisting of flip-flop circuits that could store 20 words, along with several switch-set memory banks called function tables, whose settings could be changed for specific calculations.

Because ENIAC used vacuum tubes instead of relays, it was about a thousand times faster in performing calculations than contemporary electromechanical machines. For example, ENIAC could calculate the trajectory of a shell in 20 seconds, faster than the 30 seconds or so that a real shell takes to reach its target.

Even while ENIAC was being completed at the Moore School, its designers, John Mauchly, J. Presper Eckert, and Herman Goldstine, were aware of its limitations and began working with mathematician John von Neumann on an entirely new computer design. The basic concept, now known as von Neumann architecture, separates logic functions entirely from hardware: most instructions for carrying out calculations are not permanently wired in (hardwired, or held in read-only memory, ROM) but are stored in a rewritable memory called random-access memory (RAM). Instructions can be placed anywhere in memory and even modified by the computer itself when needed. Besides a random-access read/write memory, a von Neumann computer contains a central processor and uses binary numbers and Boolean algebra (a form of symbolic logic) for processing and storing data.

Unlike ENIAC, which in some ways was a parallel processor (working on different aspects of a task at the same time, or in parallel), computers with von Neumann architecture process data serially; that is, one instruction comes after another and is executed only when the preceding one is completed. Von Neumann architecture was implemented in virtually all subsequent computers for the next quarter century. Such vacuum tube computers based on von Neumann architecture are now known as the first generation of computers. The second generation was born with the incorporation of the transistor in computer circuitry.
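The two ideas above, a single memory holding both program and data, and a processor executing one instruction at a time, can be sketched in a few lines of code. The instruction set (LOAD/ADD/STORE/HALT) is invented here purely for illustration; it is not ENIAC's or EDVAC's actual repertoire.

```python
# A minimal sketch of a stored-program (von Neumann) machine: instructions
# and data share one random-access memory, and a fetch-execute loop runs
# strictly serially -- each instruction completes before the next is fetched.

def run(memory):
    acc = 0      # accumulator register in the central processor
    pc = 0       # program counter: address of the next instruction
    while True:
        op, arg = memory[pc]    # fetch one instruction from RAM
        pc += 1
        if op == "LOAD":        # copy a memory word into the accumulator
            acc = memory[arg]
        elif op == "ADD":       # add a memory word to the accumulator
            acc += memory[arg]
        elif op == "STORE":     # write the accumulator back to RAM --
            memory[arg] = acc   # a program could even overwrite its own code
        elif op == "HALT":
            return acc

# Program and data occupy the same memory: addresses 0-3 hold the code,
# addresses 4-5 hold the data.
ram = [("LOAD", 4), ("ADD", 5), ("STORE", 5), ("HALT", 0), 2, 3]
print(run(ram))   # 2 + 3 -> 5
```

Because instructions live in ordinary read/write memory, reprogramming means loading new words rather than recabling plugboards, which is precisely the advance over ENIAC described above.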

In 1945 von Neumann circulated his ideas and those of his colleagues working on ENIAC in the First Draft of a Report on the EDVAC. These ideas were to have been implemented first on a planned computer called EDVAC (Electronic Discrete Variable Automatic Computer). Because of patent disputes, however, its completion was delayed, and other machines became the first to use von Neumann architecture.

The EDVAC design called for a memory able to store at least 1,024 words of 32 bits each (a “word” here being a group of 32 binary digits, each digit, 0 or 1, known as a bit). Building such a memory was at the time a considerable technological hurdle.
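A quick back-of-the-envelope calculation shows how modest this memory was by modern standards; the figures below follow directly from the 1,024-word, 32-bit specification just quoted.

```python
# Capacity of the memory the EDVAC design called for.
words = 1024
bits_per_word = 32

total_bits = words * bits_per_word
print(total_bits)         # 32768 bits
print(total_bits // 8)    # 4096 bytes -- 4 KiB in modern terms
```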

One type of available memory derived from the mercury delay line, a device developed for early radar to measure the time between the emission of a signal and the reception of its reflection. The delay line consisted of a metal tube filled with mercury, with a transducer mounted at each end: one acted as a tiny speaker emitting sound pulses, the other as a microphone. A train of sound pulses could be stored indefinitely in such a tube by amplifying the signals picked up at one end and feeding them back in at the other, so that they circulated like riders on a merry-go-round. Such a delay line could store up to a thousand bits as pulses and served as the memory of several of the early first-generation machines.
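The recirculation scheme just described can be modeled as a simple loop: the oldest pulse is read at the microphone end, re-amplified, and re-injected at the speaker end. The sketch below abstracts away all the acoustics and timing; the class name and interface are invented for illustration.

```python
from collections import deque

class DelayLine:
    """Toy model of a mercury delay-line memory: bits circulate as pulses."""

    def __init__(self, bits):
        self.line = deque(bits)    # pulses currently travelling in the mercury

    def tick(self):
        """One pulse arrives at the pickup end and is fed back in."""
        bit = self.line.popleft()  # read at the microphone end
        self.line.append(bit)      # amplify and re-inject at the speaker end
        return bit

    def read_word(self, n):
        """Read n bits; they recirculate, so the contents are preserved."""
        return [self.tick() for _ in range(n)]

line = DelayLine([1, 0, 1, 1, 0, 0, 1, 0])
print(line.read_word(8))   # [1, 0, 1, 1, 0, 0, 1, 0]
print(line.read_word(8))   # the same word again -- reading is non-destructive
```

The model also makes the delay line's main drawback visible: storage is sequential, so a given bit is only available once per circulation, unlike true random-access memory.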

Another type of memory was derived from television technology. Information was stored on the inside face of a special type of cathode-ray tube as tiny, electrically charged spots written by an electron beam, which was then also used to read them back.

Both of these types of memory were unwieldy and not entirely reliable. They were soon replaced by ferrite-core memories, which not only equipped the later first-generation computers but also served the second-generation computers of the 1950s and 1960s. Ferrite-core memories record information in magnetic domains, a technology that, in different forms such as magnetic disk storage, remains widespread today. Ferrite-core ROM was also used in first-generation computers, but it was replaced by semiconductor technology in the second generation.
