Monday, Feb. 20, 1978
Science: The Numbers Game
From a roomful of knitting ladies to a superchilled "brain"
For the young electronics engineer at the newly formed Intel Corp., it was a challenging assignment. Fresh out of Stanford University, where he had been a research associate, M.E. ("Ted") Hoff in 1969 was placed in charge of producing a set of miniature components for programmable desktop calculators that a Japanese firm planned to market. After studying the circuitry proposed by the Japanese designers, the shy, self-effacing Hoff knew that he had a problem. As he recalls: "The calculators required a large number of chips, all of them quite expensive, and it looked, quite frankly, as if it would tax all our design capability."
Pondering the difficulty, Hoff was suddenly struck by a novel idea. Why not place most of the calculator's arithmetic and logic circuitry on one chip of silicon, leaving mainly input-output and programming units on separate chips? It was a daring conceptual move. After wrestling with the design, Hoff and his associates at Intel finally concentrated nearly all the elements of a central processing unit (CPU), the computer's electronic heart and soul, on a single silicon chip.
Unveiled in 1971, the one-chip CPU -- or microprocessor -- contained 2,250 transistors in an area barely a sixth of an inch long and an eighth of an inch wide. In computational power, the microprocessor almost matched the monstrous ENIAC -- the first fully electronic computer, completed in 1946 -- and performed as well as an early 1960s IBM machine that cost $30,000 and required a CPU that alone was the size of a large desk. On his office wall, Hoff still displays Intel's original advertisement: "Announcing a new era of integrated electronics ... a microprogrammable computer on a chip."
Intel's little chip had repercussions far beyond the pocket-calculator and minicomputer field. It was so small and cheap that it could be easily incorporated into almost any device that might benefit from some "thinking" power: electric typewriters with a memory, cameras, elevator controls, a shopkeeper's scales, vending machines, and a huge variety of household appliances. The new chip also represented another kind of breakthrough: because its program was on a different chip, the microprocessor could be "taught" to do any number of chores. All that had to be done was to substitute a tiny program chip with fresh instructions. In a memorable display of this versatility, the Pro-Log Corp. of Monterey, Calif., built what was basically a digital clock. But by switching memory chips and hitching it to a loudspeaker, it became first a "phonograph," playing the theme from The Sting, then an electric piano.
The Intel chip and one developed at about the same time at Texas Instruments--the question of priority is still widely debated in the industry--were the natural culmination of a revolution in electronics that began in 1948 with Bell Telephone Laboratories' announcement of the transistor. Small, extremely reliable, and capable of operating with only a fraction of the electricity needed by the vacuum tube, the "solid-state" device proved ideal for making not only inexpensive portable radios and tape recorders but computers as well. Indeed, without the transistor, the computer might never have advanced much beyond the bulky and fickle ENIAC, which was burdened with thousands of large vacuum tubes that consumed great amounts of power, generated tremendous quantities of heat, and frequently burned out. In an industry striving for miniaturization, the transistors, too, soon began to shrink. By 1960, engineers had devised photolithographic and other processes (see box) that enabled them to crowd many transistors as well as other electronic components onto a tiny silicon square.
The advent of such integrated circuits (ICs) drastically reduced the size, cost and electrical drain of any equipment in which they were used. One immediate byproduct: a new generation of small, desk-size minicomputers as well as larger, high-speed machines. Their speed resided in the rate at which electric current races through wire: about one foot per billionth of a second, close to the velocity of light. Even so, an electrical pulse required a significant fraction of a second to move through the miles of wiring in the early, large computers. Now even circuitous routes through IC chips could be measured in inches--and traversed by signals in an electronic blink. Computers with ICs not only were faster but were in a sense much smarter. Crammed with more memory and logic circuitry, they could take on far more difficult workloads.
Like the tracks in a railroad yard, ICs were really complex switching systems, shuttling electrical pulses hither and yon at the computer's bidding. Still, ICs could not function by themselves; other electronic parts had to keep the switches opening and closing in proper order. Then came the next quantum leap in miniaturization: the development in the late 1960s of large-scale integration (LSI). Unlike their single-circuit predecessors, which were designed to do only one specific job, LSIs integrated a number of circuits with separate functions on individual chips. These in turn were soldered together on circuit boards. Out of such modules, entire computers could be assembled like Erector sets.
But the new LSIs had an innate drawback. Because they were made in rigid patterns and served only particular purposes --or were, as engineers say, "hard-wired"--they lacked flexibility. That limitation was ingeniously solved by the work of Hoff and others on microprogramming--storing control instructions on a memory-like chip. For the first time, computer designers could produce circuitry usable for any number of purposes. In theory, the same basic chip could do everything from guiding a missile to switching on a roast.
Such computational prowess seems dazzlingly unreal, and reinforces the popular image of computers as electronic brains with infinite intelligence. Yet most scientists regard computers, including those on chips, as dumb brutes. "They do only what they are told," insists Louis Robinson, director of scientific computing at IBM's data-processing division, "and not an iota more." What all computers, large and small, do extremely well is "number crunching"; they can perform prodigious feats of arithmetic, handling millions of numbers a second. Equally important, they can store, compare and arrange data at blinding speed. That combination lets the computer handle a broad range of problems--from designing a complex new telescopic lens to sending TV images across the solar system.
Humans have been calculating since the dawn of history --and before. Stone age man, making scratches on animal bones, tried to keep track of the phases of the moon. Other prehistoric people reckoned with pebbles. Indeed, the Latin word calculus means a stone used for counting. Perhaps the most enduring calculating device is the abacus, which was used in China as early as the 6th century B.C. But the first really serious efforts to make mechanical calculators, in which some of the tallying was done automatically, did not come until the 17th century.
By then numbers had become especially important because of great advances in astronomy, navigation and other scientific disciplines. More than ever before, it was necessary to rely on long tables of such elementary mathematical functions as logarithms, sines and cosines. Yet compiling these essential tools often required years of slavish toil.
Still, mathematical illiteracy continued to plague Europe.
In the early 19th century, Charles Babbage, an idiosyncratic mathematician and inventor of the railroad cowcatcher and the first tachometer, was becoming increasingly incensed by the errors he found in insurance records, logarithm tables and other data. His fetish for accuracy was so great, in fact, that after reading Lord Tennyson's noted line "Every moment dies a man/ Every moment one is born," he wrote the poet: "It must be manifest that if this were true, the population of the world would be at a standstill." Babbage's recommended change: "Every moment dies a man/ Every moment 1 1/16 is born."
In 1822, Babbage began work on a machine, called the difference engine, that could help solve polynomial equations to six places. The Chancellor of the Exchequer was so impressed by the machine's potential for compiling accurate navigational and artillery tables that he subsidized construction of a still larger difference engine that could compute to 20 places. Unfortunately, the metalworkers of Babbage's day were not up to making the precision parts required, and the machine was never completed. But Babbage had a bolder dream: he wanted to build a machine, which he dubbed the analytical engine, that could perform any arithmetical and logical operations asked of it. In effect, it would have been programmable--that is, a true computer instead of a mere calculator.
To "instruct" the machine, Babbage borrowed an idea that had just revolutionized the weaving industry. Using a string of cards with strategically placed holes in them, like those in a piano roll, the Frenchman Joseph Marie Jacquard automatically controlled which threads of the warp would be passed over or under with each pass of the shuttle. Babbage planned to use the same technique to program his machine; instead of the positions of threads, the holes in his cards would represent the mathematical commands to the machine. Wrote Babbage's mathematically knowledgeable friend Lady Lovelace, daughter of the poet Lord Byron: "We may say most aptly that the Analytical Engine weaves algebraical patterns just as the Jacquard-loom weaves flowers and leaves."
Babbage's loom, alas, never wove anything. By the time the eccentric genius died in 1871, he had managed to put together just a few small parts; only his elaborate drawings provide a clue to his visionary machine. Indeed, when Harvard and IBM scientists rediscovered Babbage's work in the 1940s while they were building a pioneering electromechanical digital computer called Mark I, they were astonished by his foresight. Said the team leader, Howard Aiken: "If Babbage had lived 75 years later, I would have been out of a job."
The Harvard machine occupied a large room and sounded, in the words of Physicist-Author Jeremy Bernstein, "like a roomful of ladies knitting." The noise came from the rapid opening and closing of thousands of little switches, and it represented an enormous information flow and extremely long calculations for the time. In less than five seconds, Mark I could multiply two 23-digit numbers, a record that lasted until ENIAC's debut two years later. But how? In part, the answer lies in a beguilingly simple form of arithmetic: the binary system. Instead of the ten digits (0 through 9) of the familiar decimal system, the computer uses just the binary's two symbols (1 and 0). And with enough 1s and 0s any quantity can be represented.
In the decimal system, each digit of a number read from right to left is understood to be multiplied by a progressively higher power of 10. Thus the number 4,932 consists of 2 multiplied by 1, plus 3 multiplied by 10, plus 9 multiplied by 10 x 10, plus 4 multiplied by 10 x 10 x 10. In the binary system, each digit of a number, again read from right to left, is multiplied by a progressively higher power of 2. Thus the binary number 11010 equals 0 times 1, plus 1 times 2, plus 0 times 2 x 2, plus 1 times 2 x 2 x 2, plus 1 times 2 x 2 x 2 x 2 --for a total of 26 (see chart).
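For readers who care to see that place-value rule spelled out in modern terms, here is a minimal sketch in Python; the function name is ours, chosen only for illustration:

# A sketch of the place-value arithmetic described above: each digit,
# read from right to left, is weighted by a rising power of the base.
def place_value_sum(digits, base):
    total = 0
    for power, digit in enumerate(reversed(digits)):
        total += int(digit) * base ** power
    return total

print(place_value_sum("4932", 10))   # 2*1 + 3*10 + 9*100 + 4*1000 = 4932
print(place_value_sum("11010", 2))   # 0*1 + 1*2 + 0*4 + 1*8 + 1*16 = 26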
Working with long strings of 1s and 0s would be cumbersome for humans--but it is a snap for a digital computer. Composed mostly of parts that are essentially on-off switches, the machines are perfectly suited for binary computation. When a switch is open, it corresponds to the binary digit 0; when it is closed, it stands for the digit 1. Indeed, the first modern digital computer completed by Bell Labs scientists in 1939 employed electromechanical switches called relays, which opened and closed like an old-fashioned Morse telegraph key. Vacuum tubes and transistors can also be used as switching devices and can be turned off and on at a much faster pace.
But how does the computer make sense out of the binary numbers represented by its open and closed switches? At the heart of the answer is the work of two other gifted Englishmen.
One of them was the 19th century mathematician George Boole, who devised a system of algebra, or mathematical logic, that can reliably determine if a statement is true or false. The other was Alan Turing, who pointed out in the 1930s that, with Boolean algebra, only three logical functions are needed to process these "trues" and "falses"--or, in computer terms, 1s and 0s.
The functions are called AND, OR and NOT, and their operation can readily be duplicated by simple electronic circuitry containing only a few transistors, resistors and capacitors. In computer parlance, they are called logic gates (because they pass on information only according to the rules built into them). Incredible as it may seem, such gates can, in the proper combinations, perform all the computer's high-speed prestidigitations.
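The behavior of those three gates can be set down in the same modern shorthand; the little truth-table loop below simply restates the rules named above and is not a drawing of any real circuit:

# The three logic gates described above, acting on binary digits.
def AND(a, b):
    return a & b      # 1 only when both inputs are 1

def OR(a, b):
    return a | b      # 1 when either input is 1

def NOT(a):
    return 1 - a      # turns 1 into 0 and 0 into 1

# Print the full truth table for the two-input gates.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b), "NOT a:", NOT(a))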
The simplest and most common combination of the gates is the half-adder, which is designed to add two 1s, a 1 and a 0, or two 0s (see diagram). If other half-adders are linked to the circuit, producing a series of what computer designers call full adders, the additions can be carried over to other columns for tallying up ever higher numbers. Indeed, by using only addition, the computer can perform the three other arithmetic functions. Multiplication is often accomplished by repeated additions, division by repeated subtractions. Subtraction, on the other hand, can be done by an old trick known in the decimal system as "casting out nines"--taking the nines complement of the number to be subtracted and then adding 1 to the result. The operation is even easier with binary numbers; the complement is obtained by changing all 1s to 0s and all 0s to 1s.
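How a half-adder grows into a full adder can also be sketched with the gates above; the arrangement below is the standard textbook construction, offered only as an illustration:

# A half-adder and full adder built only from AND, OR and NOT
# (XOR is composed from those three gates here).
def NOT(a): return 1 - a
def AND(a, b): return a & b
def OR(a, b): return a | b

def XOR(a, b):
    # (a OR b) AND NOT (a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    # Add two bits: returns (sum, carry).
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    # Add two bits plus the carry from the column to the right.
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

print(half_adder(1, 1))      # (0, 1): binary 10, that is, 2
print(full_adder(1, 1, 1))   # (1, 1): binary 11, that is, 3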
Though it worked on the decimal rather than the binary system, Babbage's analytical engine was also a digital computer. Numbers were represented by the turns of gears and cogs and the positions of levers. Had Babbage ever succeeded in building his engine, it might have been as big as a football field, would have been powered by steam, and would have sounded as noisy as a boiler factory. Yet the same principles underlying the clangorous computations it would have made can be found in today's silent electronic wizards, all of which contain five basic sections:
Input. This section translates information from a variety of devices into a code that the computer understands. In Babbage's scheme, the manual turning of counters or use of punched cards provided the input.
Today such cards, as well as punched tape, are still used. But they have been supplemented by other methods, including magnetic tapes, discs and drums; the precisely tuned beep-beeps of the Touch-Tone telephone (whose lower left and right buttons have been reserved for computer communications and other information processing); the familiar keyboard-and-TV unit; optical scanners that can "read" characters at high speeds; electronic ears that can recognize a limited number of spoken words.
In every case, the object is the same: to translate information --letters, numbers, images, sounds, marks or simply magnetized ink on a check--into patterns of electrical pulses that are comprehensible to the computer.
Memory. Babbage dubbed this unit the store, and it does just that; it stores information until it is needed by other parts of the machine. For nearly two decades the most popular memory in modern computers has been the magnetic core variety. It consists of thousands of tiny iron rings, each one encircling an intersection of two wires in a rectangular grid made up of thousands of wires. Depending on the direction of current in the two wires that pass through its hole, each doughnut is magnetized in either a clockwise or counterclockwise direction. This represents either a 1 or a 0--a "bit" (for binary digit) of information. Because each core has a specific location in the precisely designed grid, it can be "addressed" almost instantly: information can be read from any doughnut by means of a third wire passing through each core. These fragile and expensive core memories are now being replaced by semiconductor memories on chips. In addition to such "random access" memories, as they are called, computers have auxiliary memories in the form of magnetic tape or discs. These have the advantage of large capacity and low cost, and are used to store information in bulk.
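The row-and-column addressing that makes the core grid so quick to reach can be mimicked in a few lines; the toy class below is purely illustrative and ignores the physics of magnetizing and sensing the cores:

# A toy picture of core-style addressing: each bit sits at the crossing
# of one row wire and one column wire, so any bit can be reached
# directly by giving its two coordinates.
class ToyCoreMemory:
    def __init__(self, rows, cols):
        self.grid = [[0] * cols for _ in range(rows)]   # every core starts at 0

    def write(self, row, col, bit):
        self.grid[row][col] = bit    # current on two wires flips one core

    def read(self, row, col):
        return self.grid[row][col]   # sensed on a third wire in the real device

memory = ToyCoreMemory(4, 4)
memory.write(2, 3, 1)
print(memory.read(2, 3))             # 1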
Arithmetic and Logic. To handle, direct and process the flood of information, the computer relies on this unit, which Babbage dubbed the mill. It is here that the computer does its number crunching and data manipulation.
Control. This is the computer's traffic cop. It gets instructions stored in the memory section and interprets them; it regulates the memory and arithmetic-logic sections and the flow of information between them, and orders processed data to move from the memory to the output section.
Output. Processed data are translated by this section into electrical impulses that can control an almost endless variety of devices. Thus the output may take the form of words or numbers "read out" on high-speed printers or glowing cathode-ray tubes. It can also emerge as an artificial voice, commands to an airplane's steering mechanism or even directions to another computer.
While Babbage's engine also included the concept of programmed instructions, today's machines are significantly different as a result of a refinement proposed in the 1940s by the Hungarian-born mathematical genius John von Neumann. After seeing ENIAC, he suggested "writing" both the data to be handled by the computer and the instructions for doing the job in the same memory and using the same code. It was a key innovation in computer theory, for it meant that the machine could cope with instructions just as if they were data. As Texas Instruments' William C. Holton explains, "A program can therefore alter another program or even itself."
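Von Neumann's point--that instructions and data live side by side in one memory, so a running program can rewrite its own instructions--can be illustrated with a toy stored-program machine; the miniature instruction set below is invented solely for this sketch:

# Instructions and data share a single memory. While the program runs,
# it overwrites one of its own instruction slots just as it would
# overwrite a piece of data.
memory = [
    ("ADD", 5, 6, 7),   # 0: mem[7] = mem[5] + mem[6]
    ("POKE", 7, 2),     # 1: replace the instruction stored at address 2
    ("HALT",),          # 2: rewritten to ("PRINT", 7) before it is reached
    ("HALT",),          # 3: stop
    None,               # 4: unused padding
    2, 3, 0,            # 5, 6, 7: plain data
]

pc = 0                   # program counter
while True:
    op = memory[pc]
    if op[0] == "ADD":
        _, a, b, dest = op
        memory[dest] = memory[a] + memory[b]
    elif op[0] == "POKE":
        _, src, dest = op
        memory[dest] = ("PRINT", src)    # the program alters itself
    elif op[0] == "PRINT":
        print(memory[op[1]])             # prints 5
    elif op[0] == "HALT":
        break
    pc += 1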
Indeed, the act of creating computer programs--or "software"--has become a major preoccupation of computer scientists. At the University of Pennsylvania, ENIAC's designers, J. Presper Eckert Jr. and John W. Mauchly, had to alter the wiring manually, almost as if they were telephone operators rearranging the plugs on a switchboard, when they wanted to give the machine new instructions. Now, as the result of the development of an increasingly sophisticated hierarchy of computer languages such as FORTRAN, COBOL, BASIC and APL, computer users are able to give the machine instructions that are more and more like spoken English.
Extraordinary as today's computers are, they will probably seem like dumbbells compared with those on the horizon. Computers may be improved, for example, by a charge-coupled device (CCD) developed by Bell Labs; it stores packets of electrical charge in movable chains, like the clothing on the automatic racks in dry-cleaning establishments. As the charges pass by a station, they can be "read." Experimental CCDs that store more than 65,000 bits per chip have been built. To cram still more information onto a chip, engineers are experimenting with a tool even more precise than photolithography: beams of electrons, which can be aimed and controlled by computer as they trace out the miniature circuitry. Another Bell innovation is the magnetic-bubble memory, in which microscopic pockets of magnetism ("bubbles") are created in a semiconducting material. Prodded by an external electric or magnetic field, the bubbles move along orderly pathways; as they pass fixed stations, the presence or absence of bubbles is read as coded information. A 250,000-bit experimental bubble memory has already been produced. In the future the contents of an entire encyclopedia, film library or record collection may be stored on a chip.
Looking in another direction, scientists at IBM and elsewhere are seeking to improve not only the computer's memory but its logical functions as well. One approach stems from predictions made in 1962 by a young British graduate student named Brian Josephson, who shared a Nobel Prize for the work. His ideas involve a physical phenomenon called electron tunneling. At temperatures close to absolute zero (-459.69° F.), he theorized, an electrical current--or flow of electrons--can tunnel through barriers that would ordinarily restrain them.
Scientists quickly realized that Josephson's theory could form the basis for wondrous superconducting switching devices. Depending on the presence or absence of a small magnetic field, electrons would cross from one side of the barrier to the other, as in a transistor, but with a significant difference: the amount of current in a Josephson junction would be infinitesimally small. That would keep down the amount of heat generated, and thus the circuitry could be even more tightly packed. By the late 1980s, IBM scientists envision tiny computers, refrigerated inside tanks of liquid helium, that operate a hundred times as fast as today's machines.
Where will it all end? Circuits in some densely packed chips are already so close that there is sometimes electron leakage between conductors--interfering with the proper working of the chip. Is technology fast reaching the limit of miniaturization? Computer scientists think not. They point to the stupendous amounts of data contained, for example, in a DNA molecule--or in one-celled animals and plants that are visible only under a microscope. Says M.I.T.'s Michael Dertouzos:
"Even the amoeba is a far smaller and far more powerful information processor than today's best chips." If nature can do it, scientists feel challenged to try it too. sb