Monday, Jun. 11, 1990
Dashed Hopes and Bogus Fears
By EUGENE LINDEN
At a time when data moved on horses and ships, Samuel Morse inaugurated the information age in 1844 by translating messages into electric signals and telegraphing them at nearly the speed of light. With 20/20 hindsight, it is tempting to view today's networked and digitized world as the inevitable culmination of Morse's breakthrough technology. That would be a mistake, however. Technological change has been marked by fits, starts and left turns, and the clues to the future have often been hidden in the clutter of the present.
This is the message of a new $10 million permanent exhibit at the Smithsonian Institution's National Museum of American History in Washington. Titled "Information Age: People, Information & Technology," the show brings together 700 objects and artifacts, ranging from Morse's telegraph to an early Apple computer. Through re-created scenes and videos, the exhibition tries to capture the mood of each period during the information age, which has repeatedly confounded both the hopes and fears of society. "Our goal was to display technology as a human enterprise," says curator David Allison, "subject to all the foibles and failures of people."
In the early 1870s, Alexander Graham Bell and Elisha Gray independently discovered the possibility of voice transmission, but Gray ignored the telephone's potential in the erroneous belief that telegraphy would remain the dominant means of communication. In the early 1950s, technicians used the newly invented transistor simply as a substitute for bulky vacuum tubes. Only later did designers realize that the transistor could revolutionize electrical engineering by providing a tiny, universal electronic component that could be organized into integrated circuits and programmed to perform millions of different tasks.
Predictions of the social impact of mass communications and computers were equally myopic. Bernard Finn, of the Smithsonian's division of electricity, notes that the sending of the first transatlantic cable message in 1858 was widely hailed as an event that would usher in an era of world peace because it would enhance communication between different peoples. Three years later, the U.S. Civil War broke out, and the opposing armies took over telegraph offices, establishing a coupling between information technology and warfare that continues to the present.
If the information age has not lived up to early hopes, neither has it justified later fears. Instead of causing mass unemployment, automation permitted the expansion of the economy and created new jobs. Abuse of the electronic media did not create the thought-controlled world George Orwell predicted in his novel 1984. Decades of attempts to control information in the Soviet Union backfired in the worst possible way: the government could not convince people they lived in a workers' paradise, but it could dampen the flow of knowledge enough to stifle the innovation necessary for a robust economy.
Though sweeping, the actual transformations wrought by computers and mass communications have been more subtle than predicted. Between 1860 and 1980, the proportion of the U.S. economy derived from information processing and communications rose from 7% to more than 50%, creating a demand for a new type of worker. Jobs built around computers and communications equipment demand neither physical strength nor aggressiveness, and this has helped transform the role of women in industrial societies. These changes go on today at the edges of the information age. In New Guinea, for example, rural men skilled in warfare and hunting are by turns mystified and mortified when they have to deal with women as equals, if not superiors, in modern banks and offices.
In America computers and the media continue to reshape life. Still, the very ubiquity of information technologies has also exposed their limitations. Businesses and policymakers, awash in data and images, have discovered that information is not useful without expertise. With the most sophisticated intelligence-gathering tools at its disposal, the CIA could not accurately portray the disarray of the Soviet economy or predict the collapse of communism. Instead of making people redundant, the high-tech economy has only underscored the irreplaceable contributions of human knowledge and common sense.
In recent years computer companies have begun to sell software systems that emulate specific human expertise, raising fears that experts will be the next group to be replaced by computers. But there is little cause for concern. Unfounded claims that true artificial intelligence is just around the corner date to the 19th century. The more likely result is that these new systems will extend, rather than replace, human knowledge, freeing experts to work on novel problems.
People will continue to err when predicting the future, if only because of the human tendency to fit new events into familiar categories. In a celebrated experiment in the late 1940s, psychologists Jerome Bruner and Leo Postman showed that ordinary people would "see" a red ace of spades as a regular black one if it was salted into an otherwise normal deck. The Smithsonian exhibit demonstrates that inventors are fooled in the same way.
Allison and his fellow curators have wisely refrained from predicting the future, focusing instead on the discoveries that have brought humanity to its present juncture. Perhaps, though, one of the many schoolchildren visiting the exhibit will look with fresh eyes at its displays and have the flash of intuition that holds the key to the next technological revolution.