Monday, Feb. 22, 1993

The New Field of Complexity May Explain Mysteries From the Stock Market to the Emergence of . . . Life, the Universe and Everything

By MICHAEL D. LEMONICK

If the basic rules of chemistry are any guide, life should not exist. Scientists showed in the 1950s that shooting an electric spark through a soup of chemicals -- thus simulating lightning strikes on the primordial planet Earth -- could produce simple organic compounds. But complex, self-reproducing chemicals like DNA? They shouldn't have arisen in a trillion years. At an even deeper level, the second law of thermodynamics dictates that the universe should inexorably move toward disorganization. Cups of tea always cool off; they never spontaneously get hotter. Iron rusts, but rust never turns into iron.

Yet over the eons, a chaotic universe organized itself into stars and galaxies and planets. And at least one planet, our own, is now bursting with life in bewildering varieties, filled with organisms that have arrayed themselves into ecosystems, communities and complex societies. How did this happen? That is the question posed by a brand-new field of science known as complexity.

The central idea is that self-organization is almost inevitable in a wide range of systems, both natural and man-made. The consistent shape of sand dunes marching across a desert, the evolution of complicated body parts such as eyes and kidneys, the equilibrium between supply and demand in a functioning economy and the existence of life itself -- all these may be expressions of this single principle.

The theory is compelling enough to have spawned its own research center, the Santa Fe Institute in New Mexico. And while some scientists dismiss complexity as just a trendy buzz word used to attract grant money, the field has drawn not only young hotshots but also Nobel laureates in physics, including Philip Anderson and Murray Gell-Mann, and Economics laureate Kenneth Arrow.

Complexity has been around for more than a decade, and its roots go back even further, but it is surging in popularity thanks largely to two popular books. They are, confusingly, Complexity, by M. Mitchell Waldrop, and Complexity, by Roger Lewin; both authors formerly wrote for the journal Science. Like James Gleick's wildly successful 1987 book Chaos, each volume attempts to convey to lay readers the basics of the science as well as the excitement it is generating among its practitioners. (Mini-review: Waldrop's book, a straightforward, detailed account, succeeds admirably; Lewin's, a chatty personal memoir, does not.)

Complexity theory and chaos theory share more than the attention of enterprising writers; they are scientific first cousins. The essence of chaos theory is that certain phenomena involve so many factors that they are inherently unpredictable; although a scientist may be able to project the pattern of a swinging pendulum or a flying cannonball, it is impossible to determine how far apart two leaves will be after they go through a waterfall or exactly what the weather will be a month from now. Reason: in systems governed by the mathematics of chaos, small events have big consequences. For instance, even the random firing of just a few neurons, say chaos theorists, can throw a normally beating heart into wildly irregular fibrillation. The best that scientists can do is recognize that the world's chaos follows certain patterns.
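
To see how quickly predictability evaporates, consider a textbook demonstration -- our illustration, not one drawn from the article -- written here as a short Python sketch. Two copies of the "logistic map," a simple formula that behaves chaotically, are started one part in a billion apart; within a few dozen steps they bear no resemblance to each other.

    # A standard illustration of sensitivity to initial conditions (our choice of
    # example; the article doesn't use it): the logistic map at its fully chaotic
    # setting, iterated from two starting points one part in a billion apart.
    R = 4.0
    x, y = 0.400000000, 0.400000001
    for step in range(1, 51):
        x, y = R * x * (1 - x), R * y * (1 - y)
        if step % 10 == 0:
            print(f"step {step:2d}: x={x:.6f}  y={y:.6f}  gap={abs(x - y):.6f}")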

Complexity theory examines the systems that lie in the middle ground between the predictable and the chaotic -- in fact, right on the border between the two states. Says Edward Knapp, president of the Santa Fe Institute: "We think of a complex system as one that is probably never in equilibrium, a system with many interlocking parts that are not easily described by simple arithmetic."

One of the easiest examples to understand is sand dunes, which maintain their overall shape despite winds and sandslides. Researchers at IBM's Thomas Watson Research Center built an artificial dune, a tiny sandpile sitting on a sensitively balanced plate, to study this behavior in detail. In one experiment, they dropped 35,000 grains of sand onto the pile one by one. As the sides grew too steep -- in some cases, by only a single grain of sand -- avalanches would make the pile collapse. Then it would start growing steeper again, until it was time for the next avalanche.

This phenomenon is known as self-organized criticality -- the grains have organized themselves to slope at a certain angle, yet the arrangement is precarious because a tiny extra bit of sand can knock the whole thing down. The sandpile is not quite stable, not quite chaotic.
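
The rule is simple enough to capture in a toy simulation. The Python sketch below is a cartoon of a one-dimensional pile, not the IBM group's apparatus or code; the number of columns, the critical slope and the 35,000-grain count are arbitrary choices made for illustration. Grains are dropped one at a time, and whenever a column grows too steep, sand slides downhill until the pile is again barely stable.

    # A toy one-dimensional sandpile in the spirit of self-organized-criticality
    # models.  An illustration of the idea only; lattice size, critical slope and
    # grain count are arbitrary.
    import random

    SITES = 100          # columns in the pile
    CRITICAL_SLOPE = 2   # a column topples when it exceeds its neighbor by more than this

    heights = [0] * SITES
    avalanche_sizes = []

    def topple(pile):
        """Shift grains downhill until no slope exceeds the threshold.
        Returns the number of topplings, i.e. the avalanche size."""
        size = 0
        unstable = True
        while unstable:
            unstable = False
            for i in range(SITES - 1):
                if pile[i] - pile[i + 1] > CRITICAL_SLOPE:
                    pile[i] -= 1
                    pile[i + 1] += 1
                    size += 1
                    unstable = True
            if pile[-1] > CRITICAL_SLOPE:   # open edge: grains fall off the plate
                pile[-1] -= 1
                size += 1
                unstable = True
        return size

    for grain in range(35_000):             # drop grains one by one, as in the experiment
        heights[random.randrange(SITES)] += 1
        avalanche_sizes.append(topple(heights))

    print("largest avalanche:", max(avalanche_sizes), "topplings")
    print("avalanches bigger than 50 topplings:",
          sum(1 for s in avalanche_sizes if s > 50))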

Complexity theorists believe more sophisticated phenomena follow the same pattern. The stock market can, without outside direction, hum along on an upward course for years and then crash 500 points in a single day. A species can survive for millions of years and then abruptly die out -- or conversely, evolve almost all at once into something entirely new. And self-reproducing organisms can somehow arise, against all odds, from a soup of simple organic chemicals.

It certainly makes intuitive sense that a simple underlying principle should explain such similar behaviors across a wide variety of systems. Indeed, complexity theorists often speak about their science "feeling" or even "tasting" right. Waldrop sometimes jokingly refers to the all-encompassing theory as "the Grand Unified Theory of Holism." The only trouble is, he says, "some people take me seriously."

The field might have remained a kind of New Age plaything for computer nerds were it not for the fact that it has stood up to a variety of tests. Some experiments have been done in the lab, as with tiny sandpiles. More often they take place inside computers. Scientists create mathematical models of real-world systems -- the stock market, an ecosystem, a group of living cells -- and let them evolve on the screen. If the computerized world behaves as the real one does, there is a good chance the underlying mathematics is valid.

By this measure, complexity works, at least roughly. Computer simulations of life, the best-known application of the theory, create onscreen worlds of cyber-creatures that evolve in ways that eerily parallel real life. Biophysicist Stuart Kauffman of the Santa Fe Institute says confidently, "Biological evolution proceeds at the boundary between order and chaos. If there is too much order, the system becomes frozen and cannot change. But if there is too much chaos, the system retains no memory of what went on before."
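
One family of models behind such statements is the random Boolean network, which Kauffman is known for studying: many on-off elements, each wired to a few others and flipping according to simple rules. The Python sketch below is our own minimal version -- the article does not describe his actual simulations, and the network size, wiring and run length here are arbitrary -- but it shows the same contrast: sparsely connected networks freeze, densely connected ones churn chaotically, and the interesting behavior lies in between.

    # A minimal random Boolean network illustrating "frozen" versus "chaotic"
    # behavior.  Not a reconstruction of Kauffman's simulations; all parameters
    # below are arbitrary choices.
    import random

    def run_network(n_nodes=200, k_inputs=2, steps=300, seed=0):
        """Return how many nodes are still flipping during the last 50 steps."""
        rng = random.Random(seed)
        inputs = [[rng.randrange(n_nodes) for _ in range(k_inputs)]
                  for _ in range(n_nodes)]
        rules = [[rng.randrange(2) for _ in range(2 ** k_inputs)]
                 for _ in range(n_nodes)]
        state = [rng.randrange(2) for _ in range(n_nodes)]

        still_changing = set()
        for t in range(steps):
            new_state = []
            for node in range(n_nodes):
                index = 0
                for inp in inputs[node]:          # pack the inputs into a rule-table index
                    index = (index << 1) | state[inp]
                new_state.append(rules[node][index])
            if t >= steps - 50:
                still_changing.update(i for i in range(n_nodes)
                                      if new_state[i] != state[i])
            state = new_state
        return len(still_changing)

    # Sparse wiring tends to freeze; dense wiring tends to stay chaotic.
    for k in (1, 2, 5):
        print(f"K={k}: {run_network(k_inputs=k)} of 200 nodes still changing")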

Another area where computerized worlds seem to mimic the real one is economics. J. Doyne Farmer, a physicist formerly at Los Alamos National Laboratory, has been struck by how the mathematics of complexity seems to explain the workings of the stock market, which, like a biological system, involves constant adaptation to change by individual participants. After playing with computer models, Farmer decided it was time for a reality test of the theory. He and several partners founded Prediction Co., an Albuquerque, New Mexico, investment firm that uses math to try to beat the financial markets. Says Farmer: "If I can be right 55% of the time, that's enough to make plenty of money." In dry runs the company has done even better. Armed with an undisclosed amount of venture capital, Prediction Co. has begun trading for real.
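
Farmer's arithmetic is easy to check on the back of an envelope. In the Python sketch below, the 1% stake per trade and the 1,000-trade run are assumptions made for this example, not Prediction Co.'s figures; the point is simply that being right 55% of the time yields a small but real edge on every trade, and compounding turns that edge into substantial gains.

    # A back-of-the-envelope check on "right 55% of the time."  The stake size and
    # number of trades are assumptions for this example, not Prediction Co.'s figures.
    import random

    WIN_PROB = 0.55   # fraction of calls that go the right way
    STAKE = 0.01      # fraction of capital won or lost on each call
    TRADES = 1_000

    edge = WIN_PROB * STAKE - (1 - WIN_PROB) * STAKE
    print(f"expected gain per trade: {edge:.2%} of capital")   # a thin 0.10% edge

    random.seed(1)
    capital = 1.0
    for _ in range(TRADES):
        capital *= (1 + STAKE) if random.random() < WIN_PROB else (1 - STAKE)
    print(f"capital after {TRADES} simulated trades: {capital:.2f}x")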

Even if Farmer gets rich, there will be skeptics who dismiss the idea that complexity is the scientific revolution its proponents claim. The critics, writes physicist and sometime Santa Fe Institute visitor Daniel Stein in the December issue of Physics Today, can rightly ask, "Why is it necessary to force [these phenomena] under a single umbrella?" Yet there can be no doubt that investigations of complexity and chaos have at least made things more interesting. Comments Rockefeller University physicist Mitchell Feigenbaum: "Now we see things we didn't notice before, and we ask questions we didn't know how to ask. And whenever we can think of new questions, we can do good science."

With reporting by J. Madeleine Nash/Chicago