Monday, Jun. 25, 1990
Those Computers Are Dummies
By MICHAEL D. LEMONICK
Roger Penrose is hardly the sort of man who would normally excite much popular interest, let alone controversy. The shy, somewhat rumpled and unfailingly polite Oxford professor, 58, has spent most of his career spinning theories in the most abstruse areas of mathematics and physics. His contributions to both disciplines have earned him a sterling reputation among his colleagues. But his pursuits have been so far removed from the everyday world that few people outside his fields of expertise were even aware of his existence.
Until last fall, that is, when Penrose brought out the book The Emperor's New Mind, an in-depth discussion of the relationship between artificial intelligence, consciousness and the laws of physics. Despite its complexity and intellectual rigor, the book quickly jumped onto the best-seller list. And just like his friend and sometime collaborator Stephen Hawking, the once obscure Penrose suddenly found himself showered with publicity. The professor was so unaware of how much the book was earning that he asked his editor whether there was enough to cover a few thousand pounds for a new car.
In the ensuing months, he has also found himself under attack. Reason: Penrose's central conclusion is that computers will never think because the laws of nature do not allow it. That angers many artificial-intelligence researchers. M.I.T.'s Marvin Minsky, one of the field's pioneers, is downright hostile. Says he: "Penrose is O.K. when he talks about mathematics, but most of his evidence argues against his conclusions. As far as I can tell, he is just plain wrong." Stanford psychologist and AI researcher David Rumelhart is somewhat milder: "He defines intelligence too narrowly by saying it depends on consciousness."
Nonetheless, Penrose's reasoning is powerful, and he delves extensively into such heavy topics as fractal geometry, number theory, quantum physics, entropy and cosmology to give readers the necessary background to understand his ideas. "I have to admit," he says, "that I had been looking for an excuse to write about many aspects of physics and mathematics anyway, and this gave me one."
Penrose's first major point is that the human mind can reach insights that are forever inaccessible to computers. The reason is that all digital computers operate according to algorithms, or sets of rules that prescribe how to solve problems. Yet there are truths that cannot be reached by any system of rules, a fact shown in the 1930s by the mathematician Kurt Gödel. Gödel's theorem establishes that in any consistent mathematical system rich enough to describe ordinary arithmetic, there must be certain propositions that mathematicians can see to be true but that can never be proved within the rules of the system.
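For readers who want the fine print, Gödel's result has a standard modern formulation. The statement below is a textbook paraphrase, not a passage from Penrose's book:

```latex
% Godel's first incompleteness theorem in a standard modern
% formulation -- a textbook paraphrase, not Penrose's own wording.
\textbf{Theorem (G\"odel, 1931).} Let $T$ be a consistent, effectively
axiomatizable formal theory that interprets elementary arithmetic.
Then there is a sentence $G_T$ such that
\[
  T \nvdash G_T \qquad \text{and} \qquad T \nvdash \lnot G_T ,
\]
even though $G_T$ is true in the standard model of arithmetic.
```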
Mathematician Alan Turing made a related discovery in the 1930s when he used his Turing machine -- an imaginary, simple computer -- to prove that there are well-defined mathematical problems, the famous halting problem among them, that cannot be solved even in principle by any digital computer. Says Penrose: "The very fact that the mind leads us to truths that are not computable convinces me that a computer can never duplicate the mind" -- this despite the fact that the human brain is often described as a particularly complex computer.
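Turing's argument can be compressed into a few lines of modern code. The sketch below is an illustration added here, not anything from Penrose's book; the function halts is purely hypothetical, and the point is that assuming it exists leads straight to a contradiction:

```python
# A sketch of Turing's halting-problem argument (an illustration added
# here, not an example from Penrose's book). Suppose, hypothetically,
# that some function halts(program, data) could always predict whether
# running program(data) eventually stops.

def halts(program, data):
    """Hypothetical oracle: True if program(data) would eventually stop."""
    raise NotImplementedError("Turing proved no such function can exist")

def trouble(program):
    """Do the opposite of whatever the oracle predicts about us."""
    if halts(program, program):
        while True:      # oracle says we halt, so loop forever
            pass
    else:
        return           # oracle says we loop, so halt at once

# Now ask the oracle about trouble(trouble). If it answers True,
# trouble loops forever; if it answers False, trouble halts. Either
# answer is wrong, so no such oracle -- and no such program -- can exist.
```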
As Penrose freely admits, his other main theory is far more speculative. It holds that consciousness and insight, which he says are beyond the capabilities of computers, are governed by as yet undiscovered laws of physics. His conviction that computers are necessarily unconscious and without insight rests largely on his own experience in solving abstract puzzles. And it is true that these mental processes are not explained by existing laws of physics. The answers will come, says Penrose, with the merger of Einstein's general theory of relativity, which describes gravity, and quantum theory, which governs the sub-microscopic world. The two theories are mathematically incompatible, and physicists are hard at work trying to create a quantum version of gravity.
One consequence could be to establish the boundaries of quantum mechanics, which says particles can suddenly jump from one place to another without traversing the space in between. Penrose's intuition, although he has no proof, is that these effects may apply not just to atoms but also to objects as big as brain cells. An act of creative thinking, he argues, could be the outward manifestation of neurons making quantum jumps from one energy state to another. Since computers do not operate by quantum rules, he says, they will never have insights.
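The jumps Penrose has in mind are discrete: a quantum system moves between fixed energy levels with nothing in between, and the size of the gap fixes the frequency of light emitted or absorbed, by Planck's relation E = hf. The short calculation below, an illustration added here rather than anything from the book, works this out for a familiar atomic jump:

```python
# Quantum jumps are discrete: the energy gap between two levels fixes
# the photon frequency via Planck's relation E = h * f. (An
# illustration added here, not an example from Penrose's book.)

H_PLANCK = 6.626e-34      # Planck's constant, in joule-seconds
EV_TO_JOULES = 1.602e-19  # one electron volt, in joules

def jump_frequency_hz(e_upper_ev, e_lower_ev):
    """Photon frequency (Hz) for a jump between two energy levels (eV)."""
    energy_gap = (e_upper_ev - e_lower_ev) * EV_TO_JOULES
    return energy_gap / H_PLANCK

# Hydrogen's n=3 to n=2 jump, the red line in its spectrum:
print(f"{jump_frequency_hz(-1.51, -3.40):.2e} Hz")  # about 4.6e14 Hz
```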
Moreover, he believes, quantum gravity could be behind consciousness itself. The argument is tricky, but one reason for this belief is that consciousness carries with it a peculiarity that baffles physics: humans perceive time as moving forward rather than backward. But virtually all the laws of physics are time symmetric -- they work equally well in forward or in reverse -- and the mind presumably operates by physical laws.
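A worked example, added here for concreteness, shows what time symmetry means in practice. Run time backward in Newton's second law and the equation does not notice, because reversing t flips the sign of velocity but leaves acceleration alone:

```latex
% Time symmetry of Newton's second law (an illustration added here,
% not an equation from Penrose's book). Substituting t -> -t leaves
% the equation of motion unchanged.
Let $\tau = -t$. Then
\[
  \frac{dx}{d\tau} = -\frac{dx}{dt},
  \qquad
  \frac{d^{2}x}{d\tau^{2}} = \frac{d^{2}x}{dt^{2}},
\]
so the equation of motion $m \ddot{x} = F(x)$ reads the same in either
direction: a filmed trajectory played in reverse is also a valid
solution.
```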
Penrose's answer is that when quantum gravity is finally constructed, it will prove to be time asymmetric -- that is, it will not work in reverse. Why? Because the Big Bang that started the universe must, at its earliest moments, have been governed by quantum gravity. And the Big Bang was surely a time-asymmetric phenomenon that could not happen in reverse. If quantum gravity winds up being the theory governing the mind, that will also explain why time moves forward, not backward.
In short, Penrose believes human creativity and consciousness are nothing less than the perceptible workings of the most basic laws of the universe. It is a bolder position than other physicists are prepared to take, but Penrose likes to be different. Says he: "Worrying about things that no one else worries about is where insights come from."