###### 25 November 2005

# Mathematics In Crisis - Where To Now?

##### By Rusty Rockets

It seems that mathematics, pure mathematics at least, may be losing its status as a way of predicting and modeling the physical Universe with certainty. Some predict that mathematical proofs will become so difficult to formally verify, whether by computers or by humans, that certainty - if you could still call it that - and understanding may come down to mere social consensus. It's interesting that this shift is taking place at a time when devices to detect the physical world as it *really* is, not as it *probably* is, are being developed. As you read this article, humongous telescopes and intricately designed quantum detectors are revealing the macro and micro worlds before our very eyes!

This apparent mathematical dilemma is not as dire as it may first seem, and pure mathematics will of course maintain its role as the language of the sciences. It is also a tad disingenuous to suggest that mathematics and scientific observation can be exclusive of one another, because mathematics either explains complex physical systems or acts as a god-like finger, pointing scientists toward as yet unseen natural phenomena. This is why chicken-and-egg arguments about mathematics and experimental science are redundant. One might be tempted to argue that observing physical phenomena always comes first, but that would ignore the fact that much of what scientists seek in the Universe, or in the subatomic realm, was initially predicted by mathematical models. This reasoning reflects an age-old dispute between empiricists and rationalists. René Descartes, for example, believed that making predictions about the world was possible through intuition alone, and drew a sharp distinction between ideas derived from the material world and those that were innate.

Debates also rage as to the nature of mathematics itself. Brian Davies, Professor of Mathematics at King's College London, recently stated: "Whole books have been devoted to the discussion of the relationship between ontology and epistemology in mathematics, but it is fair to say that agreement about its solution is not imminent." Davies also draws attention to the philosophical positions that exist within mathematics and to those who advocate them: "As representative of many others we cite Roger Penrose as a committed realist (i.e., Platonist) and Paul Cohen as an anti-realist. Einstein was clear that mathematics was a product of human thought and that, as far as the propositions of mathematics are certain, they do not refer to reality." Davies adds that, philosophical disputes aside, mathematicians more or less get along just fine, despite minor skirmishes between the field's various camps. He remarks that: "constructivists adopt a strict, algorithmic notion of existence that is more acceptable to applied mathematicians, numerical analysts, and logicians than it is to most pure mathematicians."

That said, much of what we know about relatively recent developments concerning the Universe and quantum behavior (admittedly, not much to speak of) is based on mathematical models. In fact, scientists recently verified a prediction regarding dark energy, and in the process vindicated one of Einstein's equations that many - himself included - considered plain wrong. Einstein added what is known as the "cosmological constant" to his theory of general relativity to allow for a static Universe, since without it gravity would cause such a Universe to contract. However, Edwin Hubble later showed that the Universe is not static at all, but expanding. Furthermore, because matter is unevenly distributed, Einstein's constant would not have produced a stable equilibrium in any case. In light of this evidence, Einstein called the constant his "greatest blunder", but it remained of interest to scientists, who later noticed through redshift observations that the Universe's expansion is actually accelerating. While there are competing explanations for this acceleration, many believe that Einstein's cosmological constant might be the explanation.
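For the mathematically curious, the constant enters as an extra term in Einstein's field equations. This is the standard modern form, not taken from the article itself:

```latex
% Einstein's field equations with the cosmological constant \Lambda
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}
```

For a static, uniformly matter-filled Universe the equations balance only when \(\Lambda = 4\pi G \rho / c^2\), a delicate equilibrium that Hubble's expanding Universe, and the instability noted above, ruled out.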

The point of this brief scientific history lesson is the way in which Einstein's constant was recently confirmed: empirical observation. According to the SuperNova Legacy Survey (SNLS), the mysterious dark energy that drives the accelerating expansion of the Universe behaves just like Einstein's famed cosmological constant. A team of international researchers in France and Canada, collaborating with large-telescope astronomers at Oxford, Caltech and Berkeley, revealed that dark energy behaves like Einstein's cosmological constant to a precision of 10 percent. The team considers this confirmation very significant: "Our observation is at odds with a number of theoretical ideas about the nature of dark energy that predict that it should change as the Universe expands, and as far as we can see, it doesn't," said Professor Ray Carlberg of the Department of Astronomy and Astrophysics at the University of Toronto (U of T). If you were competitive by nature, as some scientists often are, you might say that's one in the eye for rationalists. But again, would scientists have made this discovery had they not been tipped off about what to look for?
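The "precision of 10 percent" refers to the dark energy equation-of-state parameter, conventionally written \(w\), which relates pressure to energy density. This parameterization is standard in cosmology, though the article doesn't spell it out:

```latex
w = \frac{p}{\rho c^2}, \qquad w_{\Lambda} = -1 \ \text{(cosmological constant)}
```

Models in which dark energy evolves as the Universe expands predict \(w \neq -1\), or a \(w\) that drifts with redshift; the SNLS data found \(w\) consistent with \(-1\) to within roughly 10 percent.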

The next generation of astronomical cameras and telescopes will have huge implications for how scientists understand the Universe. Researchers working on the SNLS project know that measuring the distance to faraway supernovae is a key tool for cosmologists. Supernovae are exploding stars, and the type used in these surveys is known to reach a similar peak brightness wherever it occurs in other galaxies. Observing these exploding stars thus makes it possible to measure their distances: they serve as "standard candles" for gauging long distances in the Universe. The researchers who verified Einstein's constant, for example, used a 340-million pixel camera called MegaCam. "Because of its wide field of view - you can fit four moons in an image - it allows us to measure simultaneously, and very precisely, several supernovae, which are rare events," said Pierre Astier, one of the scientists with the Centre National de la Recherche Scientifique (CNRS) in France.
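The standard-candle trick boils down to a single textbook formula, the distance modulus: if you know how bright a supernova really is (absolute magnitude) and you measure how bright it appears (apparent magnitude), the difference gives its distance. Here is a minimal sketch; the magnitude values are illustrative assumptions, and real surveys like SNLS apply further corrections (redshift, dust, light-curve shape) that this ignores:

```python
import math

def luminosity_distance_pc(apparent_mag, absolute_mag):
    """Distance in parsecs from the distance modulus m - M = 5*log10(d / 10 pc)."""
    return 10 ** ((apparent_mag - absolute_mag) / 5 + 1)

# Sanity check: when m == M, the object is by definition 10 parsecs away.
print(luminosity_distance_pc(5.0, 5.0))  # 10.0

# Illustrative numbers: a standard-candle supernova with peak absolute
# magnitude around -19.3, observed at apparent magnitude 24, lies at
# several billion parsecs - deep into the distant Universe.
d_pc = luminosity_distance_pc(24.0, -19.3)
print(f"{d_pc / 1e9:.1f} Gpc")
```

The logarithm hides the physics: each 5-magnitude difference corresponds to a factor of 100 in brightness, and brightness falls off as the square of distance.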

We should begin to get used to the idea of empiricism reclaiming its (rightful?) position as king of science, as development of giant telescopes and detectors that can measure quantum states without causing decoherence races ahead. Right now, Caltech, the University of California (UC), the Association of Universities for Research in Astronomy (AURA), and the Association of Canadian Universities for Research in Astronomy (ACURA) are working on a project called the Thirty Meter Telescope (TMT). OK, so the size of the telescope is fairly self-evident (approximately 98.5 feet), but the specs and capability of such a scope are truly mind-boggling. The TMT team says the goal of the project is to construct an extremely large telescope from more than 700 hexagonal mirror segments, forming a primary mirror 98.5 feet in diameter. The many-mirror approach is a technical necessity, as traditional single-mirror telescopes are limited to about 26 feet in diameter. These giant telescopes also need adaptive optics systems that compensate for the distortion of incoming light by the Earth's atmosphere, and science instruments containing dozens of mirrors, detectors, and complex filters. The TMT will take 10 years to build.

While optics are advancing in a positive fashion, Davies argues that mathematics has been assailed by problems that may never be resolved, possibly demonstrating the limitations of the human mind. To be fair, Davies makes this argument with the understanding that: "Pure mathematics will remain more reliable than most other forms of knowledge, but its claim to a unique status will no longer be sustainable." Notwithstanding this, the 20th century witnessed at least three crises that shook the foundations on which the certainty of mathematics seemed to rest. The first was the work of Kurt Gödel, who proved in the 1930s that any sufficiently rich axiom system is guaranteed to possess statements that cannot be proved or disproved within the system. The second crisis concerned the Four-Color Theorem, whose statement is so simple a child could grasp it, but whose proof necessitated lengthy and intensive computer calculations; a conceptual proof that a human could follow without such computing power has never been found, and many other theorems of this type are now known, with more being discovered every year. The third, in Davies's account, is the classification of the finite simple groups, whose proof is spread across thousands of journal pages written by many authors - too much for any one mathematician to verify in full. Davies argues that these inconsistencies in mathematics' claim to certainty have been conveniently forgotten, and must at least be acknowledged, let alone resolved.

We are witnessing a profound and irreversible change in mathematics, Davies argues, which will decisively affect its character. Mathematics will be seen as "the creation of finite human beings, liable to error in the same way as all other activities in which we indulge. Just as in engineering, mathematicians will have to declare their degree of confidence that certain results are reliable, rather than being able to declare flatly that the proofs are correct." Furthermore, Davies makes a stark prediction based on the historical accumulation of mathematical knowledge:

"By 2075 many fields of pure mathematics will depend upon theorems that no mathematician could fully understand, whether individually or collectively... Formal verifications of complex proofs will be commonplace, but there will also be many results whose acceptance will owe as much to social consensus as to rigorous proof."

Of course, empiricism has its own flaws. Rationalists make a good case, in some respects, in arguing that sense experience does not always equate to perfect knowledge - far from it, in fact. You need only stare at a simple optical illusion for confirmation that humans are not equipped to "see" all things as they are. There is a huge corpus of philosophical enquiry questioning the veracity of the media through which we observe and sense our Universe. What we see through a microscope, for example, is not any more verifiably "real" than intuitive ideas not derived from the material world. There is certainly some validity to arguments like this.

Rationalism and empiricism are not mutually exclusive, and to truly understand the world, we need to refer to both. John Barrow, Professor of Astronomy at the University of Sussex and author of *Impossibility*, writes that: "In any Universe in which order of any sort exists, and hence in any life supporting Universe, there must be pattern, and so there must be mathematics." This explanation is not putting the cart before the horse, but shows how we as humans "read" nature through the language of mathematics. Barrow adds that this situation: "reveals why all discussions of the Universe and its contents lead so quickly and inevitably to mathematics: no science exists without it."

Further reading:

TMT - http://www.tmt.org/

Brian Davies: *Whither Mathematics?* - http://www.ams.org/staff/jackson/comm-davies.pdf