The American physicist Mitchell Jay Feigenbaum (born 1944) laid the foundations for the study of complicated, apparently disordered events in nature by recognizing the patterns that emerge when simple mathematical equations are applied repeatedly.
Mitchell Jay Feigenbaum was born in Philadelphia, Pennsylvania, on December 19, 1944. His father was a chemist working for the government and subsequently for industry, while his mother taught in the public schools. Feigenbaum proceeded from Samuel J. Tilden High School to the City College of New York, where he received a bachelor's degree in electrical engineering in 1964. Although that field had been his first love, he found his tastes moving in the direction of physics and went to the Massachusetts Institute of Technology for his graduate work. He earned his doctorate in elementary particle physics there in 1970 and took a position at Cornell the same year.
Little distinguished Feigenbaum's early career at Cornell and, later, at Virginia Polytechnic Institute. Although he felt a strong attachment to the study of hard problems (phenomena governed by equations more complicated than traditional linear equations), he had not been able to publish much on the subject. It was not clear how to approach the problems in which he was interested, since the classical methods of physics were not applicable to nonlinear equations.
Almost simultaneously with his move to Los Alamos National Laboratory in New Mexico, however, Feigenbaum hit upon a method of approaching nonlinear phenomena. The computers in use around him performed complicated tasks as sequences of simple steps. The question Feigenbaum asked himself was how a computer would handle the same computation if it were repeated a large number of times. Without being able to predict what would happen, he felt that the results might illuminate the behavior of nonlinear systems.
What he discovered was that if two numbers very close together were plugged into the same formula, it did not take many repetitions of the formula before the resulting values were far removed from one another. This kind of behavior had been observed experimentally in nonlinear phenomena, but Feigenbaum's results came closer to a theoretical model than anyone had managed before. As yet, however, there was little by way of explanation of why nonlinear equations should behave this way.
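The divergence of nearby starting values can be reproduced in a few lines. The article does not name a specific formula, so the sketch below assumes the logistic map x → rx(1 − x), the standard textbook example of such a simple nonlinear equation; the starting values and the size of the initial gap are likewise illustrative choices.

```python
# Iterate the logistic map x -> r*x*(1-x) from two nearly identical
# starting values and watch the gap between them grow.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.3, 0.3 + 1e-10   # two inputs differing by one part in 10^10
max_gap = 0.0
for step in range(60):
    x, y = logistic(x), logistic(y)
    max_gap = max(max_gap, abs(x - y))

print(max_gap)  # the gap reaches order 1 within a few dozen repetitions
```

At r = 4.0 the map is in its chaotic regime, so the tiny initial difference roughly doubles on each step until it is as large as the values themselves.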
In 1975 Feigenbaum heard a talk by the mathematician Stephen Smale, who had already contributed to both pure and applied mathematics. Putting Smale's theoretical work together with his own observations led Feigenbaum to an intense period of work in the spring of 1976, during which he studied with care the behavior of large numbers of values when treated with repeated applications of the same simple but nonlinear equations. The pictures that emerged from studying the behavior of the values convinced Feigenbaum that they reflected how nonlinear systems behaved.
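The kind of experiment described above can be sketched briefly. Again assuming the logistic map x → rx(1 − x) as the "simple but nonlinear equation" (the map most closely associated with Feigenbaum's study), the code below discards an initial transient and then records the values the orbit keeps visiting, for several settings of the parameter r; the particular r values and iteration counts are illustrative choices.

```python
# For a given parameter r, iterate the logistic map x -> r*x*(1-x),
# discard the transient, and return the distinct long-run values visited.
def long_run_values(r, x0=0.5, settle=1000, keep=8):
    x = x0
    for _ in range(settle):          # let the orbit settle down
        x = r * x * (1.0 - x)
    seen = []
    for _ in range(keep):            # record the values it now cycles through
        x = r * x * (1.0 - x)
        seen.append(round(x, 4))
    return sorted(set(seen))

print(long_run_values(2.5))  # one value: the orbit settles to a fixed point
print(long_run_values(3.2))  # two values: a period-2 cycle
print(long_run_values(3.5))  # four values: a period-4 cycle
```

As r increases, a single steady value splits into a cycle of two, then four, and so on. It was the geometric rate at which these splittings accumulate that Feigenbaum measured in his 1976 computations.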
Although he had always been interested in numbers and calculations, the work that Feigenbaum was doing and the discoveries that he was making were not readily accepted as mathematics by the mathematical community. At the same time, because he seemed to be studying computations themselves and not their physical significance, it was not always clear to physicists that he was doing physics. A new branch of science emerged, situated somewhere on the border of mathematics and physics with a heavy dose of computer science. The name attached to the new domain was "chaos theory," referring to the apparently disordered behavior of nearby points. The moral of the new subject, however, was that the chaos was only apparent and gave rise to patterns of regularity when studied more generally.
The pictures that Feigenbaum generated turned out to have the feature of looking the same at different scales. Mathematical curves with this property had been given the name "fractals" and had received some attention earlier in the 20th century. As a result, Feigenbaum was able to use some of the work done earlier to describe the patterns that he had discovered. At the same time his work gave a great impetus to the mathematical study of fractals, and mathematicians followed in Feigenbaum's footsteps. Feigenbaum's reluctance to spend time looking for proofs of his results left mathematicians with plenty to do, and Oscar Lanford III supplied some of the fundamental proofs for chaos theory in 1979.
Feigenbaum returned to Cornell University in 1982. He retained his ties to Los Alamos into the 1990s, but also took a position as professor of mathematics and physics at Rockefeller University in 1986. His accomplishments led to his spending time as a visiting researcher at the Institute for Advanced Study in Princeton and at the Institut des Hautes Études Scientifiques (IHES) in France. He and Benoit Mandelbrot of IBM share much of the credit for the study and popularization of chaos theory and fractals, although there is disagreement about exactly how the credit should be assigned. Further recognition came in the form of awards: a MacArthur Foundation fellowship in 1984 and the Wolf Foundation Prize in Physics in 1986.
Nonlinear phenomena (that is, events whose behavior seems to be governed by nonlinear equations) occur throughout nature. Among the best known applications is that of weather prediction, a proverbially inexact science. Chaos theory has not been able to bring about improvements in the ability to predict the weather, but it supplies a theoretical basis for the difficulties. Insofar as physics is about the quest for understanding, Feigenbaum's work is in the grand tradition of physics and has a universality that cuts across disciplines. His observations are at the heart of theoretical limitations to the predictive power of science.
Further Reading on Mitchell Jay Feigenbaum
Much of Feigenbaum's work remains either unpublished or in the form of journal articles, but there is an excellent chapter on Feigenbaum in James Gleick's book Chaos (1988). It discusses both Feigenbaum's contributions to chaos theory and the way his personality is interwoven with his approach to science.