**John von Neumann** (December 28, 1903 – February 8, 1957) was an Austro-Hungarian-born American mathematician who made major contributions to a vast range of fields, including set theory, functional analysis, quantum mechanics, ergodic theory, continuous geometry, economics and game theory, computer science, numerical analysis, hydrodynamics (of explosions), and statistics, as well as many other mathematical fields. He is generally regarded as one of the foremost mathematicians of the 20th century. The mathematician Jean Dieudonné called von Neumann "the last of the great mathematicians." Even in Budapest, in the time that produced geniuses like Szilárd (1898), Wigner (1902), and Teller (1908), his brilliance stood out.

Most notably, von Neumann was a pioneer of the application of operator theory to quantum mechanics, a principal member of the Manhattan Project and the Institute for Advanced Study in Princeton (as one of the few originally appointed), and a key figure in the development of game theory and the concepts of cellular automata and the universal constructor. Along with Edward Teller and Stanislaw Ulam, von Neumann worked out key steps in the nuclear physics involved in thermonuclear reactions and the hydrogen bomb.

Von Neumann's hydrogen bomb work also played out in the realm of computing, where he and Stanislaw Ulam developed simulations on von Neumann's digital computers for the hydrodynamic computations. During this time he contributed to the development of the Monte Carlo method, which allowed complicated problems to be approximated using random numbers. Because using lists of "truly" random numbers was extremely slow on the ENIAC, von Neumann devised a way of generating pseudorandom numbers: the middle-square method. Though the method has been criticized as crude, von Neumann was aware of its shortcomings: he justified it as being faster than any other method at his disposal, and noted that when it went awry it did so obviously, unlike methods that could be subtly incorrect.
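The middle-square method is simple enough to sketch in a few lines. This is a minimal illustration; the digit width and seed below are illustrative choices, not parameters from von Neumann's ENIAC runs:

```python
def middle_square(seed, digits=4):
    """Von Neumann's middle-square pseudorandom generator.

    Square the current value, zero-pad the square to 2*digits digits,
    and take the middle `digits` digits as the next value.
    """
    value = seed
    while True:
        squared = str(value * value).zfill(2 * digits)
        start = (len(squared) - digits) // 2
        value = int(squared[start:start + digits])
        yield value

seq = middle_square(1234)
first = next(seq)   # 1234^2 = 01522756 -> middle four digits: 5227
```

Many seeds collapse quickly to zero or fall into short cycles, which is exactly the obvious failure mode von Neumann pointed to when defending the method.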

While consulting for the Moore School of Electrical Engineering on the EDVAC project, von Neumann wrote an incomplete set of notes titled the *First Draft of a Report on the EDVAC*. The paper, which was widely distributed, described a computer architecture in which the data and the program are both stored in the computer's memory in the same address space, whereas the earliest computers had been 'programmed' by altering their electronic circuitry. This architecture became the de facto standard until technology enabled more advanced designs. Although the single-memory, stored-program architecture became commonly known as the von Neumann architecture as a result of von Neumann's paper, its description drew on the work of J. Presper Eckert and John William Mauchly, inventors of the ENIAC at the University of Pennsylvania.
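The defining trait of the stored-program design — instructions and data living in one shared memory — can be shown with a toy fetch-execute loop. The instruction set here is invented for this sketch and does not come from the EDVAC report:

```python
def run(memory):
    """Fetch-execute loop over a single memory holding both code and data."""
    acc, pc = 0, 0
    while True:
        op, addr = memory[pc]        # fetch the instruction at pc
        pc += 1
        if op == "LOAD":
            acc = memory[addr]       # data is read from the same memory
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc       # a program could even overwrite itself
        elif op == "HALT":
            return memory

program = [
    ("LOAD", 4),   # acc <- memory[4]
    ("ADD", 5),    # acc <- acc + memory[5]
    ("STORE", 6),  # memory[6] <- acc
    ("HALT", 0),
    2, 3, 0,       # data cells share the address space with the code above
]
run(program)       # afterwards, memory[6] holds 2 + 3 = 5
```

Because code and data share one address space, the same memory hardware serves both, which is the economy the *First Draft* architecture exploited.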

Von Neumann also created the field of cellular automata without the aid of computers, constructing the first self-replicating automata with pencil and graph paper. The concept of a universal constructor was fleshed out in his posthumous work *Theory of Self-Reproducing Automata*. Von Neumann argued that self-replicating machines, taking advantage of their exponential growth, would be the most effective way of performing truly large-scale operations, such as mining an entire moon or asteroid belt.
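Von Neumann's self-replicating automaton used 29 cell states and is far too intricate to reproduce here. As a minimal sketch of the cellular-automaton idea itself, here is a one-dimensional automaton (Wolfram's rule 90, not von Neumann's rule set), where each cell's next state depends only on its local neighborhood:

```python
def step(cells, rule=90):
    """Advance a 1-D elementary cellular automaton one generation.

    `cells` is a tuple of 0/1 states with a fixed zero boundary; the
    3-cell neighborhood (left, center, right) indexes a bit of `rule`.
    """
    padded = (0,) + cells + (0,)
    return tuple(
        (rule >> (padded[i - 1] * 4 + padded[i] * 2 + padded[i + 1])) & 1
        for i in range(1, len(padded) - 1)
    )

row = (0, 0, 0, 1, 0, 0, 0)
row = step(row)   # a single live cell spreads: (0, 0, 1, 0, 1, 0, 0)
```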

He is credited with at least one contribution to the study of algorithms. Donald Knuth cites von Neumann as the inventor, in 1945, of the merge sort algorithm, in which the first and second halves of an array are each sorted recursively and then merged together. His algorithm for simulating a fair coin with a biased coin is used in the "software whitening" stage of some hardware random number generators.
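Both ideas are compact enough to sketch. The function names below are ours; the merge sort follows the description above (sort each half recursively, then merge), and the fair-coin trick tosses the biased coin in pairs, keeping only mismatched pairs:

```python
def merge_sort(a):
    """Sort each half recursively, then merge the two sorted halves."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]


def fair_bit(biased_bit):
    """Von Neumann's fair coin from a biased coin.

    Toss twice; return the first toss if the pair is HT or TH, and
    discard HH/TT. Both accepted outcomes occur with probability
    p*(1-p), so the output is unbiased whatever the bias p is.
    """
    while True:
        a, b = biased_bit(), biased_bit()
        if a != b:
            return a
```

The discard step is why hardware random number generators can use this as a whitening stage: it removes bias at the cost of throwing away some raw bits.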

He also explored problems in numerical hydrodynamics. With R. D. Richtmyer he developed an algorithm defining *artificial viscosity* that improved the understanding of shock waves. Without that work we might not understand much of astrophysics, and might not have highly developed jet and rocket engines. The problem was that when computers solved hydrodynamic or aerodynamic problems numerically, regions of sharp discontinuity (shock waves) demanded far more computational grid points than any machine could provide. *Artificial viscosity* was a mathematical trick that slightly smoothed the shock transition without sacrificing basic physics.
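A hedged sketch of the idea, assuming the commonly cited quadratic form of the von Neumann–Richtmyer term: an extra pressure-like term, quadratic in the velocity jump across a cell, is switched on only where the flow is compressing. The coefficient and discretization below are illustrative, not the exact 1950 scheme:

```python
import numpy as np

def artificial_viscosity(rho, u, c=2.0):
    """Quadratic artificial-viscosity term for a 1-D staggered grid.

    rho: density per cell (length n); u: velocity at cell edges
    (length n+1). The term acts only in compressing cells (du < 0),
    smearing the shock over a few cells instead of one grid point.
    """
    du = np.diff(u)                    # velocity jump across each cell
    q = c * rho * du ** 2              # quadratic in the jump
    return np.where(du < 0, q, 0.0)    # zero outside compression
```

In a full scheme this `q` would simply be added to the pressure in the momentum and energy updates, which is what smooths the transition while leaving smooth regions of the flow untouched.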
