John von Neumann: Biography for Kids

In Princeton, he received complaints for playing extremely loud German march music on his phonograph. Von Neumann did some of his best work in noisy, chaotic environments, including with his wife's phonograph playing loudly. Per Churchill Eisenhart, von Neumann could attend parties until the early hours of the morning and then deliver a lucid lecture the next morning. His daughter wrote in her memoirs that he was very concerned with his legacy in two aspects: her life and the durability of his intellectual contributions to the world.

He also maintained his knowledge of the languages he had learnt in his youth. His Spanish was less polished, but on a trip to Mexico he once tried to make himself understood with a "neo-Castilian" mix of English and Spanish of his own invention. He had an encyclopedic knowledge of ancient history, and he enjoyed reading Ancient Greek historians such as Thucydides and Herodotus in the original Greek.

Von Neumann's closest friend in the United States was the mathematician Stanisław Ulam, who suspected that these historical readings may have shaped von Neumann's views on how future events could play out and how human nature and society worked in general. Von Neumann believed that much of his mathematical thought occurred intuitively; he would often go to sleep with a problem unsolved and know the answer upon waking up.

He also lectured on the theory of relativity, set theory, integral equations and analysis of infinitely many variables.

[Figure: flow chart from von Neumann's "Planning and coding of problems for an electronic computing instrument".]

[Figure: the first implementation of von Neumann's self-reproducing universal constructor. Three generations of machine are shown: the second has nearly finished constructing the third. The lines running to the right are the tapes of genetic instructions, which are copied along with the bodies of the machines.]

[Figure: a simple configuration in von Neumann's cellular automaton.]

He told Nachman Aronszajn and K. T. Smith that in the early 1930s he had proved the existence of proper invariant subspaces for completely continuous operators in a Hilbert space while working on the invariant subspace problem.

With I. J. Schoenberg he wrote several papers investigating translation-invariant Hilbertian metrics on the real number line, which resulted in their complete classification. Their motivation lay in various questions related to embedding metric spaces into Hilbert spaces. With Pascual Jordan he wrote a short paper giving the first derivation of an inner product from a given norm by means of the parallelogram identity.
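
The parallelogram identity characterizes the norms that arise from an inner product; in the real case, the inner product is then recovered from the norm by the polarization identity:

$$\|x+y\|^2 + \|x-y\|^2 = 2\|x\|^2 + 2\|y\|^2, \qquad \langle x, y \rangle = \tfrac{1}{4}\left(\|x+y\|^2 - \|x-y\|^2\right).$$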

Later, with Robert Schatten, he initiated the study of nuclear operators on Hilbert spaces and tensor products of Banach spaces, and he introduced and studied trace class operators, their ideals, and their duality with compact operators and preduality with bounded operators. While his original ideas for rings of operators existed already around 1930, he did not begin studying them in depth until he met F. J. Murray several years later.

The six major papers in which he developed that theory between 1936 and 1940 "rank among the masterpieces of analysis in the twentieth century"; they collect many foundational results and started several programs in operator algebra theory that mathematicians worked on for decades afterwards. An example is the classification of factors.

In the late 1930s, von Neumann worked on lattice theory, the theory of partially ordered sets in which every two elements have a greatest lower bound and a least upper bound. As Garrett Birkhoff wrote, "John von Neumann's brilliant mind blazed over lattice theory like a meteor". Many previously geometric results could then be interpreted in the case of general modules over rings.
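
A standard example of such a structure: the positive integers ordered by divisibility form a lattice, in which the greatest lower bound and least upper bound of two elements are

$$x \wedge y = \gcd(x, y), \qquad x \vee y = \operatorname{lcm}(x, y).$$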

His work laid the foundations for some of the modern work in projective geometry. His biggest contribution was founding the field of continuous geometry. In mathematics, continuous geometry is a substitute for complex projective geometry, where instead of the dimension of a subspace lying in the discrete set 0, 1, ..., n, it can be an element of the unit interval [0, 1]. Earlier, Menger and Birkhoff had axiomatized complex projective geometry in terms of the properties of its lattice of linear subspaces.

Von Neumann, following his work on rings of operators, weakened those axioms to describe a broader class of lattices, the continuous geometries. He was motivated by his discovery of von Neumann algebras with a dimension function taking a continuous range of dimensions, and the first example of a continuous geometry other than projective space was the projections of the hyperfinite type II₁ factor.

Dimension is determined, up to a positive linear transformation, by the following two properties: it is conserved by perspective mappings ("perspectivities") and ordered by inclusion. The deepest part of the proof concerns the equivalence of perspectivity with "projectivity by decomposition"—of which a corollary is the transitivity of perspectivity.
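
Schematically, writing $d$ for the dimension function and $\sim$ for perspectivity, these two properties read

$$a \sim b \implies d(a) = d(b), \qquad a \le b \implies d(a) \le d(b),$$

and in von Neumann's continuous setting $d$ ranges over the whole unit interval $[0, 1]$ rather than a discrete set of values.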

The classical result that every projective geometry of dimension at least three is coordinatized by a division ring is known as the Veblen–Young theorem. Von Neumann extended this fundamental result in projective geometry to the continuous-dimensional case. As Birkhoff described it, this conclusion is the culmination of 140 pages of brilliant and incisive algebra involving entirely novel axioms: "Anyone wishing to get an unforgettable impression of the razor edge of von Neumann's mind, need merely try to pursue this chain of exact reasoning for himself—realizing that often five pages of it were written down before breakfast, seated at a living room writing-table in a bathrobe."

This work required the creation of regular rings. Many smaller technical results were proven during the creation and proof of the above theorems, particularly regarding distributivity (such as infinite distributivity), with von Neumann developing them as needed. He also developed a theory of valuations in lattices, and shared in developing the general theory of metric lattices.

Birkhoff noted in his posthumous article on von Neumann that most of these results were developed in an intense two-year period of work, and that while his interest in lattice theory continued afterwards, it became peripheral and mainly surfaced in letters to other mathematicians. He never wrote up the work for publication.

Von Neumann also made fundamental contributions to mathematical statistics. In 1941, he derived the exact distribution of the ratio of the mean square of successive differences to the sample variance for independent and identically normally distributed variables. Subsequently, Denis Sargan and Alok Bhargava extended the results for testing whether the errors of a regression model follow a Gaussian random walk (i.e., possess a unit root) against the alternative that they are a stationary first-order autoregression.
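
For a sample $x_1, \dots, x_n$, the statistic in question, now known as the von Neumann ratio, is

$$\eta = \frac{\frac{1}{n-1}\sum_{i=1}^{n-1}\left(x_{i+1} - x_i\right)^2}{\frac{1}{n}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2},$$

which is close to 2 for independent observations and falls below 2 when successive observations are positively correlated.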

In his early years, von Neumann published several papers related to set-theoretical real analysis and number theory. The first dealt with partitioning an interval into countably many congruent subsets. In another, he showed that there exists a perfect algebraically independent set of reals the size of the continuum.

Von Neumann was the first to establish a rigorous mathematical framework for quantum mechanics, known as the Dirac–von Neumann axioms, in his influential 1932 work Mathematical Foundations of Quantum Mechanics.

He realized that a state of a quantum system could be represented by a point in a complex Hilbert space that, in general, could be infinite-dimensional even for a single particle. In this formalism of quantum mechanics, observable quantities such as position or momentum are represented as linear operators acting on the Hilbert space associated with the quantum system.

The physics of quantum mechanics was thereby reduced to the mathematics of Hilbert spaces and linear operators acting on them. For example, the uncertainty principle, according to which the determination of the position of a particle prevents the determination of its momentum and vice versa, is translated into the non-commutativity of the two corresponding operators.
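
For position and momentum, this non-commutativity is expressed by the canonical commutation relation

$$[\hat{x}, \hat{p}] = \hat{x}\hat{p} - \hat{p}\hat{x} = i\hbar.$$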

Von Neumann's abstract treatment permitted him to confront the foundational issue of determinism versus non-determinism, and in the book he presented a proof that the statistical results of quantum mechanics could not possibly be averages of an underlying set of determined "hidden variables", as in classical statistical mechanics. In 1935, Grete Hermann published a paper arguing that the proof contained a conceptual error and was therefore invalid.

Bell made essentially the same argument in 1966. Jeffrey Bub, however, has suggested that von Neumann was aware of the proof's limited scope and did not claim that it completely ruled out hidden variable theories. Gleason's theorem of 1957 provided an argument against hidden variables along the lines of von Neumann's, but founded on assumptions seen as better motivated and more physically meaningful.

Von Neumann's proof inaugurated a line of research that ultimately led, through Bell's theorem and the experiments of Alain Aspect in 1982, to the demonstration that quantum physics either requires a notion of reality substantially different from that of classical physics, or must include nonlocality in apparent violation of special relativity.

In a chapter of Mathematical Foundations of Quantum Mechanics, von Neumann analyzed in depth the so-called measurement problem. He concluded that the entire physical universe could be made subject to the universal wave function.


Since something "outside the calculation" was needed to collapse the wave function, von Neumann concluded that the collapse was caused by the consciousness of the experimenter. He argued that the mathematics of quantum mechanics allows the collapse of the wave function to be placed at any position in the causal chain from the measurement device to the "subjective consciousness" of the human observer.

In other words, while the line between observer and observed could be drawn in different places, the theory only makes sense if an observer exists somewhere. Though theories of quantum mechanics continue to evolve, a basic framework for the mathematical formalism of problems in quantum mechanics underlying most approaches can be traced back to the mathematical formalisms and techniques first used by von Neumann.

Discussions about interpretation of the theory, and extensions to it, are now mostly conducted on the basis of shared assumptions about the mathematical foundations. Viewing von Neumann's work on quantum mechanics as a part of the fulfilment of Hilbert's sixth problem, mathematical physicist Arthur Wightman said that his axiomatization of quantum theory was perhaps the most important axiomatization of a physical theory to date.

With his book, quantum mechanics became a mature theory in the sense that it had a precise mathematical form, which allowed for clear answers to conceptual problems. Von Neumann entropy is extensively used in different forms (conditional entropy, relative entropy, etc.) in the framework of quantum information theory. Quantum information theory is largely concerned with the interpretation and uses of von Neumann entropy, which was a cornerstone of the field's development; the Shannon entropy plays the corresponding role in classical information theory.
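
For a quantum state described by a density operator $\rho$, the von Neumann entropy is

$$S(\rho) = -\operatorname{Tr}(\rho \ln \rho) = -\sum_i \lambda_i \ln \lambda_i,$$

where the $\lambda_i$ are the eigenvalues of $\rho$; it vanishes precisely on pure states and is maximal for the maximally mixed state.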

The formalism of density operators and matrices was introduced by von Neumann in 1927 and independently, but less systematically, by Lev Landau and Felix Bloch in 1927 and 1946 respectively. The density matrix allows the representation of probabilistic mixtures of quantum states (mixed states), in contrast to wavefunctions, which can only represent pure states.
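
A minimal numerical illustration of the distinction (a sketch using NumPy; the helper name and tolerance are ours): the pure state |0⟩⟨0| has zero von Neumann entropy, while the maximally mixed qubit state I/2 has entropy ln 2.

```python
import numpy as np

def von_neumann_entropy(rho):
    """Compute S(rho) = -Tr(rho ln rho) from the eigenvalues of rho."""
    eigenvalues = np.linalg.eigvalsh(rho)           # rho is Hermitian
    eigenvalues = eigenvalues[eigenvalues > 1e-12]  # treat 0 * ln 0 as 0
    return float(-np.sum(eigenvalues * np.log(eigenvalues)))

# Pure state |0><0|: a one-dimensional projection, entropy 0.
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])

# Maximally mixed qubit state I/2: entropy ln 2.
mixed = np.eye(2) / 2

print(von_neumann_entropy(pure))   # ~0.0
print(von_neumann_entropy(mixed))  # ~0.693 = ln 2
```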

The von Neumann measurement scheme, the ancestor of quantum decoherence theory, represents measurements projectively by taking into account the measuring apparatus, which is also treated as a quantum object. Von Neumann first proposed a quantum logic in his treatise Mathematical Foundations of Quantum Mechanics, where he noted that projections on a Hilbert space can be viewed as propositions about physical observables.
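
In this projective scheme, measuring an observable with spectral projections $P_i$ on a state $\rho$ yields outcome $i$ with probability $p(i) = \operatorname{Tr}(P_i \rho)$, after which the state becomes

$$\rho \mapsto \frac{P_i \rho P_i}{\operatorname{Tr}(P_i \rho)}.$$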

The field of quantum logic was subsequently inaugurated in a 1936 paper by von Neumann and Garrett Birkhoff, the first to introduce quantum logics, wherein they proved that quantum mechanics requires a propositional calculus substantially different from all classical logics and rigorously isolated a new algebraic structure for quantum logics.

The concept of creating a propositional calculus for quantum logic was first outlined in a short section of von Neumann's 1932 work, but in 1936, the need for the new propositional calculus was demonstrated through several proofs. For example, photons cannot pass through two successive filters that are polarized perpendicularly (e.g., one horizontally and the other vertically), yet if a third filter polarized diagonally is inserted between the two, some photons do pass through. The reason for this is that a quantum disjunction, unlike the case for classical disjunction, can be true even when both of the disjuncts are false, and this is in turn attributable to the fact that it is frequently the case in quantum mechanics that a pair of alternatives are semantically determinate, while each of its members is necessarily indeterminate.
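
A standard spin-1/2 illustration of such a disjunction (the concrete choice of propositions here is ours): let $q$ be the proposition "$S_z = +\tfrac{1}{2}$" and $r$ the proposition "$S_z = -\tfrac{1}{2}$". For a particle prepared in an eigenstate $p$ of $S_x$, the disjunction $q \vee r$ is true even though neither $q$ nor $r$ is true; and since $p \wedge q = p \wedge r = 0$ while $p \wedge (q \vee r) = p$, the distributive law

$$p \wedge (q \vee r) = (p \wedge q) \vee (p \wedge r)$$

fails.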

Consequently, the distributive law of classical logic must be replaced with a weaker condition. Nevertheless, he was never satisfied with his work on quantum logic. He intended it to be a joint synthesis of formal logic and probability theory, and when he attempted to write up a paper for the Henry Joseph Lecture he gave at the Washington Philosophical Society, he found that he could not, especially given that he was busy with war work at the time.

During his address at the International Congress of Mathematicians, he gave this issue as one of the unsolved problems that future mathematicians could work on.

Later, with Robert D. Richtmyer, von Neumann developed an algorithm defining artificial viscosity that improved the understanding of shock waves. When computers solved hydrodynamic or aerodynamic problems, they put too many computational grid points at regions of sharp discontinuity (shock waves).

The mathematics of artificial viscosity smoothed the shock transition without sacrificing basic physics. Von Neumann soon applied computer modelling to the field, developing software for his ballistics research. He provided Robert H. Kent, the director of the US Army's Ballistic Research Laboratory, with a computer program for calculating a one-dimensional model of molecules to simulate a shock wave.
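
In the von Neumann–Richtmyer scheme, a dissipative, pressure-like term is added only where the fluid is being compressed; schematically (our paraphrase of the formulation, with $c$ a tunable dimensionless constant),

$$q = \begin{cases} c\,\rho\,(\Delta x)^2 \left(\dfrac{\partial u}{\partial x}\right)^2, & \dfrac{\partial u}{\partial x} < 0,\\[4pt] 0, & \text{otherwise,} \end{cases}$$

which spreads a shock over a few grid cells while leaving smooth regions of the flow essentially untouched.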

While not as prolific in physics as he was in mathematics, he nevertheless made several other notable contributions. His pioneering papers with Subrahmanyan Chandrasekhar on the statistics of a fluctuating gravitational field generated by randomly distributed stars were considered a tour de force. He also wrote a paper with Abraham H. Taub and Oswald Veblen extending the Dirac equation to projective relativity, with a key focus on maintaining invariance with regard to coordinate, spin, and gauge transformations, as a part of early research into potential theories of quantum gravity in the 1930s.

Von Neumann founded the field of game theory as a mathematical discipline. His minimax theorem establishes that in zero-sum games with perfect information (i.e., games in which players know at each stage all moves that have taken place so far), there exists a pair of strategies allowing each player to minimize their maximum losses. Von Neumann showed that the players' minimaxes are equal in absolute value and contrary in sign. He improved and extended the minimax theorem to include games involving imperfect information and games with more than two players, publishing this result in his 1944 Theory of Games and Economic Behavior, written with Oskar Morgenstern.
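
For a two-player zero-sum game with payoff matrix $A$, in which the row player chooses a mixed strategy $x$ and the column player a mixed strategy $y$ (each a probability vector), the minimax theorem states

$$\max_{x} \min_{y} \; x^{\top} A y \;=\; \min_{y} \max_{x} \; x^{\top} A y,$$

so that neither player can gain by learning the other's strategy in advance.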

The public interest in this work was such that The New York Times ran a front-page story. Von Neumann's functional-analytic techniques—the use of duality pairings of real vector spaces to represent prices and quantities, the use of supporting and separating hyperplanes and convex sets, and fixed-point theory—have been primary tools of mathematical economics ever since.

Von Neumann raised the mathematical level of economics in several influential publications. For his model of an expanding economy, he proved the existence and uniqueness of an equilibrium using his generalization of the Brouwer fixed-point theorem. In this model, the transposed probability vector p represents the prices of the goods while the probability vector q represents the "intensity" at which the production process would run.
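
In one standard statement of the model (notation ours), with nonnegative input and output matrices $A$ and $B$ describing the goods consumed and produced by each process, an equilibrium consists of intensities $q$, prices $p$ and a growth factor $\lambda$ such that

$$q^{\top} B \ge \lambda\, q^{\top} A, \qquad B p \le \lambda\, A p,$$

so that production can expand by the factor $\lambda$ each period while no process earns excess profits.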

Von Neumann's results have been viewed as a special case of linear programming, where his model uses only nonnegative matrices. The study of his model of an expanding economy continues to interest mathematical economists. Von Neumann's interest in the topic began while he was lecturing at Berlin in 1928 and 1929. He noticed that Walras's General Equilibrium Theory and Walras's law, which led to systems of simultaneous linear equations, could produce the absurd result that profit could be maximized by producing and selling a negative quantity of a product.

He replaced the equations by inequalities, introduced dynamic equilibria, among other things, and eventually produced his paper. Building on his results on matrix games and on his model of an expanding economy, von Neumann invented the theory of duality in linear programming when George Dantzig described his work in a few minutes, and an impatient von Neumann asked him to get to the point.

Dantzig then listened dumbfounded while von Neumann provided an hourlong lecture on convex sets, fixed-point theory, and duality, conjecturing the equivalence between matrix games and linear programming. Later, von Neumann suggested a new method of linear programming, using the homogeneous linear system of Paul Gordan, an approach later popularized by Karmarkar's algorithm.
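
The geometric core of the method described next is a sequence of projections that drives a point in the convex hull of given columns toward the origin. Here is a minimal sketch of that idea (our reconstruction following later accounts, not von Neumann's own formulation; function names, the starting point and iteration count are ours):

```python
import numpy as np

def von_neumann_step(P, b):
    """One iteration: move b toward the column of P most opposed to it."""
    j = int(np.argmin(P.T @ b))   # column with smallest inner product with b
    p = P[:, j]
    if p @ b > 0:                 # b separates 0 from the hull: infeasible
        return None
    d = b - p
    lam = (b @ d) / (d @ d)       # line search: minimize ||(1-lam)*b + lam*p||
    lam = min(max(lam, 0.0), 1.0)
    return (1 - lam) * b + lam * p

def near_zero_hull_point(P, iters=10000):
    """Drive a convex combination of P's columns toward the zero vector."""
    b = P.mean(axis=1)            # start at the centroid of the columns
    for _ in range(iters):
        nxt = von_neumann_step(P, b)
        if nxt is None:
            return None           # 0 is not in the convex hull
        b = nxt
    return b

# Columns at (1, 0), (-1, 1), (-1, -1): their convex hull contains the origin.
P = np.array([[1.0, -1.0, -1.0],
              [0.0,  1.0, -1.0]])
b = near_zero_hull_point(P)
print(np.linalg.norm(b))          # small: approaches 0 as iters grows
```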

Von Neumann's method used a pivoting algorithm between simplices, with the pivoting decision determined by a nonnegative least-squares subproblem with a convexity constraint (projecting the zero vector onto the convex hull of the active simplex). Von Neumann's algorithm was the first interior point method of linear programming.

Von Neumann was a founding figure in computing, with significant contributions to computing hardware design, to theoretical computer science, to scientific computing, and to the philosophy of computer science.

His First Draft of a Report on the EDVAC, whose premature distribution nullified the patent claims of Eckert and Mauchly, described a computer that stored both its data and its program in the same address space, unlike the earliest computers, which stored their programs separately on paper tape or plugboards. This architecture became the basis of most modern computer designs.
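
A toy sketch of the stored-program idea (entirely illustrative; the instruction set and memory layout are our invention, not any von Neumann design): program and data occupy the same memory, so instructions can read and write the very cells that surround them.

```python
# A toy stored-program machine: code and data share one memory array.
# Instruction format: (opcode, operand).
def run(memory):
    pc = 0   # program counter indexes the same memory as the data
    acc = 0  # accumulator
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":    acc = memory[arg]       # read data from memory
        elif op == "ADD":   acc += memory[arg]
        elif op == "STORE": memory[arg] = acc       # write data to memory
        elif op == "JUMP":  pc = arg
        elif op == "HALT":  return acc

# The program occupies cells 0-3; its data lives in cells 4-5 of the same memory.
memory = [("LOAD", 4), ("ADD", 5), ("STORE", 4), ("HALT", 0), 2, 3]
print(run(memory))  # 5
```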

He arranged the financing of the IAS machine, and the components were designed and built at the RCA Research Laboratory nearby. Von Neumann recommended that the IBM 701, nicknamed the defense computer, include a magnetic drum. Von Neumann was the inventor, in 1945, of the merge sort algorithm, in which the first and second halves of an array are each sorted recursively and then merged.
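
A minimal sketch of the algorithm as just described (variable names ours): sort each half recursively, then merge the two sorted halves.

```python
def merge_sort(a):
    """Sort a list by recursively sorting each half and merging the results."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # sort first half
    right = merge_sort(a[mid:])   # sort second half
    # Merge: repeatedly take the smaller front element of the two halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]
```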

He also contributed to the development of the Monte Carlo method, which used random numbers to approximate the solutions to complicated problems. Von Neumann's algorithm for simulating a fair coin with a biased coin is used in the "software whitening" stage of some hardware random number generators. For the pseudorandom digits that Monte Carlo work needed, he used the crude but fast middle-square method, justifying it as faster than any other method at his disposal and writing that "Anyone who considers arithmetical methods of producing random digits is, of course, in a state of sin."
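
A minimal sketch of the fair-coin procedure (often called the von Neumann extractor; the helper names are ours): draw bits in pairs, output the first bit of each mixed pair, and discard equal pairs; the pairs 01 and 10 are equally likely whatever the bias.

```python
import random

def von_neumann_extractor(biased_bit, n):
    """Produce n unbiased bits from a biased 0/1 source."""
    out = []
    while len(out) < n:
        a, b = biased_bit(), biased_bit()
        if a != b:          # 01 and 10 each occur with probability p*(1-p)
            out.append(a)   # keep the first bit; discard 00 and 11 pairs
    return out

biased = lambda: 1 if random.random() < 0.9 else 0  # a source emitting 90% ones
bits = von_neumann_extractor(biased, 10000)
print(sum(bits) / len(bits))  # ~0.5
```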

Stochastic computing was introduced by von Neumann in the early 1950s, but it could not be implemented until the advances in computing of the 1960s.

Von Neumann's mathematical analysis of the structure of self-replication preceded the discovery of the structure of DNA. In lectures in 1948 and 1949, von Neumann proposed a kinematic self-reproducing automaton. He designed an elaborate 2D cellular automaton that would automatically make a copy of its initial configuration of cells.
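
Von Neumann's actual automaton used 29 cell states and is far too elaborate to reproduce here, but the mechanics it rests on are simple: every cell of a grid updates simultaneously as a function of its own state and its four orthogonal neighbors, a pattern now called the von Neumann neighborhood. A toy sketch (the growth rule shown is our invention, not von Neumann's):

```python
def step(grid, rule):
    """One synchronous update of a 2D cellular automaton.

    rule(center, north, south, east, west) -> new state.
    The five-cell pattern used here is the von Neumann neighborhood.
    """
    rows, cols = len(grid), len(grid[0])
    return [
        [
            rule(
                grid[r][c],
                grid[(r - 1) % rows][c],  # north (toroidal wrap-around)
                grid[(r + 1) % rows][c],  # south
                grid[r][(c + 1) % cols],  # east
                grid[r][(c - 1) % cols],  # west
            )
            for c in range(cols)
        ]
        for r in range(rows)
    ]

# Toy rule: a cell becomes 1 if it or any neighbor is 1.
grow = lambda c, n, s, e, w: 1 if (c or n or s or e or w) else 0
g = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
print(step(g, grow))  # [[0, 1, 0], [1, 1, 1], [0, 1, 0]]
```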

Considered to be possibly "the most influential researcher in scientific computing of all time", von Neumann made several contributions to the field, both technically and administratively. He developed the von Neumann stability analysis procedure, still commonly used to avoid errors from building up in numerical methods for linear partial differential equations.
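
As a textbook illustration of the procedure (this specific example is standard, not drawn from the article): for the explicit finite-difference scheme for the heat equation $u_t = \alpha u_{xx}$, substituting a Fourier mode $u_j^n = g^n e^{ikj\Delta x}$ gives the amplification factor

$$g(k) = 1 - 4\mu \sin^2\!\left(\frac{k\,\Delta x}{2}\right), \qquad \mu = \frac{\alpha\, \Delta t}{(\Delta x)^2},$$

and demanding $|g(k)| \le 1$ for every mode yields the stability condition $\mu \le \tfrac{1}{2}$.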

However, he was frustrated by the lack of progress with analytic methods for nonlinear problems. As a result, he turned towards computational methods. From this work von Neumann realized that computation was not just a tool for brute-forcing the solution to a problem numerically, but could also provide insight for solving problems analytically, and that there was an enormous variety of scientific and engineering problems towards which computers would be useful, the most significant of which were nonlinear problems.

Garrett Birkhoff described one of von Neumann's talks on the subject as "an unforgettable sales pitch". Von Neumann expanded this talk, with Goldstine, into the manuscript "On the Principles of Large Scale Computing Machines" and used it to promote the support of scientific computing. His papers also developed the concepts of inverting matrices, random matrices and automated relaxation methods for solving elliptic boundary value problems.

As part of his research into possible applications of computers, von Neumann became interested in weather prediction, noting similarities between the problems in the field and those he had worked on during the Manhattan Project. However, given his other postwar work, he was not able to devote enough time to proper leadership of the meteorology project, and little was accomplished.

This changed when the young Jule Gregory Charney took up co-leadership of the project from Rossby. The approach was to first try short-range forecasts, then long-range forecasts of those properties of the circulation that can perpetuate themselves over arbitrarily long periods of time, and only finally to attempt forecasts for medium-long time periods, which are too long to treat by simple hydrodynamic theory and too short to treat by the general principle of equilibrium theory.

Positive results from Norman A. Phillips in 1955 prompted an immediate reaction, and von Neumann organized a conference at Princeton on "Application of Numerical Integration Techniques to the Problem of the General Circulation". Once again he strategically organized the program as a predictive one, to ensure continued support from the Weather Bureau and the military, leading to the creation of the General Circulation Research Section (now the Geophysical Fluid Dynamics Laboratory) next to the JNWPU.

On the prospect of deliberate climate modification, von Neumann warned: "What could be done, of course, is no index to what should be done... In fact, to evaluate the ultimate consequences of either a general cooling or a general heating would be a complex matter. Changes would affect the level of the seas, and hence the habitability of the continental coastal shelves; the evaporation of the seas, and hence general precipitation and glaciation levels; and so on... But there is little doubt that one could carry out the necessary analyses needed to predict the results, intervene on any desired scale, and ultimately achieve rather fantastic results."

He also warned that weather and climate control could have military uses, telling Congress that they could pose an even bigger risk than ICBMs. Of the dangers of technology in general, he wrote: "This is a maturing crisis of technology... The most hopeful answer is that the human species has been subjected to similar tests before and it seems to have a congenital ability to come through, after varying amounts of trouble."

The first use of the concept of a singularity in the technological context is attributed to von Neumann, who according to Ulam discussed the "ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue".

Beginning in the late 1930s, von Neumann developed an expertise in explosions—phenomena that are difficult to model mathematically. During this period he was the leading authority on the mathematics of shaped charges, which led to a large number of military consultancies and consequently to his involvement in the Manhattan Project. The involvement included frequent trips to the project's secret research facilities at the Los Alamos Laboratory in New Mexico.

Von Neumann made his principal contribution to the atomic bomb in the concept and design of the explosive lenses that were needed to compress the plutonium core of the Fat Man weapon that was later dropped on Nagasaki. He also eventually came up with the idea of using more powerful shaped charges and less fissionable material to greatly increase the speed of "assembly".

When it turned out that there would not be enough uranium to make more than one bomb, the implosive lens project was greatly expanded and von Neumann's idea was implemented. Implosion was the only method that could be used with the plutonium available from the Hanford Site. His calculations also showed that the effectiveness of an atomic bomb would be enhanced by detonation some kilometers above the target, rather than at ground level.

Von Neumann was included in the target selection committee that was responsible for choosing the Japanese cities of Hiroshima and Nagasaki as the first targets of the atomic bomb. Von Neumann oversaw computations related to the expected size of the bomb blasts, estimated death tolls, and the distance above the ground at which the bombs should be detonated for optimum shock wave propagation.

The cultural capital Kyoto was von Neumann's first choice, a selection seconded by Manhattan Project leader General Leslie Groves. However, this target was dismissed by Secretary of War Henry L. Stimson. On July 16, 1945, von Neumann and numerous other Manhattan Project personnel were eyewitnesses to the first test of an atomic bomb detonation, which was code-named Trinity.

The event was conducted as a test of the implosion method device, at the Alamogordo Bombing Range in New Mexico. Based on his observation alone, von Neumann estimated the test had resulted in a blast equivalent to 5 kilotons of TNT (21 TJ), but Enrico Fermi produced a more accurate estimate of 10 kilotons by dropping scraps of torn-up paper as the shock wave passed his location and watching how far they scattered.

The actual power of the explosion was between 20 and 22 kilotons. Von Neumann continued unperturbed in his work and became, along with Edward Teller, one of those who sustained the hydrogen bomb project. He collaborated with Klaus Fuchs on further development of the bomb, and in 1946 the two filed a secret patent outlining a scheme for using a fission bomb to compress fusion fuel to initiate nuclear fusion.

The term von Neumann machine also refers to self-replicating machines. Von Neumann proved that the most effective way to perform large-scale mining operations, such as mining an entire moon or asteroid belt, is through the use of self-replicating machines, which take advantage of the exponential growth of such mechanisms. He also explored problems in other fields, such as numerical hydrodynamics.

Von Neumann had a mind of great ingenuity and near total recall. He was an extravert who loved drinking, dancing and having a good time. He had a fun-loving nature with a great love of jokes and humor.

In a machine with the von Neumann architecture, the program and data cannot be accessed at the same time, because they share a single path to memory; the data transfer rate is therefore smaller than the rate at which the CPU can work, limiting the CPU's effective processing speed.

The CPU is made to wait while the data it needs is moved to or from memory. This limitation is known as the von Neumann bottleneck; it can be avoided if data and program are transferred on separate buses, which is what the Harvard architecture does.