Chapter 26: An alternative to field theory?
Synopsis
The standard approach to a theory of the Universe assumes the existence of Minkowski space as the domain of physics. Quantum field theory accommodates special relativity by assuming the existence of a set of Hermitian operators defined at each point in spacetime. The alternative proposed here begins with an initial singularity, naked gravitation, formally identical to the structureless Christian God modelled by Aquinas. Abstract Hilbert space, the basis for quantum mechanics, develops in this singularity. The universal computational power of quantum mechanics, analogous to the power of the Turing machine, enables us to imagine that all processes in the Universe, including the construction of Minkowski space, are executed in this Hilbert space. This scenario may avoid difficulties like the cosmological constant problem, the appearance of unphysical infinities and the difficulty with the ontological interpretation of the theory noted by Kuhlmann. It also exploits the full power of quantum mechanics in Hilbert space as described by John von Neumann. John von Neumann (2014): Mathematical Foundations of Quantum Mechanics
Contents
26.1: Measurement and data collection
26.2: Horse first, or cart?
26.3: Does everything work by computation?
26.4: Observation is communication with the underworld
26.5: The representation of reality: theory and the entropy of explanation
26.6: Physics is an empirical foundation for theology
26.1: Measurement and data collection
In scientific investigations . . . it is permitted to invent any hypothesis, and if it explains various large and independent classes of facts, it rises to the rank of a well grounded theory. Charles Darwin (1875): The Variation of Animals and Plants Under Domestication
The last few centuries have seen a vast increase in our knowledge of our habitat, using the methods pioneered by Galileo: precise and detailed observation and measurement with carefully designed and constructed instruments to extend our senses.
Although field theory provides very accurate, if laborious, computations of the properties of the fundamental particles of the standard model, it has trouble with infinity. Feynman writes:
I think that the renormalization theory is simply a way to sweep the difficulties of the divergences of electrodynamics under the rug. I am, of course, not sure of that. Richard P. Feynman (1965): Nobel Lecture: The Development of the Space-Time View of Quantum Electrodynamics
Another senior dissenter was Paul Dirac, who wrote in his last paper:
The rules of renormalization give surprisingly, excessively good agreement with experiments. Most physicists say that these working rules are, therefore correct. I feel that that is not an adequate reason. Just because the results happen to be in agreement with observation does not prove that the theory is correct. Peter Goddard (1998): Paul Dirac, The Man and His Work, page 28
My last witness is the philosopher Meinard Kuhlmann, who concludes a review of quantum field theory with these words (2020):
In conclusion one has to recall that one reason why the ontological interpretation of QFT [Quantum Field Theory] is so difficult is the fact that it is exceptionally unclear which parts of the formalism should be taken to represent anything physical in the first place. And it looks as if that problem will persist for quite some time. Meinard Kuhlmann (Stanford Encyclopedia of Philosophy): Quantum Field Theory.
26.2: Horse first, or cart?
When Planck took the first step toward quantum mechanics in 1900, classical physics still worked in the 3D Euclidean space and independent time familiar to Newton. In 1905 Einstein introduced special relativity, which Minkowski later (1908) showed is best understood in 4D spacetime. Einstein also proposed that the light waves described by Maxwell and the packets of energy described by Planck are real particles. Albert Einstein (1905c): On a heuristic point of view concerning the production and transformation of light
This led to a debate about whether the world is best represented by particles or waves. In 1913 Niels Bohr, and later Louis de Broglie in 1923, showed that quantum observables represent standing waves or eigenvectors. Finally, in 1932, John von Neumann published his Mathematical Foundations of Quantum Mechanics, which established abstract Hilbert space as the natural home of quantum theory.
Standard QFT treats Hilbert space as a field on Minkowski space, which means that the quantum mechanical formalism in Hilbert space is subject to Lorentz transformations. This leads to trouble. The operators and vectors of quantum mechanics are linear, which means that they can be freely added to one another by superposition.
On the other hand, operations in Minkowski space are based on the Pythagorean theorem and are quadratic: the variables are squared. A linear function can be broken into pieces which simply add together: a(x + y) = ax + ay. A quadratic function does not come apart so easily: a(x + y)² = ax² + 2axy + ay². The trouble lies in the cross term, 2axy.
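To make the contrast concrete, here is a minimal numerical sketch (Python, with arbitrary values chosen purely for illustration) showing that a linear map distributes over a sum while a quadratic form picks up a cross term:

a = 3.0
x, y = 2.0, 5.0

# Linearity: the pieces simply add together.
print(a * (x + y), a * x + a * y)                # 21.0 21.0 — identical

# The quadratic case: the whole is more than the sum of the squared parts.
print(a * (x + y) ** 2)                          # 147.0
print(a * x ** 2 + a * y ** 2)                   # 87.0  — the squares alone fall short
print(a * x ** 2 + 2 * a * x * y + a * y ** 2)   # 147.0 — restored only by the cross term 2axy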
Here we try to simplify quantum theory and preserve its linearity by placing it beneath rather than on top of Minkowski space. This move frees the linear intelligence of quantum theory to play its role in evolution (see Chapter 14: Evolution and intelligence). Now, of course, we have to work out how to put special relativity back into physics! My idea is explained in Chapter 17: Gravitation + particles = Minkowski space where I see the metric of Minkowski space as a product of the properties of fermions and bosons.
26.3: Does everything work by computation?
Since Galileo's time it has been generally agreed that mathematics is the language of physics, and the two disciplines have enjoyed a very fruitful creative relationship.
Early in the twentieth century David Hilbert moved to free mathematics from its implicit connection with physics by promoting formalism. He treated mathematics as a symbolic game in which anything goes as long as it does not lead to a contradiction. Real infinities may not actually exist but, with the help of set theory, mathematical discussions about them can be logically consistent. Formalism (mathematics) - Wikipedia
Field theory tries to treat fields as continuous entities which nevertheless mediate interactions between point particles of zero size. My principal problem with field theory is that I understand geometrical continuity to be a mathematical ideal that does not represent anything real.
I am with Aristotle, who said things are continuous if they have ends in common. Classical computers meet this criterion (which I call logical continuity) by reading and writing at the same memory location. Quantum superposition works in a similar way, adding vectors linearly. In physics we assume all interactions are local, implemented by contact: massless bosons in Minkowski space get from A to B along a null geodesic, whose starting and ending points are separated by zero spacetime interval and so, in this sense, have an end in common.
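A small sketch of this claim (Python, with hypothetical coordinates chosen only for illustration): for a light-like connection the Minkowski interval between emission and absorption is exactly zero, which is the sense in which the two events share an end.

c = 299_792_458.0  # speed of light, m/s

def interval_squared(event_a, event_b):
    # Minkowski interval s² = Δx² + Δy² + Δz² − c²Δt² between events given as (t, x, y, z).
    dt = event_b[0] - event_a[0]
    dx = event_b[1] - event_a[1]
    dy = event_b[2] - event_a[2]
    dz = event_b[3] - event_a[3]
    return dx**2 + dy**2 + dz**2 - (c * dt)**2

emission   = (0.0, 0.0, 0.0, 0.0)   # a photon leaves A at the origin
absorption = (1.0, c, 0.0, 0.0)     # and arrives at B one light-second away, one second later
print(interval_squared(emission, absorption))   # 0.0 — a null geodesic: zero separation in spacetime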
The paradoxes that emerged in the set-theoretical foundations of mathematics motivated writers like Whitehead and Russell to try to develop a purely symbolic foundation that would establish mathematics as a branch of logic. This approach helped remove the paradoxes and provided the language for Gödel to express his incompleteness theorems and for Turing to express his theory of computation. Whitehead and Russell (1910): Principia Mathematica
This work established logical boundaries on formal mathematics which came as a surprise to Hilbert, who thought that consistent mathematics would be complete and computable. It also questions the applicability of mathematics to the real world. Is arithmetic embedded in bags of beans? Is quantum theory embedded in the visible world? Gödel's incompleteness theorems - Wikipedia, Universal Turing Machine - Wikipedia
26.4: Observation is communication with the underworld
The computational power of quantum theory suggests that all the action in the Universe happens in Hilbert space. When two billiard balls crack together or a bullet tears through a living brain, we must imagine that trillions of trillions of elementary particles are involved, communicating with one another to produce the macroscopic effect. What we see at our scale as macroscopic objects colliding is on the microscopic scale a complex process of conversation, not much different from a conversation between people, as discussed in Chapter 20: Measurement—the interface between Hilbert and Minkowski spaces.
Each of these particles is endowed with its own Hilbert space which controls its behaviour. When particles interact, the work is done in a shared memory known as the tensor product of the Hilbert spaces belonging to each particle. The historical laboratory origins of quantum theory have led us to call this interaction observation or measurement. If we are thinking in theological terms, measurement is like revelation, the invisible divinity sending a message into the visible world. Tensor product - Wikipedia
A crucial difference is that these messages are not coming from an omnipotent and all-seeing divinity in the heavens, but from minuscule local processes simply dealing with their local situations, like drops of water in an ocean.
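The shared memory mentioned above can be sketched numerically. A minimal example (Python with NumPy; the particular amplitudes are hypothetical, chosen only for illustration) builds the tensor product of two single-particle state vectors with np.kron:

import numpy as np

up   = np.array([1.0, 0.0])      # basis states of a two-dimensional single-particle Hilbert space
down = np.array([0.0, 1.0])

psi_a = (up + down) / np.sqrt(2) # particle A in an equal superposition
psi_b = down                     # particle B definitely 'down'

# The joint state of the interacting pair lives in the tensor product space,
# whose dimension is the product (2 × 2 = 4) of the individual dimensions.
psi_ab = np.kron(psi_a, psi_b)
print(psi_ab)                    # [0.  0.70710678  0.  0.70710678]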
26.5: The representation of reality: theory and the entropy of explanation
In a court of law we are required to tell the truth, the whole truth and nothing but the truth. In mathematical terms, our speech and the reality we are talking about must be related bijectively, each element in our speech corresponding to an element in reality: I saw A shoot B, and so on. In communication theory and thermodynamics, such a correspondence between speech and reality would be said to conserve entropy.
Einstein wrote:
A theory is the more impressive the greater the simplicity of its premises, the more different kinds of things it relates, and the more extended its area of applicability. Therefore the deep impression that classical thermodynamics made upon me. It is the only physical theory of universal content which I am convinced will never be overthrown, within the framework of applicability of its basic concepts. Albert Einstein (2000): Thermodynamics - Wikiquote
Entropy, which began as a thermodynamic term, is the simplest measure in science. It is just a count of states. Deterministic, reversible processes in nature must conserve entropy so that no information is lost. Science, in this respect, is a species of court. Here we imagine the Universe starting from a structureless initial singularity. It has no internal structure, only a single possible state, so its entropy is zero. Entropy - Wikipedia
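A minimal numerical illustration of entropy as a count of states, using Boltzmann's formula S = k_B ln Ω for equiprobable microstates (the particular values of Ω are arbitrary):

import math

k_B = 1.380649e-23   # Boltzmann constant, joules per kelvin

def entropy(omega: int) -> float:
    # Boltzmann entropy S = k_B ln Ω for Ω equiprobable microstates.
    return k_B * math.log(omega)

print(entropy(1))      # 0.0 — a system with a single possible state carries no entropy
print(entropy(2**10))  # ≈ 9.57e-23 J/K — ten binary degrees of freedom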
We expect a similar symmetrical relationship between explanation and reality in particle physics. Given the simplicity of the early universe, we are justified in seeking very simple symmetries. So at the heart of quantum electrodynamics and all the other fundamental theories we have the simple phase equations represented by the eigenvalue equation and the Born rule. Eigenvalues and eigenvectors - Wikipedia, Born rule - Wikipedia
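These two simple equations can be illustrated in a few lines. This sketch (Python with NumPy; the observable and state are hypothetical, chosen only for illustration) solves the eigenvalue equation A|v⟩ = λ|v⟩ for a Hermitian operator and then applies the Born rule to obtain the probability of each outcome:

import numpy as np

# A Hermitian observable (the Pauli-x matrix, chosen purely as an example).
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Eigenvalue equation: eigh returns real eigenvalues and orthonormal eigenvectors for Hermitian A.
eigenvalues, eigenvectors = np.linalg.eigh(A)

# A normalized state vector (hypothetical amplitudes).
psi = np.array([1.0, 0.0])

# Born rule: the probability of observing eigenvalue λ_i is |⟨v_i|ψ⟩|².
probabilities = np.abs(eigenvectors.conj().T @ psi) ** 2
for lam, p in zip(eigenvalues, probabilities):
    print(f"eigenvalue {lam:+.0f}: probability {p:.2f}")   # 0.50 each; the probabilities sum to 1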
Here I strike a problem with the Feynman rules. The magnetic moment of the electron appears to be defined in nature with absolute precision. Ideally we calculate its value with an infinite series of diagrams which we assume converges to the correct value. My question is: does nature sum this series in the very simple world of the electron, or does it know a trick, something like the way the young Gauss is said to have summed the integers from 1 to 100? The Feynman rules look like a high entropy procedure for a low entropy job. Prof Alan J. Barr: Feynman diagrams, Martinus Veltman (1994): Diagrammatica: The Path to the Feynman Rules
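The contrast can be made concrete. A short sketch (Python; the fine-structure constant is approximate, and Schwinger's term is quoted only as a familiar numerical landmark, not a full calculation) compares Gauss's low entropy closed form with term-by-term summation, alongside the first term of the electron's anomalous magnetic moment series:

import math

# Gauss's trick: 1 + 2 + ... + n collapses from n additions to the closed form n(n+1)/2.
n = 100
term_by_term = sum(range(1, n + 1))   # the high entropy way: one hundred additions
closed_form  = n * (n + 1) // 2       # the low entropy way: one multiplication and one division
print(term_by_term, closed_form)      # 5050 5050

# The lowest-order Feynman diagram gives only the first term of the series for the electron's
# anomalous magnetic moment, a ≈ α/2π (Schwinger's 1948 result); higher-order diagrams refine it.
alpha = 1 / 137.035999                # fine-structure constant (approximate value)
print(alpha / (2 * math.pi))          # ≈ 0.0011614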
This idea of reversibility between explanation and reality has implications for gravity. Gravity sees only energy and takes no notice of the actual particles or structures that carry the energy. It talks to everything equally. We might call this codeless communication. This simplicity qualifies it for consideration as the initial singularity (naked gravitation) within which the Universe has emerged.
As Einstein noted in his paper on the field equations of gravitation, they tell us nothing that was not already implicit in the special theory. He writes at the end of his paper:
With this, we have finally completed the general theory of relativity as a logical structure. The postulate of relativity in its most general formulation (which makes space-time coordinates into physically meaningless parameters) leads with compelling necessity to a very specific theory of gravitation that also explains the movement of the perihelion of Mercury. However, the postulate of general relativity cannot reveal to us anything new and different about the essence of the various processes in nature than what the special theory of relativity taught us already. Albert Einstein (1915): The Field Equations of Gravitation
Here we see the special theory, and the Minkowski space which it occupies, as a consequence of the quantum mechanical differentiation of particles into bosons and fermions. Minkowski space - Wikipedia
26.6: Physics is an empirical foundation for theology
In our evolving world old ideas, like old species, keep reproducing themselves as long as their selective environment does not change faster than they can evolve. If they cannot keep up, their population falls, leaving room for new species, new ideas.
In Galileo's time, the Catholic Church used political rather than scientific means to maintain orthodoxy, but the environment changed. Galileo's telescopes meant that reasonable people could no longer hold that the Sun revolves around the Earth. He escaped by recantation, but the Church took a severe blow and it remains in retreat before the evolution of science.
It is still making a last stand against modern ideas like the spiritual equality of women; it retains its commitment to autocracy and infallibility even though power now springs from below rather than from above. This is why dictators (the Pope?) fear democracy: people acting in their own interest.
Science also springs from below, from careful observation of the simple structures of the Universe. If we are to have a universal theology, it must be rooted in personal experience. Like the particles in a space without continuous fields, we must be autonomous, able to decide our own relationships with our local world. Imperial divine powers descending from the heavens can never have sufficient entropy to treat every particle in the Universe as an individual. Only the particles themselves can do that. This observation holds at all scales, from elementary particles to galaxies.
Copyright:
You may copy this material freely provided only that you quote fairly and provide a link (or reference) to your source.
Notes and references
Further reading
Books
Darwin (1875), Charles, and Harriet Ritvo (Introduction), The Variation of Animals and Plants Under Domestication (Foundations of Natural History), Johns Hopkins University Press, 1875/1998. 'The Variation, with its thousands of hard-won observations of the facts of variation in domesticated species, is a frustrating, but worthwhile read, for it reveals the Darwin we rarely see -- the embattled Darwin, struggling to keep his project on the road. Sometimes he seems on the verge of being overwhelmed by the problems he is dealing with, but then a curious fact of natural history will engage him (the webbing between water gun-dogs' toes, the absurdly short beak of the pouter pigeon) and his determination to make sense of it rekindles. As he disarmingly declares, "the whole subject of inheritance is wonderful." '
back |
Goddard (1998), Peter, and Stephen Hawking, Abraham Pais, Maurice Jacob, David Olive, and Michael Atiyah, Paul Dirac, The Man and His Work, Cambridge University Press, 1998. Jacket: 'Paul Adrien Maurice Dirac was one of the founders of quantum theory and the author of many of its most important subsequent developments. He is numbered alongside Newton, Maxwell, Einstein and Rutherford as one of the greatest physicists of all time.
This volume contains four lectures celebrating Dirac's life and work and the text of an address given by Stephen Hawking, which were given on 13 November 1995 on the occasion of the dedication of a plaque to him in Westminster Abbey. In the first lecture, Abraham Pais describes from personal knowledge Dirac's character and his approach to his work. In the second lecture, Maurice Jacob explains not only how and why Dirac was led to introduce the concept of antimatter, but also its central role in modern particle physics and cosmology. In the third lecture, David Olive gives an account of Dirac's work on magnetic monopoles and shows how it has had a profound influence in the development of fundamental physics down to the present day. In the fourth lecture, Sir Michael Atiyah explains the widespread significance of the Dirac equation in mathematics, its roots in algebra and its implications for geometry and topology.'
back |
Veltman (1994), Martinus, Diagrammatica: The Path to the Feynman Rules, Cambridge University Press 1994 Jacket: 'This book provides an easily accessible introduction to quantum field theory via Feynman rules and calculations in particle physics. The aim is to make clear what the physical foundations of present-day field theory are, to clarify the physical content of Feynman rules, and to outline their domain of applicability. ... The book includes valuable appendices that review some essential mathematics, including complex spaces, matrices, the CBH equation, traces and dimensional regularization. . . .'
back |
Links
Albert Einstein (1905c), On a heuristic point of view concerning the production and transformation of light, ' The wave theory of light, which operates with continuous spatial functions, has proved itself splendidly in describing purely optical phenomena and will probably never be replaced by another theory. One should keep in mind, however, that optical observations apply to time averages and not to momentary values, and it is conceivable that despite the complete confirmation of the theories of diffraction, reflection, refraction, dispersion, etc., by experiment, the theory of light, which operates with continuous spatial functions, may lead to contradictions with experience when it is applied to the phenomena of production and transformation of light.
Indeed, it seems to me that the observations regarding "black-body" light, and other groups of phenomena associated with the production or conversion of light can be understood better if one assumes that the energy of light is discontinuously distributed in space.' back |
Albert Einstein (1915), The Field Equations of Gravitation, ' In two recently published papers I have shown how to obtain field equations of gravitation that comply with the postulate of general relativity, i.e., which in their general formulation are covariant under arbitrary substitutions of space-time variables. . . . With this, we have finally completed the general theory of relativity as a logical structure. The postulate of relativity in its most general formulation (which makes space-time coordinates into physically meaningless parameters) leads with compelling necessity to a very specific theory of gravitation that also explains the movement of the perihelion of Mercury. However, the postulate of general relativity cannot reveal to us anything new and different about the essence of the various processes in nature than what the special theory of relativity taught us already. The opinions I recently voiced here in this regard have been in error. Every physical theory that complies with the special theory of relativity can, by means of the absolute differential calculus, be integrated into the system of general relativity theory-without the latter providing any criteria about the admissibility of such physical theory.'
back |
Albert Einstein (2000), Thermodynamics - Wikiquote, ' "A theory is the more impressive the greater the simplicity of its premises, the more different kinds of things it relates, and the more extended its area of applicability. Therefore the deep impression that classical thermodynamics made upon me. It is the only physical theory of universal content which I am convinced will never be overthrown, within the framework of applicability of its basic concepts."
Albert Einstein (author), Paul Arthur, Schilpp (editor). Autobiographical Notes. A Centennial Edition. Open Court Publishing Company. 1979. p. 31 [As quoted by Don Howard, John Stachel. Einstein: The Formative Years, 1879-1909 (Einstein Studies, vol. 8). Birkhäuser Boston. 2000. p. 1]' back |
Born rule - Wikipedia, Born rule - Wikipedia, the free encyclopedia, ' The Born rule (also called the Born law, Born's rule, or Born's law) is a law of quantum mechanics which gives the probability that a measurement on a quantum system will yield a given result. It is named after its originator, the physicist Max Born. The Born rule is one of the key principles of the Copenhagen interpretation of quantum mechanics. There have been many attempts to derive the Born rule from the other assumptions of quantum mechanics, with inconclusive results. . . . The Born rule states that if an observable corresponding to a Hermitian operator A with discrete spectrum is measured in a system with normalized wave function (see bra-ket notation), then
the measured result will be one of the eigenvalues λ of A, and
the probability of measuring a given eigenvalue λ_i will equal ⟨ψ|P_i|ψ⟩, where P_i is the projection onto the eigenspace of A corresponding to λ_i.' back |
Eigenvalues and eigenvectors - Wikipedia, Eigenvalues and eigenvectors - Wikipedia, the free encyclopedia, ' In linear algebra, an eigenvector or characteristic vector of a linear transformation is a nonzero vector that changes at most by a scalar factor when that linear transformation is applied to it. The corresponding eigenvalue, often denoted by λ, is the factor by which the eigenvector is scaled.
Geometrically, an eigenvector, corresponding to a real nonzero eigenvalue, points in a direction in which it is stretched by the transformation and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction is reversed. Loosely speaking, in a multidimensional vector space, the eigenvector is not rotated.' back |
Entropy - Wikipedia, Entropy - Wikipedia, the free encyclopedia, 'In statistical mechanics, entropy is an extensive property of a thermodynamic system. It is closely related to the number Ω of microscopic configurations (known as microstates) that are consistent with the macroscopic quantities that characterize the system (such as its volume, pressure and temperature). Under the assumption that each microstate is equally probable, the entropy S is the natural logarithm of the number of microstates, multiplied by the Boltzmann constant kB. Formally (assuming equiprobable microstates),
S = k_B ln Ω.' back |
Formalism (mathematics) - Wikipedia, Formalism (mathematics) - Wikipedia, the free encyclopedia, ' In foundations of mathematics, philosophy of mathematics, and philosophy of logic, formalism is a theory that holds that statements of mathematics and logic can be thought of as statements about the consequences of certain string manipulation rules.
For example, Euclidean geometry can be seen as a game whose play consists in moving around certain strings of symbols called axioms according to a set of rules called "rules of inference" to generate new strings. In playing this game one can "prove" that the Pythagorean theorem is valid because the string representing the Pythagorean theorem can be constructed using only the stated rules.' back |
Gödel's incompleteness theorems - Wikipedia, Gödel's incompleteness theorems - Wikipedia, the free encyclopedia, ' Gödel's incompleteness theorems are two theorems of mathematical logic that establish inherent limitations of all but the most trivial axiomatic systems capable of doing arithmetic. The theorems, proven by Kurt Gödel in 1931, are important both in mathematical logic and in the philosophy of mathematics. The two results are widely, but not universally, interpreted as showing that Hilbert's program to find a complete and consistent set of axioms for all mathematics is impossible, giving a negative answer to Hilbert's second problem.
The first incompleteness theorem states that no consistent system of axioms whose theorems can be listed by an "effective procedure" (i.e., any sort of algorithm) is capable of proving all truths about the relations of the natural numbers (arithmetic). For any such system, there will always be statements about the natural numbers that are true, but that are unprovable within the system. The second incompleteness theorem, an extension of the first, shows that such a system cannot demonstrate its own consistency.' back |
John von Neumann (2014), Mathematical Foundations of Quantum Mechanics, ' Mathematical Foundations of Quantum Mechanics by John von Neumann, translated from the German by Robert T. Beyer (New Edition), edited by Nicholas A. Wheeler. Princeton University Press, Princeton & Oxford.
Preface: 'This book is the realization of my long-held intention to someday use the resources of TEX to produce a more easily read version of Robert T. Beyer’s authorized English translation (Princeton University Press, 1955) of John von Neumann’s classic Mathematische Grundlagen der Quantenmechanik (Springer, 1932).'
back |
Meinard Kuhlmann (Stanford Encyclopedia of Philosophy), Quantum Field Theory, ' Quantum Field Theory (QFT) is the mathematical and conceptual framework for contemporary elementary particle physics. In a rather informal sense QFT is the extension of quantum mechanics (QM), dealing with particles, over to fields, i.e. systems with an infinite number of degrees of freedom. (See the entry on quantum mechanics.) In the last few years QFT has become a more widely discussed topic in philosophy of science, with questions ranging from methodology and semantics to ontology. QFT taken seriously in its metaphysical implications seems to give a picture of the world which is at variance with central classical conceptions of particles and fields, and even with some features of QM.' back |
Minkowski space - Wikipedia, Minkowski space - Wikipedia, the free encyclopedia, ' By 1908 Minkowski realized that the special theory of relativity, introduced by his former student Albert Einstein in 1905 and based on the previous work of Lorentz and Poincaré, could best be understood in a four-dimensional space, since known as the "Minkowski spacetime", in which time and space are not separated entities but intermingled in a four-dimensional space–time, and in which the Lorentz geometry of special relativity can be effectively represented using the invariant interval x² + y² + z² − c²t².' back |
Prof Alan J. Barr, Feynman diagrams, ' To calculate the probabilities for relativistic scattering processes we need to find out the Lorentz-invariant scattering amplitude which connects an initial state |Ψ_i⟩ containing some particles with well defined momenta to a final state |Ψ_f⟩ containing other (often different) particles also with well defined momenta.
We make use of a graphical technique popularised by Richard Feynman. Each graph – known as a Feynman Diagram – represents a contribution to M. This means that each diagram actually represents a complex number (more generally a complex function of the external momenta). The diagrams give a pictorial way to represent the contributions to the amplitude.' back |
Richard P. Feynman (1965), Nobel Lecture: The Development of the Space-Time View of Quantum Electrodynamics, Nobel Lecture, December 11, 1965: We have a habit in writing articles published in scientific journals to make the work as finished as possible, to cover all the tracks, to not worry about the blind alleys or to describe how you had the wrong idea first, and so on. So there isn’t any place to publish, in a dignified manner, what you actually did in order to get to do the work, although, there has been in these days, some interest in this kind of thing. Since winning the prize is a personal thing, I thought I could be excused in this particular situation, if I were to talk personally about my relationship to quantum electrodynamics, rather than to discuss the subject itself in a refined and finished fashion. Furthermore, since there are three people who have won the prize in physics, if they are all going to be talking about quantum electrodynamics itself, one might become bored with the subject. So, what I would like to tell you about today are the sequence of events, really the sequence of ideas, which occurred, and by which I finally came out the other end with an unsolved problem for which I ultimately received a prize.' back |
Tensor product - Wikipedia, Tensor product - Wikipedia, the free encyclopedia, ' In mathematics, the tensor product V ⊗ W of two vector spaces V and W (over the same field) is itself a vector space, endowed with the operation of bilinear composition, denoted by ⊗, from ordered pairs in the Cartesian product V × W to V ⊗ W in a way that generalizes the outer product.
Essentially the difference between a tensor product of two vectors and an ordered pair of vectors is that if one vector is multiplied by a nonzero scalar and the other is multiplied by the reciprocal of that scalar, the result is a different ordered pair of vectors, but the same tensor product of two vectors.
The tensor product of V and W is the vector space generated by the symbols v ⊗ w, with v ∈ V and w ∈ W, in which the relations of bilinearity are imposed for the product operation ⊗, and no other relations are assumed to hold. The tensor product space is thus the "freest" (or most general) such vector space, in the sense of having the fewest constraints.' back |
Universal Turing Machine - Wikipedia, Universal Turing Machine - Wikipedia, the free encyclopedia, 'Alan Turing's universal computing machine (alternately universal machine, machine U, U) is the name given by him (1936-1937) to his model of an all-purpose "a-machine" (computing machine) that could process any arbitrary (but well-formed) sequence of instructions called quintuples. This model is considered by some (for example, Davis (2000)) to be the origin of the stored program computer -- used by John von Neumann (1946) for his "Electronic Computing Instrument" that now bears von Neumann's name: the von Neumann architecture.
This machine as a model of computation is now called the Universal Turing machine.' back |
Whitehead and Russell (1910), Principia Mathematica, Jacket: 'Principia Mathematica was first published in 1910-1913; this is the fifth impression of the second edition of 1925-7.
The Principia has long been recognized as one of the intellectual landmarks of the century. It was the first book to show clearly the close relationship between mathematics and formal logic. Starting with a minimal number of axioms, Whitehead and Russell display the structure of both kinds of thought. No other book has had such an influence on the subsequent history of mathematical philosophy .' back |