Introducing the divinity of the Universe
Chapter 19: Quantization: the mathematical theory of communication

Synopsis

The Universe comprises many particles ranging in size from elementary particles to galaxies. From a communication point of view these particles are sources, entities capable of carrying, sending and receiving messages. The unity of the Universe is maintained by communication between these sources in visible classical networks organized by size (see Chapter 22: Network, cooperation and bonding). The stability of the system requires error-free communication. Claude Shannon found that errors could be eliminated by structuring messages into packets or quanta. History shows that the progress of human science and technology depends upon rediscovering systems that the Universe has already discovered through evolution, often many times, as with vision. It is therefore not surprising to see that all communication in the Universe is quantized.

Contents

19.1: Communication
19.2: The Shannon-Nyquist theorem for the digitization of continuous signals
19.3: Shannon's prescription for overcoming noise
19.4: Application of Shannon's work
19.5: Entropy and information
19.6: Superposition and quantum representation of information

19.1: Communication

The standard model of quantum mechanics comes in two parts. The first deals with the evolution through time of undisturbed quantum systems. We imagine that this situation exists in the initial singularity, in isolated particles created by the initial singularity and in the Universe as a whole. This process is essentially invisible, so our knowledge of it is speculative. The second describes the evolution of systems when they are disturbed by measurement, that is by an interaction with another quantum system. Here quantum theory differs radically from classical theory. Classical physics assumes that it is possible to observe a natural system without changing it in any way. In contrast, quantum theory imagines that an act of communication unites particles in a common Hilbert space where they evolve together, at least temporarily. Both seeing and being seen require action. We will explore this second mode in Chapter 20: Measurement: the interface between Hilbert and Minkowski spaces.

This chapter deals with ordinary classical communications in Minkowski space like phone calls and the internet. Here 'source' includes sender, message and receiver. Classical communication copies information from one point in spacetime to another; this may or may not involve the deletion of the initial information. The unavoidable flow of classical Minkowski space-time, illustrated by the light cone, shows that messages emitted from a particular source can only be received in the forward light cone of that source. Conversely, a source can only receive messages from its past light cone. This structure is determined by the velocity of light, the maximum rate at which physical information can be transmitted through Minkowski space by particles travelling on null geodesics. This is true from the point of view of particles and observers in Minkowski space, but in this book we assume that quantum mechanics describes a process underlying Minkowski space, so that quantum mechanics operates in a world before the advent of space-time in the emerging Universe. Kinetic quantum processes prior to the emergence of spacetime can partly evade this classical constraint through entanglement, but this process cannot transmit specific information. Here we are necessarily looking at Hilbert space from our position in Minkowski space.
We assume that the structure of the Universe is maintained by communications between its various components. If the Universe is to be stable, we further assume that these communications must be at least to some degree error free. The mathematical theory of communication developed by Shannon establishes that quantization and error prevention are very closely related.
On the basis of the discussion in Chapter 2, we assume that interactions between real physical particles in Minkowski space are one-to-one, and that the relevant Hilbert space during the real-time interaction of such particles is the tensor product of their individual Hilbert spaces. Tensor product - Wikipedia The so-called measurement problem which arises from this situation is the subject of Chapter 20: Measurement: the interface between Hilbert and Minkowski spaces.

19.2: The Shannon-Nyquist theorem for the digitization of continuous signals

Continuous modulated electromagnetic signals with limited bandwidth can be accurately represented as digital signals if they are sampled at twice the rate of their highest frequency. Since the upper limit of human hearing is about 20 000 Hz, digital audio is commonly sampled at 44.1 kHz. Nyquist-Shannon sampling theorem - Wikipedia In practical engineering terms, this enables every message and every signal to be identified as a point in a rational function space. Shannon provides an example of the size of the space necessary by estimating that one hour of a 5 MHz television signal would be represented by a point in a space of 3.6 × 10^10 dimensions. Claude Shannon (1949): Communication in the presence of noise
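As a minimal sketch of the arithmetic involved (Python, using the figures quoted above), the Nyquist rate and the dimension of Shannon's signal space follow directly from the bandwidth and the duration of the signal: a band-limited signal of bandwidth W observed for T seconds is fixed by 2WT samples.

# Sketch: Nyquist sampling rate and the dimension of Shannon's signal space.
# Figures are the illustrative ones from the text: 20 kHz audio, and a
# 5 MHz television signal observed for one hour.

def nyquist_rate(highest_frequency_hz):
    """Minimum sampling rate that captures a band-limited signal exactly."""
    return 2 * highest_frequency_hz

def signal_space_dimension(bandwidth_hz, duration_s):
    """Number of independent samples (dimensions) needed to specify the signal."""
    return 2 * bandwidth_hz * duration_s

print(nyquist_rate(20_000))                # audio: 40,000 samples per second
print(signal_space_dimension(5e6, 3600))   # television: 3.6e10 dimensions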
19.3: Shannon's prescription for overcoming noise

In communication terms, the message to be sent is a point in message space and the signal transmitted is a point in signal space. The role of the transmitter is to encode or map the message into the signal. The receiver does the opposite. The computational procedure that does this work is called a coder-decoder or codec. The error control is embodied in this mapping. Codec - Wikipedia

The key to the distinction of signals is: . . . two signals can be reliably distinguished if they differ by only a small amount, provided this difference is sustained over a long period of time. Each sample of the received signal then gives a small amount of statistical information concerning the transmitted signal; in combination, these statistical indications result in near certainty. The technique is to package or quantize the message and the signal to produce units extended in time which are clearly distinguishable: the transmitter takes long sequences of binary digits and represents each entire sequence by a particular signal function of long duration, that is a vector. A delay is required because the transmitter must wait for the full sequence before the signal is determined. Similarly, the receiver must wait for the full signal function before decoding it into binary digits.

The power of this approach is implicit in Shannon's Theorem 2: Let P be the average transmitter power, and suppose the noise is white thermal noise of power N in the band W. By sufficiently complicated encoding systems it is possible to transmit binary digits at the rate

C = W log2 ((P + N) / N)
with as small a frequency of errors as desired. It is not possible by any encoding method to send at a higher rate and have an arbitrarily low frequency of errors. He proves this theorem using geometric methods in rational function space. He concludes with a summary of the properties of a system that transmits without error at the limiting rate C, an ideal system. Some features of an ideal system, particularly quantization, are implicit in quantum mechanics.
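A short sketch of Theorem 2 in Python, with hypothetical channel parameters, makes the shape of the result visible: capacity grows linearly with the bandwidth W but only logarithmically with the signal-to-noise ratio P/N.

from math import log2

def channel_capacity(bandwidth_hz, signal_power, noise_power):
    """Shannon capacity C = W * log2((P + N) / N), in bits per second."""
    return bandwidth_hz * log2((signal_power + noise_power) / noise_power)

# Hypothetical figures: a 3 kHz channel with a signal-to-noise ratio of 1000
print(channel_capacity(3000, signal_power=1000, noise_power=1))   # about 29,900 bit/s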
19.4: Application of Shannon's work

The difficulty with Shannon's theory lies in the phrase 'by sufficiently complicated encoding systems'. In the seventy years since Shannon wrote, a large number of ingenious codecs have been developed which have established the internet as an all-purpose transporter of all forms of information, open and secret. The advent of powerful digital computers capable of executing non-linear and non-topological algorithms has taken care of the 'sufficiently complicated' aspect of the problem. In the error-free world of formal mathematics, noise is not a problem. In the physical world, however, things may not be so clear cut.

19.5: Entropy and information

In his Mathematical Theory of Communication Shannon framed the communication problem in terms of entropy, a concept derived from thermodynamics. Entropy - Wikipedia, Claude E Shannon (1948): A Mathematical Theory of Communication Shannon's paper begins with a definition of the entropy of a communication source. The entropy of a source A capable of emitting n different symbols or letters ai with probabilities pi such that Σi pi = 1 is:

H = −Σi pi log2 pi

Quantum systems described by algorithms such as the Schrödinger equation are constrained to evolve through time in a unitary and reversible manner. The Schrödinger equation defines an error-free communication channel which is nevertheless invisible to us. This process is interrupted when systems interact, just as computers in a network are interrupted when they are required to deal with an incoming message and the message and the computer begin a joint procedure. An important role of unitarity in quantum mechanics is to constrain the potential outcomes of quantum measurements so that they form a complete system of events, identical to the output of a communication source as defined by the entropy equation above. Collectively exhaustive events - Wikipedia

19.6: Superposition and quantum representation of information

Hilbert space is a vector space. All vectors are normalized so that when observed they retain the normalization of communication sources represented by the equation Σi pi = 1 above. The addition (superposition) of vectors produces a new vector whose direction is the sum of the directions of its components while its length remains 1. Information in Hilbert space is therefore carried by the direction of vectors. Since Hilbert space is linear, we can easily change the basis in which vectors are represented. This means that the directions of individual vectors are not significant but the angles between vectors are. The eigenvector of an operator is a vector whose direction or phase is unchanged by the operator. This value is observable, consistent with the idea that we can only observe stationary entities. The Born Rule extracts the distances between eigenvectors, say 1 and 2. These distances are complex probability amplitudes (see Chapter 13: The emergence of quantum mechanics). The squared modulus of this amplitude gives the probability of seeing eigenvector 1 if we look with eigenvector 2. Born rule - Wikipedia These probabilities associated with an interaction are discrete and precise. Their sum, 1, represents the outcomes of a complete set of events. There is no uncertainty associated with these values. The measurement process is uncertain insofar as it does not determine which particular eigenvalue will appear in the observation. We only have normalized probabilities to guide us. Nor is the precise timing of a particular event determined.
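The following sketch (Python, with a hypothetical qubit state chosen only for illustration) ties the two ideas together: the Born rule turns a normalized state vector into a complete set of event probabilities, and the entropy formula of Section 19.5 then measures that source.

from math import log2

def source_entropy(probabilities):
    """Shannon entropy H = -sum_i p_i * log2(p_i) of a complete set of events."""
    assert abs(sum(probabilities) - 1.0) < 1e-9   # probabilities must be normalized
    return -sum(p * log2(p) for p in probabilities if p > 0)

def born_probabilities(state, basis):
    """Born rule: the probability of each outcome is |<e_i|psi>|^2."""
    return [abs(sum(e_k.conjugate() * s_k for e_k, s_k in zip(e, state))) ** 2
            for e in basis]

# A hypothetical normalized qubit state and the standard basis
state = [3/5, (4/5) * 1j]          # (3/5)^2 + (4/5)^2 = 1
basis = [[1, 0], [0, 1]]

p = born_probabilities(state, basis)
print(p)                           # [0.36, 0.64] -- a complete set of events, summing to 1
print(source_entropy(p))           # entropy of this measurement treated as a source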
Nature, like us, pronounces its words with precision but uses them in random sequences. By tradition this feature of quantum measurement has come to be called the 'collapse' of the wave function, but a more reasonable explanation is simply that the measurement operation is an information source consistent with the demands of the theory of communication. This theory assumes that a communication source yields only one character of its alphabet at a time, and that the probabilities of observing these characters sum to 1. Quantum mechanics, like the theory of communication, is not concerned with specific messages but rather with the constraints on error-free communication of all possible messages established by the statistics of a particular source. We assume that every quantum communication involves a quantum of action, analogous to a packet of data on a communication link. We return to this story in Chapter 20: Measurement: the interface between Hilbert and Minkowski spaces.