The famous physicist Richard Feynman once said: “I think I can safely say that no one understands quantum mechanics.”
Some of the most puzzling topics in physics revolve around quantum theory. The most famous puzzles may be “Schrödinger’s cat” and the problem of information loss during the evaporation of black holes. Most physicists have grown used to this. There is no doubt that quantum theory is successful on a practical level, but serious conceptual problems arise when it is regarded not merely as a probabilistic tool for calculating the possible results of experiments, but as a basic description of the “outside world”.
The most basic problem is that quantum theory seems to be only about what we measure, not about what exists in the world. Some people might consider this a virtue, since the theory would then only represent our “information” about the world. However, this only makes sense if there is something in the world for us to be informed about; and in general, such information would have to be specified by quantum theory itself.
According to quantum theory, in a generic state of a system, properties such as the position or velocity of a particle have no well-defined value. This kind of indefiniteness is called “quantum uncertainty”, or “quantum fluctuation”. Standard textbook quantum theory involves two different rules for the evolution of the state of a physical system. The first is what the British mathematical physicist Roger Penrose calls the “U process”, represented by the Schrödinger equation. Given the current state of the system, it determines the state exactly at any time in the future (deterministic prediction) or in the past (it is fully reversible). However, this rule applies only as long as the system is not being “observed”.
The second rule comes into play when some property of the system is observed or measured. It is a random rule, which Penrose calls the “R process”. According to this rule, as a result of the measurement, the state of the system jumps to one of the states in which the property in question has a well-defined value. In general, this rule does not allow one to predict exactly which state will occur, nor to recover the state that existed before the measurement. One can use the R process to predict probabilities exactly, and to estimate the average value produced by a large number of repeated experiments, as well as the statistical dispersion of the results; the latter coincides numerically with the level of quantum uncertainty mentioned above.
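For readers who want to see the two rules in symbols, they can be sketched as follows (a standard textbook summary, quoted here for reference rather than taken from the article itself):

```latex
% U process: deterministic, reversible evolution governed by the
% Schrödinger equation, with Hamiltonian \hat{H}
i\hbar \,\frac{\partial}{\partial t}\,\lvert \psi(t) \rangle
  = \hat{H}\,\lvert \psi(t) \rangle

% R process: on measuring an observable with eigenstates |a_n>,
% the state jumps to one of them, with probability given by the Born rule
\lvert \psi \rangle \;\longrightarrow\; \lvert a_n \rangle
\quad \text{with probability} \quad
P(a_n) = \bigl|\langle a_n \mid \psi \rangle\bigr|^{2}
```

Note that the first rule is linear and time-reversible, while the second is probabilistic and irreversible; the tension between them is the measurement problem discussed below.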
Another problem is that quantum theory makes only vague claims about the nature of a world without observers. Does the theory require the participation of consciousness to make sense, and if so, does that include the consciousness of a mouse or a fly? In particular, what exactly counts as a measurement is also very vague in quantum theory, and almost irreparably so. Perhaps all we need is a large enough device, but how large is large enough? And what happens at the borderline? All of these issues together are known as the measurement problem. In practice, physicists usually ignore such conceptual difficulties.
The famous physicist David Bohm provides an exception. He rediscovered a theory initiated by Louis de Broglie and gave it a distinctive character: point particles have well-defined positions and velocities at all times, while the quantum state merely guides their evolution over time. (In this picture, a cat is never both dead and alive.) Another notable exception comes from researchers who advocate modified quantum theories, which unify the U process and the R process into a single rule, eliminating the need to introduce the concept of “measurement” at the fundamental level. In this case, too, the unfortunate Schrödinger cat is either dead or alive, even if no one observes it.
▲ David Bohm
This approach forms the basis of the theories of “spontaneous collapse”. Their characteristic feature is that small collapses, similar to spontaneous R processes, are triggered for all particles throughout space and time; no measurement is required. Another well-known proposal is the many-worlds interpretation, introduced by Hugh Everett, in which each measurement is linked to a branching of reality into branches resembling parallel worlds.
Careful analysis shows that these theories represent the three logically possible ways of dealing with the aforementioned problems: modify quantum theory by adding something beyond the quantum state (hidden-variable theories such as the de Broglie–Bohm theory); modify the rules of state evolution so that measurement-like events occur at all times (such as the spontaneous collapse theories); or eliminate the R process altogether (as in the many-worlds interpretation).
Many quantum physicists believe that this problem, and the approaches one might take to it, have nothing to do with the challenges in their own field. But a few researchers hold a completely different view, believing that “spontaneous collapse” is the most promising path for resolving some of the most serious difficulties encountered in understanding the laws of the universe, especially in situations that must involve both gravity and quantum theory.
Inflation and measurement
The study of the inflationary period is one of the central topics of cosmology. Scientists believe that inflation occurred very shortly after the Planck era. The Planck era, thought to be the earliest period in the history of the universe, from 0 to about 10^-43 seconds, is itself extraordinary: in it, quantum gravity should play the leading role, and the very concept of space-time may no longer be relevant or useful. (Quantum gravity is a theory that would harmoniously combine the basic principles of general relativity, our theory of gravity, with those of quantum theory.) During inflation, by contrast, the usual concepts of time and space are considered adequate. Moreover, gravity is considered to be well described by general relativity, while matter can be described by the same kinds of theories we use in conventional particle physics, for instance in experiments at the Large Hadron Collider of the European Organization for Nuclear Research (CERN) or in studies of high-energy cosmic rays.
The main difference is that the dominant matter during inflation is thought to be the so-called inflaton field. The inflaton field is a bit like the electromagnetic field, but much simpler, because the inflaton has no fixed direction or spin. The main feature of the inflationary period is that, owing to the gravitational action of the inflaton field, the universe expands extremely rapidly (by a total factor of at least 10^30). As a result, the spatial curvature of the universe is driven to zero, and all deviations from perfect uniformity and isotropy are diluted almost completely (any remaining deviations, of order 10^-90, are so small that they can simply be taken to be zero).
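As a rough consistency check (a back-of-the-envelope sketch, not part of the article’s own argument), the quoted numbers fit together as follows: an expansion factor of 10^30 corresponds to about 69 “e-folds” of inflation, and since spatial curvature dilutes as the inverse square of the scale factor a while the density of ordinary matter dilutes as its inverse cube, pre-existing deviations are suppressed by factors of order 10^-60 and 10^-90 respectively:

```latex
% number of e-folds for a total expansion factor of 10^{30}
N = \ln\!\bigl(10^{30}\bigr) \approx 69

% dilution of curvature and of matter density under expansion by a factor a
\Omega_k \propto a^{-2} \;\Rightarrow\; \text{suppression} \sim 10^{-60},
\qquad
\rho_{\text{matter}} \propto a^{-3} \;\Rightarrow\; \text{suppression} \sim 10^{-90}
```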
At the end of the inflationary period, the inflaton field decays, and the universe is filled with all the matter that can be seen today: the ordinary matter that constitutes us, the Earth and the solar system; the more exotic matter that scientists can produce for fractions of a second with powerful particle accelerators such as those at CERN; and even the elusive dark matter, which seems to constitute the vast majority of the mass of galaxies and galaxy clusters. In other words, the universe after the end of inflation should fit the description given by the earlier, more traditional and more empirical Big Bang theory: an expanding universe filled with a thermal plasma of various particles, whose respective abundances are determined mainly by thermodynamic factors. The universe gradually cooled as it expanded, forming the first light atomic nuclei (when the temperature dropped to about 1 billion kelvin) and, much later, the first atoms (at about 3,000 kelvin). It is at this latter stage that the photons of the cosmic microwave background radiation were released.
In the small variations of the temperature pattern of the cosmic microwave background, we can see the imprint of the primordial deviations from uniformity and isotropy, which have continued to grow ever since, producing the galaxies, stars and planets of the present universe. The point is that the universe has long been inhomogeneous and anisotropic. According to inflation theory, on the other hand, the violent expansion of the universe completely dilutes all inhomogeneities (differences between different places in space) and anisotropies (differences between different directions), leaving the space-time and the inflaton field in a completely uniform and isotropic state.
So where do the inhomogeneities that led to the formation of all cosmic structure, and whose imprint we see in the cosmic microwave background, come from? According to the current orthodox theory of cosmology, they originated in the “quantum fluctuations” of the inflaton field and the space-time metric during the inflationary period. In fact, during inflation the field settles into a particular quantum state, the so-called “Bunch–Davies vacuum”. This state, like the vacuum state in flat space-time, is 100% uniform and isotropic; yet we are supposed to regard the quantum uncertainty of this state as the origin of the inhomogeneity of the universe today.
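A standard result of inflationary theory (assumed here for illustration, not stated in the article) quantifies this uncertainty: even in the perfectly uniform Bunch–Davies vacuum, the inflaton field φ has quantum fluctuations whose typical amplitude, for modes crossing the horizon, is set by the Hubble expansion rate H during inflation:

```latex
% typical amplitude of inflaton fluctuations at horizon crossing
\sqrt{\bigl\langle \delta\phi^{2} \bigr\rangle} \;\sim\; \frac{H}{2\pi}
```

It is this uncertainty, uniform and isotropic in the quantum state itself, that the orthodox account must somehow convert into the definite, position-dependent density variations we observe.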
Most cosmologists see no problem at this point, because it is easy to conflate “quantum uncertainty” with “statistical dispersion” (in both cases the word “fluctuation” tends to obscure the conceptual difference). However, this identification is reasonable only in connection with measurements. The point is that, according to the R process, a measurement may indeed change the state of the system, leaving it no longer as uniform and isotropic as the initial state.
So what could have played the role of a measurement in the early universe, before the formation of galaxies, planets and conscious life? Some cosmologists would answer that it is we who are making the necessary measurements today, with our satellites. A little thought reveals the problem with this view: human beings and their measuring equipment would then be the cause of the breaking of the perfect uniformity of the early universe, and hence of the formation of cosmic structure (galaxies, stars, planets and so on), which is in turn a necessary condition for the appearance of life (which calls itself “intelligent”)! To some extent, we would be the cause of our own existence! This is reminiscent of the old country song “I’m My Own Grandpa”.
Spontaneous collapse
After considering the existing paths to solving this “grandfather” problem, researchers such as Daniel Sudarsky, a professor at the Institute of Nuclear Sciences of the National Autonomous University of Mexico, proposed adding a new element: spontaneous collapse of the quantum state of the inflaton field. This is a version of the R process that happens continuously and usually produces small, random changes in the quantum state of the inflaton field. The randomness of this process can explain the breakdown of uniformity and isotropy in the early universe without invoking any observers or measuring equipment. Moreover, if the spontaneous collapse meets some simple requirements, the resulting predictions for these inhomogeneities reproduce the temperature distribution characteristics seen in the cosmic microwave background.
▲ Spontaneous collapse may be a way to solve the measurement problem, the black hole paradox and other quantum puzzles
At first, this new approach did not seem to lead to any major deviations from the standard predictions. But in at least one respect the two sets of predictions differ greatly. According to the standard treatment, the prediction for the uneven density of matter in the universe is inseparably linked to a similar prediction for so-called primordial gravitational waves. These are akin to the gravitational waves produced by collisions of black holes and neutron stars observed by the Laser Interferometer Gravitational-Wave Observatory (LIGO) and the Virgo detector. The difference is that primordial gravitational waves would now be very weak, and their existence could only be detected through a specific type of anisotropy they would imprint on the polarization of the cosmic microwave background radiation.
Physicists have long been keen to find these primordial gravitational waves, because they believe this could be the main evidence for the correctness of inflation theory. However, the signals of these gravitational waves have not yet been detected, which is considered one of the serious problems facing inflationary cosmology. The failure to detect them has already excluded the simplest and most attractive inflationary models.
In the approach of Sudarsky and his colleagues, the predicted generation of primordial gravitational waves is significantly reduced, so that they would not be detectable with current methods and detectors. Calculations show that the primordial gravitational waves could be detected only if the sensitivity were greatly improved and the focus shifted from very small angular scales to very large angular scales on the sky. Unfortunately, both of these things are quite difficult to do. Thus, quite unexpectedly, when Sudarsky and others began to think the matter through conceptually, the specific predictions of inflationary cosmology changed dramatically, and the new predictions turned out to be more consistent with the existing empirical evidence.
Black hole and quantum gravity
The conceptual difficulties of quantum theory are also relevant to black holes. General relativity predicts that once a black hole forms, a singularity will develop inside it, that is, a region where geometric quantities are nominally infinite and the curvature diverges as the region is approached. The nature of such singularities has prompted all kinds of speculation; some people even believe that they represent the appearance of stranger objects, or even entrances to other universes. What they really indicate, however, is a regime in which general relativity ceases to apply.
In other words, if we want to apply general relativity, we must introduce a certain boundary, one that excludes the regions where singularities would appear.
Physicists generally believe that our current theories should be replaced by a deeper theory, one that includes general relativity and quantum mechanics and combines them in a smooth and self-consistent way: a theory of quantum gravity. Such a quantum gravity is expected to “resolve” the singularities and eliminate the need to include boundaries in discussions involving black holes. These are the least speculative expectations; they do not involve entrances to other universes or other extremely strange objects appearing at the location of the singularity.
The physicist Jacob Bekenstein first pointed out a feature of black holes, and took it as a basic clue, namely that their exchange of energy with the outside is governed by laws that seem to be the same as the laws of thermodynamics. In particular, as Stephen Hawking showed, black holes lose energy by releasing thermal radiation, and they possess an entropy (common to all thermodynamic systems) amounting to one quarter of Boltzmann’s constant for each unit of area, with side length equal to the Planck length, needed to cover the black hole’s horizon. This insight has aroused great interest in the academic community over the past few decades, as physicists began to consider various ways of constructing a theory of quantum gravity; such a theory should, of course, explain the black hole entropy formula. Fairly quickly, in slightly different but always rather restricted contexts, proponents of quantum gravity found reasonably suitable explanations.
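In formulas (the standard textbook expressions, quoted here for reference), the Bekenstein–Hawking entropy of a black hole with horizon area A, and the Hawking temperature of a black hole of mass M, read:

```latex
% one quarter of Boltzmann's constant k_B per Planck area l_P^2 of horizon
S_{\mathrm{BH}} = \frac{k_B\,A}{4\,l_P^{2}},
\qquad
l_P = \sqrt{\frac{\hbar G}{c^{3}}}

% Hawking temperature of a black hole of mass M
T_H = \frac{\hbar\,c^{3}}{8\pi\,G\,M\,k_B}
```

Note that the temperature decreases as the mass grows, so large black holes evaporate extremely slowly.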
But in fact this kind of analysis, which begins with Hawking’s discovery and involves quantum theory, raises another problem that has long plagued physicists: the so-called “paradox” of black hole information, which has been the focus of intense debate and disagreement among physicists.
▲ The black hole paradox: if a black hole evaporates completely, leaving only thermal radiation, then it seems impossible to encode all the information needed to reconstruct the exact quantum state of the matter that originally produced the black hole
The usual account goes like this. According to quantum theory, the quantum state of an isolated physical system provides a complete description of the system. The evolution of this state obeys an evolution law that allows the corresponding state to be predicted exactly at any time in the future, or reconstructed at any time in the past. On the other hand, a black hole with a given mass and angular momentum can be formed in many ways. If the black hole evaporates completely and only thermal radiation is left (whose characteristics can be described very simply), then there seems to be no way to encode all the information needed to trace back the exact quantum state of the matter that produced the black hole. Therefore, from the details of the final state, it is impossible to reconstruct the detailed state of the black hole when it was originally formed, which conflicts with this feature of the evolution law of quantum theory. For many people, this shows that we face a “paradox”.
A closer look reveals that things are not that simple (which is why the word “paradox” is placed in quotation marks). The point is that, according to quantum theory, the claim that we should be able to trace back the detailed state of the black hole when it first formed is wrong: one reaches that conclusion only by paying attention exclusively to the U process and completely ignoring the R process. Thus, thinking about the problems of black hole evaporation and the fate of information becomes connected with the resolution of the measurement problem.
Spontaneous collapse is one of the most attractive solutions to the measurement problem. Beginning in 2015, with the help of simplified models, Sudarsky and his colleagues carefully analyzed whether such a theory could completely resolve this problem in the context of black hole evaporation. So far, their analysis indicates that the answer is yes, provided that the rate of spontaneous collapse increases with the curvature of space-time. If this is the case, the erasure of information, normally a tiny effect of spontaneous collapse, becomes sufficiently efficient deep inside the black hole, where the curvature grows large; this would explain why, when the black hole has completely evaporated, all the information seems to have been erased.
Next, this work will continue to sort out unresolved questions about the exact form of the theory and its details, and to find other situations in which these ideas can be tested. Although the matter is far from settled, it is possible that Schrödinger’s cat, the black hole information problem and some of the puzzles of inflationary cosmology can all be resolved by taking spontaneous collapse into account. Sudarsky and others have recently found that this approach may also help to answer other questions, including why the initial state of the universe had such low entropy, and how to understand the nature and magnitude of dark energy. The use of spontaneous collapse theories in problems involving gravity thus seems a promising and exciting research path.