
 

Lodz, Poland, March 2011

Please copy this page to your computer. Then you will be able to check if my predictions come true.

Z. A. Nowacki

 Superluminal Physics 

 Zbigniew Andrzej Nowacki

Third time's a charm


 

Let us consider the consequences implied by the existence of superluminal signals (i.e., signals propagating at speeds exceeding c, the speed of light in vacuum) in physics. At first sight, it might seem that Einstein's theory of special relativity can remain untouched, since our signal encapsulation principle allows Lorentz transformations to work normally. Actually, the matter is not so simple. If we have two theories such that a physical phenomenon leads to a contradiction in exactly one of them, we know that:

(a) The theories are different.

(b) The theory with contradictions is erroneous.

Let us note that at present there are as many as three theories based on Lorentz transformations. The first of them was developed by Lorentz himself, who, until his death in 1928, did not accept special relativity. In his theory the transformations were not to be considered physical; rather, they were mathematical aids to calculation. Therefore, Lorentz was able to explain only one relativistic effect, length contraction, while Einstein's theory also predicted time dilation. On the other hand, the theory outlined in Quantum Nonlocality... is the most complete, because it works with three main relativistic effects: Lorentz-Fitzgerald contraction, time dilation, and signal encapsulation. All three theories satisfy the principle of relativity that Poincare formulated long ago: in all inertial frames of reference the laws of physics have the same form.

One may say that special relativity consists of two components: Lorentz transformations and Minkowski space-time. The former are very well supported experimentally, while the latter reflects the assumption that nothing can travel faster than c; the existence of faster-than-light signals therefore leads to contradictions in the latter. Now, the theory of Quantum Nonlocality... preserves Lorentz transformations, rejects the physical existence of Minkowski space-time, and recognizes signal encapsulation, which protects us against the anomalies implied by the superluminal transfer of information.
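
For reference, the transformations in question have the standard textbook form for a boost with velocity v along the x-axis; nothing in this block goes beyond ordinary special relativity:

    \[
    x' = \gamma\,(x - vt), \qquad
    t' = \gamma\left(t - \frac{vx}{c^2}\right), \qquad
    y' = y, \qquad z' = z, \qquad
    \gamma = \frac{1}{\sqrt{1 - v^2/c^2}} .
    \]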

The liberation of information

So much for the conceptual and practical differences between theories based on Lorentz transformations. Now for Einstein's second theory, i.e., general relativity. Its essence can be briefly summarized in the following way:

(1) Nothing can travel faster than light in vacuum, also in the presence of a gravitational field.

(2) Thus the propagation of light determines the geometry of the space-time with gravitation.

(3) But the propagation of light depends on gravitation.

(4) Thus gravitation determines the geometry of the space-time.

(5) But gravity depends on the distribution of matter and radiation.

(6) Thus the distribution of matter and radiation determines the geometry (curvature) of the space-time.

Note that (1) is analogous to the fact that a mountain stream flows along the shortest path, which determines (is closely connected with) the geometry of the terrain; in space-time, light plays the role of the water and the greatest speed plays the role of the shortest path, and in this way we get (2). And if (1) is false, the whole reasoning fails.

More precisely, in textbooks of general relativity one may find a proof that it is impossible to synchronize clocks in the presence of a non-uniform gravitational field. However, the proof is false, because Einstein and others assumed that the synchronization requires a transfer of energy, which in reality is inessential. Indeed, this process can be performed (cf. Section 10 of Quantum Nonlocality...) with the use of signals carrying solely information - information that has, for the first time in the history of humanity, been liberated from its combination with energy. And it is precisely this freed information that kills general relativity.

Let us recall that Einstein obtained (3) by predicting the deflection of light rays emitted in a gravitational field. Hence he inferred correctly that his theory of special relativity could be valid only in the absence of gravitation. Einsteinian physics therefore has to consist of two theories, one without and one with gravity, and precisely this is wrong. On the other hand, (3) ceases to be true if we substitute 'information' for 'light'. We see that our theory of Lorentz transformations is able to work with gravity, owing to signal encapsulation and superluminal signals. In this manner, the gravitational field becomes a normal field similar, e.g., to the electromagnetic one.

An important remark, not only for experts, is in order here. It is known that numerous efforts to formulate a theory of quantum gravity unifying quantum physics and general relativity have faced a major obstacle: Einstein's theory is, owing to (6), background independent (there is no a priori geometric structure assigned to space-time; everything is determined a posteriori), whereas quantum field theory describes the dynamics of particles in flat Minkowski space-time and is therefore automatically background dependent. We see that in superluminal physics this problem disappears completely; gravitation and the quantum phenomena underlying the experiments of Quantum Nonlocality... can be considered on a par.

Geometry of space-time

At this point you might say: I agree that superluminal signals give more opportunities, but I wonder whether they all are used by Nature. Could we send information with superluminal speeds and develop the theory of general relativity at the same time?

I am sorry, but this is impossible. There are many reasons for this; we will discuss only one here: Einstein's theory leads to a wrong geometry of space-time. Although by virtue of (6) that geometry is determined a posteriori, there is one property that is assumed a priori. Namely, it has to be non-Euclidean, because locally it is required to be consistent with the geometry of Minkowski space-time. However, the latter cannot exist physically, even when limited to the vicinity of an event. Indeed, the experiments of Quantum Nonlocality... can be performed even in a small lab with an almost uniform gravitational field.

You may still ask whether general relativity could be saved by the fact that our superluminal signals do not carry energy, while the Einstein equations contain no 'information tensor'. However, this changes nothing, because the most characteristic feature of Minkowski space-time is that every punctual event has a future and a past light-cone. In our framework they vanish, since, using the devices of Quantum Nonlocality..., a human or a computer is able to take an action after receiving a message transmitted with a velocity in excess of c. That is to say, a causal connection between any two events is physically possible, which yields Euclidean space-time.
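
To make the geometric contrast explicit, one can compare the standard Minkowski line element with a Euclidean one. In the former, two events can be causally connected only if their interval is timelike or null (ds^2 <= 0); in the latter there is no such sign condition, which is what the argument above appeals to:

    \[
    ds^2_{\text{Minkowski}} = -c^2\,dt^2 + dx^2 + dy^2 + dz^2, \qquad
    ds^2_{\text{Euclidean}} = c^2\,dt^2 + dx^2 + dy^2 + dz^2 .
    \]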

Finally, one might ask: what about the many successes of general relativity? Well, Newton's laws were at least equally successful, but at present nobody maintains that they are entirely correct. The point is that even a single counterexample is sufficient to rebut a theory. On the other hand, we can never definitively confirm a theory, because there is always an infinite number of theories that explain the data obtained in a finite number of experiments.

Of course, general relativity (like Newton's laws) can still be applied in some areas. However, we should begin to realize that it is not a genuine theory of Nature. That we now know for sure.

Over 300 years ago Newton assumed that a signal could always be transmitted from an earlier event to a later one. Over 200 years later Einstein postulated that there was a maximal finite speed of signals. We had to wait another 100 years for the formulation of the signal encapsulation principle. Each of the three approaches provides a different geometry of space-time.

Note that Newtonian and Euclidean space-times are distinct, albeit very similar. In the former, two simultaneous events cannot be causally connected, for otherwise we could get a contradiction. (The role of 'metric' in Newtonian space-time can be played by the speed of a signal between events; causal connection requires a finite value.) Only superluminal physics implements Euclidean space-time exactly (the symmetric metric may be simple distance), because absolute simultaneity, owing to Lorentz transformations, does not exist. And the signal encapsulation principle ensures consistency.

You might be surprised that Lorentz transformations play a major role in the construction of Euclidean space-time. Nevertheless, this is a fact. It means that, in a sense, time becomes equivalent to spatial dimensions. 

Tiny Bang

In the mid-nineteenth century mathematicians knew only a few non-Euclidean geometries, and they were genuinely fascinated by the new objects. The fascination ended at the turn of the nineteenth and twentieth centuries, when a new branch of mathematics, topology, was formulated. Mathematicians realized that there were infinitely many topological spaces, which could be created practically on demand. (One of the fathers of topology was the mathematician and physicist Poincare, which may explain why he was so skeptical of Einstein's theory.) Euclidean spaces were no longer unique, but they gained another very important property: universality. This means that other spaces can usually be embedded in Euclidean ones. In physics this corresponds to the fact that there are infinitely many universes embedded in the Euclidean space-time of the whole of Nature.

Each universe originates from an event akin to an explosion. It is called 'the Big Bang' by the people living in that universe (especially before one of them discovers superluminal signals). However, once they know about the other universes, 'the Tiny (or Small) Bang' becomes a more appropriate name. Indeed, on the scale of the entire Nature it was an insignificant event.

More precisely, the theory of the Big Bang and that of the Tiny Bang differ in only one parameter: the degree of isolation. In the former it has to be constantly equal to 1 (which means perfect isolation), whereas in the latter the parameter is able to change with time: it grows from zero but always remains less than unity. This is caused by the fact that no physical system can be completely isolated from the rest of the world (and this may be the most important law of physics).
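
Purely as an illustration (this formula is not taken from the source), a parameter with exactly the stated properties is

    \[
    \kappa(t) = 1 - e^{-t/\tau}, \qquad \kappa(0) = 0, \qquad 0 \le \kappa(t) < 1 \ \text{for all finite } t,
    \]

where the timescale tau is hypothetical; any monotonically increasing function starting at zero and bounded above by unity would serve equally well.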

The notion of the Bang is independent of general relativity; Einstein did not predict any Bang. However, it is an experimental fact, and if general relativity were true, we would be doomed to the Big Bang theory. (This follows from the occurrence of the energy-momentum tensor in the Einstein equations.) Therefore, only now can we start to develop the Tiny Bang theory.

The problems of creation

According to the Big Bang cosmologies, the whole matter of the universe was created at one instant of time. We would like to emphasize that this could have happened. We cannot even talk about a violation of the law of energy conservation, because prior to the creation there existed no universe, no observers, and no physical laws. Everything suddenly appeared at t = 0. It is also possible - as is done in the theory of general relativity - to exclude the initial instant from the set of bona fide physical instants.

Much more interesting is the description of what the universe was like immediately after its creation. One thing is certain: after t = 0 there cannot be any violation of energy conservation either. This implies that the energy of the Earth, the Sun, and billions of galaxies was once concentrated in a very small volume. Nobody knows what this might have looked like. No one has proved that it is possible at all.

Note also that an exact counterpart of the Big Bang cannot be created and examined inside an accelerator. Indeed, an exact counterpart would have to contain at least the total energy of our universe, including that of the Sun and of the accelerator itself, so the following inequality

energy_of_accelerator  +  energy_of_sun  <  energy_of_accelerator

would have to be satisfied (for the accelerator belongs to the universe as well), and this is obviously impossible.

In the Tiny Bang theory the solution of these problems is simple. As our universe is not an isolated physical object, the law of energy conservation must be applied cautiously. In reality, our universe could have captured a lot of matter, including whole galaxies, from other universes. (The detailed mechanism of this process cannot be discussed here.) The Tiny Bang itself had a relatively small energy, which it will eventually be possible to measure exactly. Very similar events can be synthesized in our accelerators. What is more, since in superluminal physics all negative instants of time are well defined, we will be able to investigate what caused the creation of our universe.

Tips for bettors

Recently, betting on scientific questions has become popular. Physicists have been wagering good money on the positive results of experiments, especially ones carried out by themselves or meant to confirm their theories. Unfortunately, none of them has won anything yet, while the bookmakers have cleaned up.

One might ask why the bookies have better scientific intuition than physicists. Of course it must be so, because that is how the firms are able to stay in business. They recruit experts in the field to generate initial odds but, first of all, they observe life attentively and draw conclusions.

Let us consider, for example, the case of gravitational waves. In the sixties, attempts to discover them were initiated by a guy who even maintained that he had done it. However, observations repeated by a few independent research teams did not corroborate his findings. Then physicists stated that the sensitivity of their detectors was too low, but that in the next decade they would discover gravitational waves for sure. They have been writing the same thing every decade since. Some scientists have devoted their lives to a fruitless search for these waves. And all the time there are triumphant reports that are later withdrawn due to the lack of confirmation from other independent centers.

At present several large and expensive detectors of gravitational radiation from outer space are operating on Earth. I pity the young people employed there. My advice to them is as follows: quit and run away from there. I am sorry, but your efforts cannot succeed (as long as your research managers are honest scientists and nobody plays a trick on them, for Einstein's gravitational waves can be very easily simulated by using suitable sources of sound waves set sufficiently close to the detectors). The reason is simple: superluminal signals imply that there is no curved space-time, and all the more it cannot wave. Waving could possibly be done by some gravity-aether (having a kind of infinite tension), but not by space-time.

This does not mean that gravitational waves do not exist. Actually, billions of gravitons (the quantum components of gravitational radiation) are incident on Earth every millisecond. However, their properties are entirely distinct from those envisaged by Einstein's (classical) theory. The first detector to record gravitons will not be very sensitive, but it will be constructed differently from GEO 600, LIGO, LISA, VIRGO, TAMA 300, etc.

At the end of 2009 physicists were enthusiastic that the Large Hadron Collider (LHC), a powerful accelerator at the European laboratory for particle physics (CERN) near Geneva, was running again after a long shutdown caused by a magnet failure. Paddy Power, a bookmaker based in Dublin, eagerly took bets on what the LHC would discover in 2010. The firm had a book on dark matter, dark energy, miniature black holes, and one more thing. The favorite was dark matter with odds of 11/10, which indicates that experts and bettors regarded it as the most likely discovery. Unfortunately, the year 2010 ended and the LHC had discovered nothing, so Paddy Power did not pay out a dime.
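
For readers unfamiliar with fractional odds, here is a small Python sketch (mine, not the bookmaker's) of what 11/10 means numerically: a winning stake of 10 returns 11 in profit, so the break-even probability is 10/21, i.e., just under one half.

    # Implied probability of fractional odds a/b: a stake of b wins a in profit,
    # so the break-even probability is b / (a + b). Real bookmaker odds include
    # a margin, so this slightly overstates the bookmaker's actual estimate.
    def implied_probability(a: int, b: int) -> float:
        return b / (a + b)

    print(implied_probability(11, 10))  # ~0.476: a narrow favourite, not a certainty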

Cosmic darkness

The history of the search for dark matter is even longer than that of the search for gravitational waves. We cannot see it, but we are able to infer its presence through indirect methods. Our current theory of gravitation, general relativity (in this case equivalent to Newton's laws), enables us to determine the mass of a body from the motion of its satellites. Thus, already in the thirties, it was calculated that the mass of the Milky Way was far larger than the combined mass of its constituent stars and interstellar gas. Later the same was done for other galaxies. Finally, astronomers became convinced that the mass of galaxy clusters is far larger than that of their constituent galaxies. All this missing mass has recently been termed dark matter.
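
As a back-of-the-envelope Python sketch of the standard inference (illustrative textbook numbers, not figures from the source): for a body on a circular orbit, v^2/r = GM/r^2, so the mass enclosed within radius r is roughly v^2 r / G. The fact that measured rotation speeds stay flat far beyond the visible disk is what makes the implied mass exceed the luminous one.

    # Mass enclosed within the Sun's galactic orbit, inferred from its orbital speed.
    # Illustrative values only.
    G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
    v = 220e3              # orbital speed of the Sun about the Galactic centre, m/s
    r = 8.0 * 3.086e19     # galactocentric radius of the Sun (~8 kpc), m
    M_sun = 1.989e30       # solar mass, kg

    M_enclosed = v**2 * r / G
    print(M_enclosed / M_sun)   # ~9e10 solar masses within the solar orbit alone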

If it indeed exists, we must ask ourselves what form it would take. Good candidates seem to be novel elementary particles, and just their detection is one of the primary goals of the LHC. However, it so happens that I know exactly what dark matter is. And I may say that for some reason the LHC will never be able to discover it.

Other dark-matter detection experiments are housed in deep mines. This makes some sense, because the Earth is continually bombarded by cosmic rays; on the surface we would not be able to distinguish many of the events they create from the ones caused by dark matter. Occasional positive evidence may be reported in this area. Nonetheless, its authors will not even be able to determine any properties of the recorded particles. I can confirm that this is not accidental, and that these events have nothing to do with missing mass. It's just Nature playing a game of cat and mouse with physicists. They will stay in those mines until I tell them what dark matter is.

The observation that the expansion of our universe accelerates, instead of decelerating due to the mutual gravitational attraction of matter, indicates that the universe is filled with something that antigravitates. This unknown factor has been dubbed dark energy (a name coined by analogy with 'dark matter').

On the basis of general relativity, the simplest candidate for dark energy seems to be Einstein's cosmological constant. Einstein introduced it into his equations in an ad hoc way, because he thought that otherwise the universe could not be static and would be contracted by gravity. After Friedmann found his expanding solutions of the equations and Hubble discovered the cosmic expansion, Einstein dropped the cosmological constant and later referred to it as the biggest blunder of his life.

In our opinion, science should not be developed by ad hoc methods. The constant can be used in cosmology only if it is derived in a natural way from another theory. An attempt in this direction has been made in quantum field theory, where a candidate for the constant could be the vacuum energy density. However, the value obtained is about 10^120 times greater than the observationally required one. Someone has even called the discrepancy 'the worst theoretical prediction in the history of physics'. Honestly speaking, quantum field theorists have stated that the most probable value of the vacuum energy density is zero, but that if cosmologists desperately need something nonvanishing, they may take this huge number. And there is no way to decrease it.
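
The size of the mismatch can be reproduced by a rough Python estimate (a sketch assuming a naive Planck-scale cutoff; the exact exponent depends on the cutoff chosen, which is why figures from 10^120 upward are quoted):

    # Ratio of a naive Planck-scale vacuum energy density to the observed
    # dark-energy density (orders of magnitude only).
    hbar = 1.055e-34    # J s
    G    = 6.674e-11    # m^3 kg^-1 s^-2
    c    = 2.998e8      # m/s

    rho_planck   = c**7 / (hbar * G**2)   # ~5e113 J/m^3
    rho_observed = 6e-10                  # J/m^3, approximate observed value

    print(rho_planck / rho_observed)      # ~1e123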

Accurate observations have shown that six billion years ago the universe was not expanding with acceleration, i.e., the putative cosmological constant was equal to zero. It follows that the physical laws allowing us to interpret it as a property of the vacuum would then have had to differ from the current ones. It does not seem that this was possible.

In this situation, to save Einstein's theory of gravitation, the cause of the cosmic acceleration is most frequently attributed (see, e.g., an article in "Nature" of April 2009) to fluids with negative pressure. For, according to the Einstein equations, pressure - such as the gaseous pressure created by thermal molecular motion - is a source of the gravitational field. The gravitational interaction between normal matter and a fluid with sufficiently negative pressure is then repulsive. So far nobody has seen such a fluid, but physicists speculate that some novel elementary particles, perhaps discovered by the LHC, could supply the requisite negative pressure to speed up the cosmic expansion.
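
The standard statement behind 'sufficiently negative pressure is repulsive' is the acceleration equation of Friedmann cosmology (textbook general relativity, quoted here only for context):

    \[
    \frac{\ddot a}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right),
    \]

so the expansion accelerates only when p < -rho c^2/3, i.e., for a fluid whose pressure is sufficiently negative.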

I regret to say that even this unknown state of matter cannot save general relativity. For the cornerstone of the theory is Einstein's principle of equivalence, asserting that the effects of gravitation are locally equivalent to the effects of acceleration. And the existence of dark-energy particles generating gravitational repulsion is at odds with this principle. The following proof of this fact is intended mainly for specialists in general relativity.

Suppose that we have two containers, with positive and negative pressure respectively. If the containers are in free fall near the Earth's surface, then they must have opposite accelerations (because the former is attracted by the Earth and the latter is repelled). Thus in the reference frame of one of them the acceleration of the other does not vanish. Hence an observer enclosed in an elevator or rocket falling in the gravitational field of the Earth (or of another celestial object) is able to measure, by performing a local measurement, the acceleration of the cabin. In particular, the occupant can distinguish between being weightless far out in space and being in free fall in a gravitational field.

We see that gravitational repulsion is fatal to general relativity. From our point of view this does not matter, because we already know that Einstein's theory cannot be true, owing to the experiments given in Quantum Nonlocality... We have a different, simple solution to the problem of dark matter and dark energy, so the negative pressure is inessential. Therefore, in our opinion the LHC will not discover any dark-energy particles either.

Cure for black holes

According to general relativity, there are celestial objects possessing such powerful gravitational fields that once a star, planet, asteroid, rocket ship, or even light passes through their surface (called the event horizon), there is no way it can ever emerge. They have been termed black holes.
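
For orientation, in the standard (Schwarzschild) picture the surface in question lies at the gravitational radius

    \[
    r_s = \frac{2GM}{c^2} \approx 3\ \text{km} \times \frac{M}{M_\odot},
    \]

about three kilometres for each solar mass of the object.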

Now suppose that a rocket sent from the Earth has been equipped with containers of dark energy. Initially, their presence makes a small contribution to the weight, so the ship can pass through the event horizon of a black hole. After the ballast is released, the dark energy begins to dominate, and then the rocket, repelled by the black hole, is able to return to Earth. But this means, of course, that there are no event horizons and no black holes.

An object regarded by astronomers as a black hole can continuously emit dark energy (by which, inter alia, its amount in the universe steadily grows). Since the energy is dark, however, they cannot observe this. We see that dark energy (even in the form of negative pressure, so eagerly expected by many physicists) kills black holes. Therefore, I am sorry, but the LHC will not discover any miniature black holes.

Angels over the city

One day containers of dark energy really will be produced. They will be utilized for, inter alia, personal flights (angel-type wings will be used instead of bicycles) and the easy vertical takeoff and landing of aircraft. This will radically reduce the number of disasters in passenger air traffic (e.g., no fog will disturb us, and at most the descent speed will be decreased). Moreover, small quantities of conventional fuel (or solar panels) will be needed only for horizontal movement. Let us add that dark energy requires little space and is very safe, since it cannot explode, annihilate, radiate, pollute, etc.

Divine particles

Bookmakers have taken bets on one more thing connected with unknown subatomic objects. It is not very sophisticated; a spokesman for Paddy Power encouraged anyone still not sure of this thing to wager, just in case. But let us return to science.

On the basis of the Big Bang theory, the universe was created from nothing. Nevertheless, it would be hard to assume that immediately after the time 0 the properties of particles were the same as now. In particular, physicists believe that in the conditions prevailing nanoseconds after the Big Bang, particles such as the ones that form your body were massless. To explain how they obtained their masses, the theory of so-called Higgs particles has been formulated. According to it, a massless particle absorbs ('eats') a Higgs and gains mass, and in the process the latter becomes a 'ghost'.

The Higgs particle was first imagined in 1962. Since that time physicists, as in the case of gravitational waves, have constantly been claiming that it is on the verge of detectability and will be found within the next few years. There were, of course, false-positive reports, and the failure to discover any Higgs-like particle was so frustrating that a researcher wrote a book about it. He wanted to entitle it "The Goddamn Particle", but his editor shortened the title. In this way the Higgs was dubbed the God, or divine, particle. The bookies have taken this flashy name literally, and thus they are able to take bets on the existence of God.

Now let us examine how things look in the light of the Tiny Bang theory. As our universe is not an isolated object, even right after the time 0 electrons could pass into it from other, older universes. These particles were, of course, massive, and it would be very strange if they had been able to coexist with massless electrons of ours. Thus we have to assume that our electrons (protons, W and Z bosons, etc.) were always massive as well. That is why in superluminal physics the divine manifestation is not anything we need. And I am very sorry again, but the detection of a God particle by the LHC is impossible.

The question arises whether the LHC will be able to detect anything at all. The answer is still positive. For instance, I also meant to wager, but it turned out that bets on my particle were not being taken. This may be somewhat understandable, because its existence will destroy the Standard Model (a theory of elementary particles). Let us add that it belongs to a novel particle family, some members of which may be wrongly taken for a Higgs. The false-positive reports of its discovery arise precisely because some of my particles decay in the same way as the God particle. (For experts: not every electrically neutral and massive boson with baryon number and spin equal to zero is a Higgs. To see this, just find an analogous particle of spin 1.)

However, it should be pointed out that there are certain rules for the construction of huge accelerators. I doubt very much that the LHC engineers can know them. (A monkey knows how a watch works with the same probability.) The only hope is that these rules apply to accelerators much more powerful than the LHC.

 

Testament of Max Planck

This text should not be a surprise to anyone who knows the history of physics. In this field of science there have always been radical breakthroughs. And the benefits received from the new areas were much higher than the losses incurred due to failed experiments.

Since the time of Newton, physics has focused on the notion of energy. Here we propose to base the field on the concept of information. Older researchers may fear this change, but younger ones should have no problem with it. I think that future physicists will have a perfect command of physics, computer science, and - last but not least - logic. This is probably good, because in our information society they will be very valuable professionals.

Basing physics on the concept of information does not mean neglecting energy-related matters. Actually, the effect should be just the opposite. For instance, in 75 years' time there will be only one or two large power plants on every continent, working on the basis of a physical phenomenon called AGA (I cannot decipher the acronym here). AGA has been unknown so far, although some of its symptoms are felt by all (e.g., thunderstorms, hurricanes, the periodic occurrence of global warming and cooling, the emergence of life on Earth) or by physicists in the latest experiments (the excess of antielectrons from space). AGA is extremely strong, and at the same time it is gentle (for people who know it). It will not be possible to use AGA to produce new bombs, but it can solve all our problems of energy shortage, environmental pollution, etc. And in case of emergency there will be no radiation.

You may be wondering how I calculated the number 75. Well, if I had written the text five years ago (i.e., in March 2006), I would have estimated this time at 50 years. And each year lost now translates into five years of delay later.

I would like to mention that the delay has not been caused by me, but by people (I hope that one day their names will be made public) who decided not to publish my papers. The articles meet all the standards required of scientific publications, while the referees' reports were swarming with errors. Let us consider the most important of them.

In 1980 three Italian theorists, usually referred to as GRW, gave a 'general' proof of the impossibility of faster-than-light transmission of signals through the quantum-mechanical measurement process. Since that time the result has often been quoted, in precisely the above form. As the GRW paper is available on the Internet, I have analyzed it, and I have found that in fact they proved the following proposition:

No signal can be sent at any speed using two quantum detectors.

But this is almost trivial, because to transmit and receive signals the receiver must somehow differ from the transmitter. This is the case for ear and larynx, eye and flashlight, radio set and radio beacon, etc. Therefore, instead of proving obvious theorems, Nature wanted us to find a way to differentiate the roles of Alice and Bob during a quantum transmission. Precisely this has been done in Quantum Nonlocality... and other works of mine.

When Bell inequalities are tested, Alice and Bob's situation is symmetrical. Each of them has a quantum detector and performs measurements. The two subsystems depend on each other, but correlations between the results can be confirmed only statistically. That is, data from both participants of the experiment are gathered and fed by wires into a single computer, which analyzes coincidences.
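
As a minimal Python sketch of the statistical step just described (standard coincidence analysis for a CHSH-type test, not the shutter scheme of our model; the trial format is hypothetical):

    # Each trial is recorded as (a_setting, b_setting, a_result, b_result),
    # with settings in {0, 1} and results in {-1, +1}.
    from itertools import product

    def correlator(trials, a, b):
        """Average product of outcomes over the trials taken with settings (a, b)."""
        products = [x * y for (sa, sb, x, y) in trials if (sa, sb) == (a, b)]
        return sum(products) / len(products)

    def chsh(trials):
        """S = E(0,0) + E(0,1) + E(1,0) - E(1,1); |S| <= 2 for any local model."""
        E = {(a, b): correlator(trials, a, b) for a, b in product((0, 1), repeat=2)}
        return E[(0, 0)] + E[(0, 1)] + E[(1, 0)] - E[(1, 1)]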

In our model Alice and Bob's situation is quite different. Bob still has a quantum detector, which is very well suited to receiving signals, while Alice has a shutter, which is suited only to transmitting signals. Indeed, Bob can do anything with his detector, but this will not cause any changes in the state of the shutter, which is therefore an independent subsystem. However, it follows from the most fundamental laws of quantum physics that the shutter state influences the results obtained by Bob. On this basis he can infer whether the shutter was released or not. Thus Alice is able to send a faster-than-light signal to Bob.

On the other hand, in our approach Alice and Bob cannot test the Bell theorem, because its numerous versions have been proved under different assumptions. Restrictions on the range of applicability occur in all known physical theories, and therefore we should not be surprised that the GRW and Bell theorems fail, or do not make sense, for quantum experiments with one independent subsystem. Unfortunately, this remark was lost on the referee, although he must have been an international expert holding the title of Professor of Physics.

Quantum Nonlocality... contains no logical errors, is based on only one new assumption, refers to the works of other authors, and satisfies the most important condition required of scientific papers in physics: it is experimentally verifiable. The results would be valuable even if they were negative. Indeed, in that case we would obtain an excellent new confirmation of Einstein's relativity theory. (But then quantum mechanics would have to be changed radically.) Thus the results should be published as quickly as possible in a physics journal, which will allow their verification by independent research teams.

Obviously, we are sure that quantum physics will emerge unscathed from the confrontation, and it will even become more important, with major new applications. We think that Max Planck would have been pleased with this turn of events. We mention his name here because there is a mysterious link between the author of this text and the creator of quantum theory. It looks as if someone had arranged it on purpose. The details are given on another page of this website.