Constants of Nature

Copyright © 1994 by Dr. Tienzen (Jeh-Tween) Gong

Constants of nature and the initial condition determine the structure of the universe. Some constants of nature must be true constants, so that causal laws can arise and be preserved. But at least one constant of nature must be a varying constant, so that the universe can evolve. But why and how do constants of nature arise? This paper will explain it. Furthermore, this paper will introduce a new principle -- the corresponding principle -- which provides a new methodology for constructing a new and complete physics.

I: Introduction

What are constants of nature? They are linking bridges between physical realities. In general, physics laws are expressed by setting a physical variable equal to a mathematical formula through a proportionality factor, such as:

physical variable = constant x mathematical formula.

So, constants do not change the form of physics laws; they only set their strength. For example, in Newton's law of gravity, force = G x (m1 x m2 / r^2), the constant G does not alter the form of the law but fixes how strong gravity is.

There are many ways to classify the constants of nature. I will divide them into three groups.

With this classification, only group C is non-reducible and thus fundamental. Only four constants of nature belong to group C, that is, the cosmological constant (0), h (the Planck constant), c (the speed of light) and G (the gravitational constant).
Groups A and B can be derived from group C. The electric charge, for example, can be expressed as:

electric charge = square root of (h x c)
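
As a minimal illustration of why this combination carries the dimensions of electric charge (my own sketch, not part of the original argument, assuming Gaussian units, where the Coulomb energy is e^2/r so that e^2 has the dimensions of energy times length), one can track the units of h times c explicitly:

```python
# Minimal dimensional bookkeeping (illustrative sketch, Gaussian units assumed).
# Units are tracked as {unit_name: exponent} dictionaries.

h_units = {"erg": 1, "s": 1}     # Planck constant: energy x time
c_units = {"cm": 1, "s": -1}     # speed of light: length / time

def multiply(u1, u2):
    """Combine two unit dictionaries by adding exponents."""
    out = dict(u1)
    for name, power in u2.items():
        out[name] = out.get(name, 0) + power
    return {name: power for name, power in out.items() if power != 0}

print(multiply(h_units, c_units))
# {'erg': 1, 'cm': 1} -> energy x length, the Gaussian dimensions of (electric charge)^2,
# so the square root of h x c indeed has the dimensions of electric charge.
```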

The masses of the elementary particles arise from the Weinberg angle, which can be derived quite easily. I have shown this procedure in my books many times.

II: Constant of Event Horizon

The Planck constant h is the basis for the entire quantum physics. It forms a horizon for all actions. In physics, action has the dimensions of energy (E) times time (t). According to the uncertainty principle, every action (delta E x delta T) must be larger than or equal to h (the Planck constant). In fact, the uncertainty principle has two expressions.

delta E x delta T >= h
delta P x delta S >= h

Here E is energy, T is time, P is momentum, and S is distance.

These two expressions actually form a viewing window. Let's imagine that E is the left edge of the window, T the right edge, P the top and S the bottom. If we want to view more of what is behind the left edge by pushing it further left (smaller delta E), we will inevitably lose information on the right (larger delta T). If we want to know more about what is behind the top by pushing the viewing window upward (smaller delta P), we will inevitably lose the knowledge we already had at the bottom (larger delta S). In short, the uncertainty principle guarantees that every viewing window for information (energy, time, momentum, position) must be finite in size. There is absolutely no way to enlarge this viewing window to an infinite size; there is a horizon of knowledge. So the Planck constant h can be called the horizon constant.
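
To make the tradeoff concrete, here is a small numerical sketch (my own illustration, using the document's bound delta E x delta T >= h rather than the textbook hbar/2 form): shrinking the energy edge of the window forces the time edge to grow.

```python
# Planck constant in J*s; the bound below follows the text (delta_E * delta_T >= h).
h = 6.626e-34

def min_delta_t(delta_E):
    """Smallest time uncertainty compatible with a given energy uncertainty."""
    return h / delta_E

# Pushing the energy edge of the viewing window inward (smaller delta_E)
# pushes the time edge outward (larger minimum delta_T).
for dE in (1e-19, 1e-22, 1e-25):                     # joules
    print(f"delta_E = {dE:.0e} J  ->  delta_T >= {min_delta_t(dE):.2e} s")
```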

Although h (the Planck constant) arose from quantum physics, it imposes an inherent horizon on all scales, and improved techniques will not and cannot remove that limitation. The quantum world does not exist only in the atomic or subatomic region but encompasses the entire universe. Not only were the early stages of the universe (the hadron era, the lepton era and the radiation era) ruled by quantum physics; it is still the ruler of the cosmos today. The nuclear fusion which gives life to stars is a quantum phenomenon. The evaporation of black holes is caused by quantum laws. Many physicists still believe that the motion of billiard balls is not affected by quantum uncertainty, that is, that the macro-world is not ruled by quantum physics. Yet the stars, the galaxies and the superclusters are slightly bigger than billiard balls in size. If we want to measure the position of a galaxy, we must view its light as particles, so that it can register its position on the photographic plate. If we want to know how fast that galaxy is moving, we must view its light as waves, so that we can measure its red-shift. In short, the motion (position, distance, velocity, mass, etc.) of those cosmic objects which are slightly bigger than billiard balls can only be revealed through the essence of the quantum phenomenon -- the dual nature of photons.

The ability to observe the cosmos is, indeed, limited by quantum indeterminacy. Even on a pitch-dark night, there are still 600 noisy photons per square inch per minute in the atmosphere. For distant galaxies, the light arrives at Earth at a rate of 1 or 2 photons per square inch per minute. If we want to observe those large-scale objects in the sky, the noise-to-signal ratio is 600 to 1 in favor of the noise. Sure, there are still ways to filter out those noisy photons, but this example does demonstrate that the inherent horizon of knowledge is a reality both in the micro- and the macro-world, and both in practice and in principle.

III: Constant of Causality

The viewing window of a geodesic survey is also finite in size. By moving that viewing window one square at a time, we can eventually map out the entire Earth. Perhaps someone will argue that by moving the quantum window (h) we can also map out the entire universe. Can we?

The fastest way of moving that quantum window (h) is at the speed of light (c). Thus, there is a new window with a size of h times c (hc). Can this new window encompass the entire universe?

The answer is NO. There are three possible geometries for the universe -- open, flat or closed. Both open and flat universes are infinite in size; hc, being finite, can never cover them. A closed universe has a finite size. Can hc encompass a closed universe?

Again, the answer is NO. The idea that one could go right round a closed universe (having finite size) and end up where one started doesn't have any practical significance, because it can be shown that the universe would recollapse to zero size before one could get round. We would need to travel faster than light in order to end up where we started before the universe came to an end -- and that is not allowed according to the relativity theories.

The fact that h and c cannot encompass the entire universe gives rise to two issues: the event horizon and non-locality. Actually, hc forms a causality box. Everything inside this hc box can make causal contact with everything else inside it. Perhaps you have already noticed that the square root of this hc viewing window is the electric charge. Photons are generated by electric charges, and neutrinos by leptons. Photons and neutrinos provide us with the information of this universe. Without photons, we would never know of the existence of stars, galaxies, pulsars, quasars, etc. Without the primordial microwave background, we would never know how the universe got started. The photons from those galaxies provide us the information about their ages, positions, chemical make-up, recessional speeds, and apparent masses. With this information, we are able to understand the structure of the universe.

So, on the one hand, h creates a horizon (a limitation) of knowledge; on the other hand, hc provides us with knowledge. The square root of hc is the electric charge, which generates photons. Photons, in turn, set the upper limit on all causal contacts; that is, photons give rise to a causal world. In short, c (the speed of the photon) is the constant of causality.

IV: Spooky Action

The definition of a horizon is that there is something behind it. While hc preserves causality and forms an event horizon, it in fact points out that non-causality must also be a reality. At the turn of the century, realism faced an insurmountable difficulty, not from nominalism, idealism or phenomenalism, but from the rise of quantum physics. The Copenhagen Interpretation (CI) described the quantum world with the principle of complementarity, which consists of three parts.

  1. A "whole" must consist of two opposite parts.
  2. These two opposite parts must be mutually exclusive.
  3. These two opposite parts are complementary to each other.

This principle of complementarity was expressed in physics as the Heisenberg uncertainty principle. In quantum physics, the entire universe is divided into two mutually exclusive but complementary parts. These opposite parts are then paired together, such as position versus momentum and time versus energy, and according to CI only one member of each pair can be truly known with high precision by any type of consciousness, not only in practice but also in principle. In short, CI permits all sorts of spooky action at a distance to be realities.

Although Einstein conceded that the uncertainty principle is indeed real in practice, he insisted that it cannot be true in principle. In 1935, he and his colleagues denounced the Copenhagen Interpretation and defended realism with the EPR (Einstein, Podolsky, Rosen) thought experiment. In the EPR experiment, we are asked to imagine that two particles originate from a definite quantum state and then move apart without interacting with anything else until we elect to measure or observe one of them. Since the quantum rules allow us, when the two particles are initially in a definite quantum state, to calculate their total initial momentum, the EPR argument was that the individual momenta will remain correlated even after the particles separate. After the two particles have moved apart to a space-like separation, where no causal connection can be made between them by a light signal, we can, Einstein argued, measure the momentum (not the position) of one particle to arbitrary accuracy, consistent with the uncertainty principle, and this measurement will not and cannot disturb the momentum of the other particle because of the space-like separation between the two. Thus, the second particle's momentum can be calculated to arbitrary accuracy from the momentum conservation law. We can then measure the position (not the momentum) of the second particle to arbitrary accuracy, and the position of the first particle can be calculated precisely. This means that we should be able to deduce both position and momentum for a single particle to arbitrary accuracy, and so arrive again at a one-to-one correspondence between physical theory and physical reality. In short, realism triumphs, and the uncertainty principle and CI are simply baloney UNLESS a spooky action at a distance is at work.

In 1982, EPR-like experiments were performed by Alain Aspect and his colleagues Jean Dalibard and Gerard Roger. Their work confirmed that spooky action at a distance is in fact a reality.

The Aspect experiments lead to the conclusion that the universe cannot be understood as the sum of its parts, because the whole is utterly indivisible. In fact, all isolated entities can be assumed to have interacted at some point in the history of the cosmos (such as at the Big Bang). This fact is called non-locality. Thus, for a group of people (such as parents and children, or husband and wife) who have shared a large number of quantum particles in their history, telepathy is not only permitted by, but is a direct consequence of, the fact that spooky action at a distance is a reality.
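
As a minimal sketch of what such experiments test (my own illustration, not part of the original paper), the quantum-mechanical prediction for the correlation between two spin measurements on a singlet pair is E(a, b) = -cos(a - b); combining four analyzer settings in the CHSH way gives 2 x sqrt(2), which exceeds the bound of 2 that any local, non-spooky theory must obey -- and the Aspect results agree with the quantum prediction.

```python
import math

def corr(a, b):
    """Quantum prediction for the singlet-state correlation at analyzer angles a, b (radians)."""
    return -math.cos(a - b)

# Standard CHSH analyzer settings
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination: any local-realist theory must give |S| <= 2
S = corr(a, b) - corr(a, b2) + corr(a2, b) + corr(a2, b2)
print(abs(S))   # ~2.828 = 2*sqrt(2), violating the local-realist bound
```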

V: Constant of Non-Locality

Since non-causality and non-locality are realities, they must be represented by a physics law, a physical variable or a constant. General relativity is constructed in terms of light signals. On the one hand, it creates an event horizon; on the other hand, it cannot encompass the fact of non-locality. But the hallmark of Newtonian gravity is that its force is transmitted by immediate (spooky) action at a distance. So Newtonian gravity is not only still the ruler of all spaceships but could also be the entity that represents the fact of non-locality.

Furthermore, since gravity is linked to the evolution of the cosmos, the gravitational constant must be a varying constant, because the universe is indeed evolving. The idea that the gravitational constant is a varying constant was first contemplated by Paul Dirac in 1938, when he discovered the cosmic coincidences (also called the Large Number Hypothesis). The importance of his cosmic coincidences is that those numbers determine the overall structure of the cosmos, and that those coincidences link numbers from seemingly unrelated areas of nature into a unified whole. There are three unrelated numbers -- the age of the universe (cosmology), the electron timescale (the time that light requires to travel a distance equal to the classical electron radius; the quantum world), and the gravitational fine structure constant (Gf; Newtonian physics) -- but they can be linked by a simple mathematical relation: the age of the universe measured in electron timescales is equal to the reciprocal of the gravitational fine structure constant.

From this simple coincidence, it is very clear that either the electron timescale or the gravitational fine structure constant must be varying, because the age of the cosmos is constantly moving ahead. Dirac guessed that the electron timescale is a true constant but that the gravitational fine structure constant is a varying constant. His guess was correct.
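
As a rough numerical check (my own sketch, using present-day values that postdate this paper and one common definition of the gravitational fine structure constant, Gf = G x m_e x m_p / (hbar x c)), the two large numbers agree to within about an order of magnitude, which is the sense in which the coincidence holds:

```python
import math

# Rough physical values, SI units (assumed for illustration)
G      = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
hbar   = 1.055e-34      # reduced Planck constant, J s
c      = 2.998e8        # speed of light, m/s
m_e    = 9.109e-31      # electron mass, kg
m_p    = 1.673e-27      # proton mass, kg
r_e    = 2.818e-15      # classical electron radius, m
T_univ = 4.35e17        # age of the universe, s (~13.8 billion years)

t_e = r_e / c                         # electron timescale
Gf  = G * m_e * m_p / (hbar * c)      # gravitational fine structure constant (one definition)

print(f"age of universe / electron timescale ~ 10^{math.log10(T_univ / t_e):.1f}")
print(f"1 / Gf                               ~ 10^{math.log10(1.0 / Gf):.1f}")
# Both come out near 10^41, illustrating Dirac's large-number coincidence.
```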

That Gf is a varying constant is also demanded by super unification. In super unification, all four forces have equal strength under certain conditions. The strengths of those forces are determined by their corresponding fine structure constants. Thus, at least three of the four fine structure constants must be able to vary in order for all four to meet at the same point. Among those four fine structure constants, only the electromagnetic fine structure constant (Ef) is defined solely by h and c, which are true constants; that is, Ef cannot vary under any circumstance. Again, Gf must be a varying constant.

Only a varying constant can give rise to an evolving cosmos. Furthermore, if the universe contains two disjointed causality boxes, these two disjointed causality boxes must still be connected in some way, because they are still parts of the whole, even though there is no causal contact between them. The only candidate which can connect these two disjointed causality (hc) boxes is gravity; that is, the gravitational constant (Gf) is the constant of non-locality.

VI: Constant of Initial Condition

In 1916, Einstein discovered a major flaw in his General Relativity: according to his new theory, the universe must expand or contract. At the time (about 13 years before Edwin Hubble discovered that the universe is indeed expanding), Einstein believed that the universe must be static. Thus, he modified his new General Relativity by adding a new, so-called cosmological constant into his equation to balance the expanding or contracting forces. Later, he called this cosmological constant the biggest blunder of his life, because he had missed the opportunity to make the greatest scientific prediction of all time: that the universe is expanding.

Today, this biggest blunder of Einstein has become a very important constant of nature. It must be exactly equal to zero; that is, there absolutely cannot be a cosmological constant.

If this cosmological constant were not zero, we would not have four big spacetime dimensions that we could walk around in; they would be curled up into a point. But the fact is that we do have a nice universe in which we are walking around, and the cosmological constant is exactly equal to zero. The actual measurement of this cosmological constant is the best experimental determination of a zero quantity we have ever come up with. In short, the cosmological constant defines the structure of the cosmos. It is the boundary condition of the universe.

I have mentioned in my books many times that the initial condition of the universe not only cannot be blown away by the Big Bang but must become the boundary condition of the cosmos. Since the cosmological constant represents the boundary condition of the universe, it must also be the initial condition.

The detailed argument for the initial condition of the universe is based on the microwave background and the HZP (Harrison-Zel'dovich-Peebles) spectrum. Finding that the initial condition of the universe is zero (the cosmological constant) makes all kinds of philosophical sense: the universe arose from Nothingness.
