Wednesday, July 28, 2021

Why do elementary particles have such strange looking mass ratios?




Elementary fermions with four different values of electric charge are observed in nature. In units of the charge of the positron, these values are 0, 1/3, 2/3 and 1. The positron has charge 1, the up quark has charge 2/3, the anti-down quark has charge 1/3, and the anti-neutrino has charge 0. Their respective anti-particles have the same absolute value of electric charge, but of the opposite sign [electron, anti-up, down, neutrino]. We hence say that charge is quantised in these discrete units. But why these and only these, and why is it quantised in the first place?

Each of these four particles has a corresponding second generation particle, and a third generation particle. The second and third generation copies have the same electric charge as their first generation relative. Thus the counterparts of the electron are the muon and the tau-lepton, each of them having charge minus one. The down quark has as its counterparts the strange quark and the bottom quark, all having charge minus 1/3. The up quark's copies are called charm and top, and they all have charge 2/3. We see that the value of electric charge does not change across generations.

The only known difference between the three copies of a particle is their mass. The electron has a mass of 0.5 MeV. The muon, at about 105 MeV, is some 200 times heavier than the electron. The tau lepton, at 1777 MeV, is about 3500 times heavier than the electron.

The up quark, charm and top have respective masses of 2.3 MeV, 1275 MeV and 173210 MeV. The down, strange and bottom quark have respective masses of 4.7 MeV, 95 MeV and 4180 MeV.

Neutrinos are known to have a non-zero mass, but much smaller than the mass of the charged fermions, and the actual value of their masses is unknown. Let us leave them out of the discussion for now.

It is obvious that, unlike electric charge, the mass ratios appear strange and random, and show no apparent pattern. If we decide, say, to compare the various masses with respect to the up quark, and for simplicity take the square root of the ratio, we get the numbers 1, 1.4, 0.47, 6.4, 6.8, 23.5, 42.6, 274.4 and 27.8. The elegant simplicity of the quantised values of electric charge is lost. Is there a simple pattern to these mass ratios or not, and how are we to find the pattern, in case there is one? This profound question has remained unanswered for decades after all these particles were found and their masses measured experimentally.
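These square-root ratios are easy to reproduce. The snippet below uses the masses as quoted in this post (current PDG central values differ slightly):

```python
import math

# masses in MeV, as quoted in the text above
masses = {
    "up": 2.3, "down": 4.7, "electron": 0.5,
    "strange": 95.0, "muon": 105.0, "charm": 1275.0,
    "bottom": 4180.0, "top": 173210.0, "tau": 1777.0,
}

m_up = masses["up"]
for name, m in masses.items():
    # square root of the mass ratio with respect to the up quark
    print(f"{name:8s}  sqrt(m/m_up) = {math.sqrt(m / m_up):6.1f}")
```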

An answer may now have been found, and it comes from a very surprising and unexpected quarter. We started in a very different realm: addressing a foundational problem in quantum theory. Quantum mechanics is formulated on an external classical space-time but such classical elements should strictly not be a part of quantum theory. We should be able to describe quantum phenomena without referring to classical time. In one specific approach to finding such a description, it has been found that the theory must be  formulated in eight dimensions, which are labelled not by real numbers, but by eight dimensional numbers known as the octonions.

When we try to place fermions in such an octonionic space-time, we are in for a surprise. We are not allowed to assign arbitrary properties and quantum numbers to these particles. The space-time dictates that the electric charge must be quantised, precisely in the units 0, 1/3, 2/3 and 1, as observed. And the space-time also dictates that particles have anti-particles of opposite charge, and come in three generations. 

If the space-time dictates there are three generations, and if the only difference between the three generations is mass, the octonionic space-time must also determine mass ratios, just as it determines ratios of electric charge. Happily, it does. Just as there is a charge value associated with every particle, there is a mass number associated with every particle, called its Jordan eigenvalue. These Jordan mass numbers are shown in the table below (Ignore the numerical entry in the first column). The entries in the next three columns are the respective mass numbers.

These mass numbers are very simple and pretty. For the three generational copies of a particle, the middle value is its electric charge, and the other two values are symmetrically placed about the middle value, always departing from the middle by a factor of square root of 3/8. 
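One natural reading of the rule above ("symmetrically placed about the middle value") is that the three Jordan numbers for a particle of charge q are q - sqrt(3/8), q and q + sqrt(3/8). This reading is a hypothetical illustration only; the referenced preprint should be consulted for the exact construction:

```python
import math

offset = math.sqrt(3 / 8)  # ~ 0.6124

# Hypothetical reading of the rule: for each quantised charge q, the
# three generational Jordan mass numbers are q - offset, q, q + offset,
# symmetric about the electric charge in the middle.
for q in (0, 1/3, 2/3, 1):
    triple = (q - offset, q, q + offset)
    print(f"q = {q:.3f}: " + ", ".join(f"{v:+.4f}" for v in triple))
```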

The theory dictates that mass ratios will be determined by these Jordan values, and these mass ratios are shown in the table below, scaled with respect to the down quark mass which is set as one. It can be easily verified that these simple fractions reproduce the strange pattern of mass ratios observed in nature! Mass is quantised, just as electric charge is, and the ratios are quite simply given, though not as simple as the charge ratios 1/3, 2/3 and 1. 

We believe that the mystery of the strange mass ratios has finally been solved. At the heart of the resolution lies the realisation that elementary particles are described by the octonions, as long suspected by several physicists.

Also:

Mass ratios and Majorana neutrinos:
Getting the correct mass ratios from the octonion algebra requires us to assume that the neutrino is a Majorana particle [i.e. its own anti-particle]. Assuming the neutrino to be a Dirac particle gives wrong mass ratios. We hence predict that Neutrinoless Double Beta Decay does occur in nature and will indeed be observed once adequate sensitivity is achieved in experiments.

Reference: https://www.preprints.org/manuscript/202101.0474/v4

Saturday, July 24, 2021

What do the interpretational problems of quantum mechanics have to do with the failed unification programme of string theory?


******************
The context of my present post is this beautiful video chat:

Steven Weinberg and Andrew Strominger in conversation [April, 2021] YouTube video https://youtu.be/PFJ46G8BflQ
Towards the end of this very watchable video, a member of the audience put a question to both of them:
In your opinion, is there some problem in our understanding of quantum mechanics [i.e. interpretational issues, the measurement problem]? And if so, could this unsolved problem be holding up progress in theoretical high energy physics etc.?
Weinberg: Yes there is a problem. Quantum mechanics gives a great importance to observers. It pre-assumes a quantum-classical divide so as to be able to make sense of the theory [classical apparatus, measurement]. A reductionist theory must not have to depend on its own limit for us to understand the theory. Rather, working from bottom up, the theory should be able to explain the classical properties of the measuring apparatus as a consequence of the theory itself, instead of having to assume them a priori without proof. Essentially, Weinberg is dissatisfied with Bohr's Copenhagen interpretation. Then he correctly points out that Everett had the same dissatisfaction and therefore came up with what became known as the Everett interpretation. Wave functions never collapse during measurements, and the universe is forever in a state of quantum superpositions of everything. Weinberg says in the conversation that he does not agree with / like the Everett interpretation either. Hence, according to him there is something missing in our understanding of quantum theory.
Strominger: No. There is no problem. We can calculate marvellously with quantum theory. The Lamb shift has been calculated to an unprecedented accuracy. No experiment has ever disagreed with quantum mechanics. As for the interpretational issues, these are just words. It is not physics. You could side with Bohr, or you could side with Everett. It does not make a difference. You can still do your excellent calculations with quantum field theory and predict the world. In other words, Strominger is saying: Shut up and calculate.

***************
These contrasting responses by Prof. Weinberg and by Prof. Strominger are noteworthy, and have a deep connection with the current status of string theory.
Fast forward to the present. Strominger: "It does not make a difference." Interestingly, it does! And I try to say this as clearly as possible: the reason string theory has failed in its unification programme, and fails to predict the standard model despite being almost there, is that string theory adheres to the Everett interpretation of quantum mechanics. If string theory is modified a little bit so as to allow for [a dynamical implementation of] Bohr's Copenhagen interpretation and wave function collapse, it can predict the standard model, uniquely. So Bohr vs. Everett makes a huge difference - as big as the difference between success and failure. As I now try to explain.
It is well-known that string theory can be consistently formulated as a unified theory in a higher dimensional space-time [not four]. Ten space-time dimensions, to be precise (eleven, for M-theory). So far so good. However, the universe we live in is four dimensional, not ten. Why do we not see the remaining six spatial dimensions? The answer proposed in string theory is that the extra dimensions are curled up, compactified, too small to be seen - say as small as the Planck length scale. This proposed solution ruins the theory, for it turns out there are a very large number of inequivalent ways of compactifying the extra dimensions, all of which produce different particle physics theories in four dimensions. String theory loses predictive power. Which compactification to use? In fact it is not even clear whether the standard model is there among the compactifications. As a consequence, as Strominger notes, the ambitious unification programme of string theory was over in the 1980s itself, within a couple of years of the initial excitement.
There is a different mechanism, other than compactification, for recovering a four dimensional space-time from a ten dimensional space-time. It requires us to preferentially and deliberately pick Bohr over Everett, and modify quantum mechanics a little bit [while still remaining consistent with all lab tests of quantum theory] and allow for a dynamically induced rapid collapse of the wave function in macroscopic systems. This is known as the Ghirardi-Rimini-Weber (GRW) mechanism of spontaneous localisation, and would happen very naturally in string theory too, provided we remove the restriction that at the Planck scale as well, the Hamiltonian of the theory must be self-adjoint. Instead, allow for the possibility that under suitable circumstances, evolution at the Planck scale can be non-unitary. Such a possibility is certainly not ruled out by current experiments.
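The GRW numbers make the quantum/classical split quantitative. With the localisation rate originally proposed by GRW (about 10^-16 per second per nucleon, amplified linearly by the number of entangled nucleons), an isolated particle stays in superposition for cosmological times, while a macroscopic body collapses almost instantly:

```python
import math

# GRW spontaneous localisation: rate per nucleon, amplified by the
# number N of entangled nucleons. lam is GRW's original parameter choice.
lam = 1e-16  # s^-1 per nucleon

for N in (1, 1e6, 1e18, 1e23):
    rate = N * lam  # total collapse rate of the entangled system
    print(f"N = {N:8.0e}  collapse time ~ {1 / rate:8.1e} s")
```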
How does this help with the compactification problem in string theory? When we say that the universe is four dimensional, what we mean is that classical objects in the universe live and evolve in four spacetime dimensions. Nobody can claim that quantum systems live in a four dimensional space-time!! A quantum system can well be thought of as living in ten space-time dimensions [as string theory does] even in today's universe, provided the support of its wave-function is non-vanishing only over microscopic distances. Here, microscopic does not mean Planck length. Microscopic can be as large as a micron, roughly where the quantum-to-classical transition sets in, and it extends all the way down through the Angstrom scale, into the world of atoms, which are of course quantum.
Consider then a string theory type quantum field theoretic system living in a ten dimensional spacetime, except that the dynamics is now given by the GRW-modified quantum theory, and the Hamiltonian possesses an anti-self-adjoint part. When sufficiently many degrees of freedom living in 10D get entangled, the GRW mechanism of spontaneous localisation sets in, and the entangled system becomes classical. And now comes the key point: in becoming classical, the entangled system descends from ten to four spacetime dimensions. The support of its wave-function over the extra six spatial dimensions becomes vanishingly small, smaller than the Planck length - this has actually been proved - while its extent in the 4D spacetime remains large. This way we have achieved effective dynamical compactification, or, so to say, compactification without compactification. Quantum systems, including those in today's universe, continue to live in ten dimensions. The forces that curve the extra six spatial dimensions are precisely the internal symmetries of the standard model.
We have developed a unification theory in ten spacetime dimensions, very similar to string theory. Except that the dynamics is modified quantum dynamics. The theory has a very promising potential to unify gravity and the standard model, and to predict the values of the free parameters of the standard model (work in progress).
So dear Prof. Strominger 🙂, it matters: Bohr or Everett. These are not mere words; foundational questions of quantum mechanics are important. And now they are important in your own backyard 🙂 Prof. Weinberg is absolutely right on this count; sadly he is no longer with us to witness the unfolding of this story.
We can hence have a failed string theory and an unmodified quantum mechanics. Or we can have a successful string theory and a modified quantum mechanics. It confounds me that string theorists, extremely smart physicists though they are, do not get this. Why do they not consider that the problem is not with strings, but with quantum theory? I sincerely hope they change their mind.


Wednesday, June 30, 2021

On how not to emulate Einstein only *partially*


Einstein’s quest for developing a mathematically beautiful and physically correct field theory, out of pure thought and nothing else, was successful in the discovery of the general theory of relativity. However, his pure thought based quest failed in the attempt to unify gravitation with electromagnetism. But this failure of Einstein has not prevented us from emulating him – searching for a theory of quantum gravity, and a unified theory of known interactions, based on pure thought, without there being any experimental evidence to support quantization of gravity,  or to support its unification with the other forces. We have tried this for more than half a century, but like Einstein, we have also failed. At least we have not succeeded thus far. And yet, on physical grounds we know that there must be a quantum gravity, as well as unification. 

Could the reason for our failure be that we have been emulating Einstein only `partially’?

In hindsight, we know good reasons why general relativity was successfully discovered. Maxwell’s electrodynamics was relativistic to start with – in fact it propelled the discovery of special relativity, because Galilean-invariant Newtonian mechanics was inconsistent with electrodynamics. Once special relativity was discovered, Newtonian gravitation had to be made relativistic too; hence general relativity was inevitable. The same cannot be said about unifying electrodynamics and gravitation as a geometric theory, because the world is not classical. Enter quantum theory. Electrodynamics must be quantized, for it to agree with experiments. There goes Einstein-style unification. Attempts at unification must take quantum theory into account. However, Einstein was not satisfied with quantum theory, and believed it to be an approximation to a more general theory.

In trying to pursue an Einstein-style, thought-based unification program, while at the same time ignoring Einstein’s concerns about quantum theory, we are emulating him selectively – this could be risky.

Einstein objected to the spooky action at a distance, through the EPR argument on quantum non-locality. He was not saying that quantum theory allows superluminal signaling. Rather, he was saying that there was a quantum influence outside the light cone, which is not causal. And this meant  that either the quantum mechanical description of reality is incomplete, or that special relativity and its related description of space-time structure would have to be modified so as to make it compatible with quantum theory. Since Bell’s theorem rules out local hidden variable theories, and since quantum non-locality has been confirmed by experiments, it is indeed special relativity which needs a rethink, in the quantum context.

Einstein also objected to the occurrence of probabilities in a deterministic mechanical theory: God does not play dice, he famously said. 

Can we then, in our quest for a quantum theory of unification, in the Einstein style of pure thought, also  address his concerns about quantum theory? Why do we pick just one half of Einstein? Maybe emulating him all the way will pay dividends? 

If we decide to emulate the full Einstein, how shall we do it? In true Einstein style, we must look closely at the quantisation procedure. The world of classical dynamics works perfectly, almost, in the macroscopic domain: classical bodies and fields on a classical space-time. If we are to quantise, we must quantise everything in one go: matter, gauge fields, AND space-time degrees of freedom [special and general relativity, not just the latter]. Quantising only matter and gauge fields, and leaving spacetime classical, while very successful, is at the heart of the spooky action at a distance that bothered Einstein. If we quantise everything, we will get a pre-quantum, pre-spacetime theory. This is what removes the incompleteness of quantum mechanics, because it makes pre-spacetime compatible with pre-quantum theory.

Through this generalized quantisation, we have gone pre-. How do we recover the classical world of matter and space-time from the pre-theory? Spontaneous localization is the answer. Classical macroscopic bodies and classical space-time emerge *simultaneously*. We say that space-time arises from the collapse of the wave-function.

Those matter particles and gauge fields which do not undergo spontaneous localization must be described by the pre-quantum pre-spacetime theory. This is the world of elementary particles undergoing standard model interactions. Even if these interactions are at low energies, the pre-theory must be used, if we are to avoid the spooky action and the probabilities of quantum theory. The pre-theory has an IR sector. However, we can to an excellent approximation describe quantum systems by using not the pre-theory, but quantum field theory on a classical space-time background, as we conventionally do. The approximation consists of dropping the very tiny quantum correction to space-time, caused by quantum systems, and assuming  spacetime to be classical. This is our beloved quantum theory, the one Einstein correctly calls incomplete. It has spooky action at a distance. The pre-theory is also deterministic [though non-unitary]. It has no probabilities – these arise only in the approximate description.

Moreover, when we examine the pre-quantum pre-spacetime theory, we find evidence for the standard model symmetries. Maybe the pre-theory can explain things about the standard model which we do not otherwise understand.

Maybe it pays to emulate Einstein fully.


Wednesday, June 9, 2021

What is Trace Dynamics?

 Trace dynamics is quantisation, without the Heisenberg algebra.

1. Quantisation Step 1 is to raise classical degrees of freedom, the q and p, to the status of operators / matrices. A very reasonable thing to do.
2. Quantisation Step 2 is very unreasonable! Impose the Heisenberg algebra [q, p] = i \hbar. Its only claim to fame is that the theory it gives rise to is extremely successful.
In classical dynamics, the initial values of q and p are independently prescribed. There is NO relation between the initial q and p. Once prescribed initially, their evolution is determined by the dynamics. Whereas in quantum mechanics, a theory supposedly more general than classical mechanics, the initial values of the operators q and p must also obey the constraint [q, p] = i \hbar. This is highly restrictive!
3. It would be more reasonable if there were to be a dynamics based only on Quantisation Step 1. And then Step 2 emerges from this underlying dynamics in some approximation. This is precisely what Trace Dynamics is. Only step 1 is applied to classical mechanics. q and p are matrices, and the Lagrangian is the trace of a matrix polynomial made from q and its velocity. The matrix valued equations of motion follow from variation of the Lagrangian. They describe dynamics.
4. This matrix valued dynamics, i.e. trace dynamics, is more general than quantum field theory, and assumed to hold at the Planck scale. The Heisenberg algebra is shown to emerge at lower energies, after coarse-graining the trace dynamics over length scales much larger than Planck length scale. Thus, quantum theory is midway between trace dynamics and classical dynamics.
5. The moral of the story is that quantum field theory does not hold at the Planck scale. Trace dynamics does. QFT is emergent.
6. The other assumption one makes at the Planck scale is to replace the 4D classical spacetime manifold by an 8D octonionic spacetime manifold, so as to obtain a canonical definition of spin. This in turn allows for a Kaluza-Klein type unification of gravity and the standard model. Also, an 8D octonionic spacetime is equivalent to a 10D Minkowski space-time. It is very rewarding to work with the 8D octonionic description, rather than the 10D Minkowski one - the symmetries are manifest much more easily.
7. Trace dynamics plus octonionic spacetime together give rise to a highly promising avenue for constructing a theory of quantum gravity, and of unification. 4D classical spacetime obeying GR emerges as an approximation at lower energies, alongside the emergent quantum theory.
8. How is this different from string theory? In many ways it IS like string theory, but *without* the Heisenberg algebra! The gains coming from dropping [q,p]=i\hbar at the Planck scale are enormous. One now has a non-perturbative description of space-time at the Planck scale.
The symmetry principle behind the unification is very beautiful: physical laws are invariant under algebra automorphisms of the octonions. This unifies the internal gauge transformations of the standard model with the 4D spacetime diffeomorphisms of general relativity. The automorphism group of the octonions, the Lie group G2, which is the smallest of the five exceptional Lie groups, contains within itself the symmetries SU(3)xSU(2)xU(1) of the standard model, along with the Lorentz symmetry. The free parameters of the standard model are determined by the characteristic equation of the exceptional Jordan algebra J_3(O), whose automorphism group F4 is the exceptional Lie group after G2.
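The contrast between Step 1 and Step 2 in the list above can be made concrete with matrices. For generic matrices q and p, the commutator [q, p] is automatically traceless but otherwise unconstrained; the Heisenberg relation [q, p] = i\hbar is a genuinely extra condition, and even the textbook oscillator realisation of it holds exactly only in infinite dimensions (any finite truncation fails in the corner). A minimal sketch, in units with \hbar = 1:

```python
import numpy as np

N = 6
rng = np.random.default_rng(0)

# Step 1 alone: generic matrix degrees of freedom. Tr[q, p] = 0 holds
# automatically, but [q, p] is nothing like i * identity.
q_gen = rng.standard_normal((N, N))
p_gen = rng.standard_normal((N, N))
comm_gen = q_gen @ p_gen - p_gen @ q_gen
print("trace of generic commutator:", np.trace(comm_gen))  # ~ 0

# Step 2: the harmonic-oscillator realisation of [q, p] = i,
# truncated to N levels.
a = np.diag(np.sqrt(np.arange(1.0, N)), k=1)   # lowering operator
q_op = (a + a.T) / np.sqrt(2)
p_op = (a - a.T) / (1j * np.sqrt(2))
comm = q_op @ p_op - p_op @ q_op
print(np.round(comm.diagonal(), 6))
# The first N-1 diagonal entries are i, but the last is -(N-1)*i:
# the Heisenberg algebra cannot be realised by finite matrices.
```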

Friday, June 4, 2021

Why a quantum theory of gravity is needed at all energy scales, not just at the Planck energy scale


and

How that leads us to partially redefine what is meant by Planck scale: Replace Energy by Action.


We have argued earlier that there must exist a formulation of quantum theory which does not refer to classical time. Such a formulation must in principle exist at all energy scales, not just at the Planck energy scale. For instance, in today's universe, if all classical objects were to be separated out into elementary particles, there would be no classical space-time and we would need such a formulation. Even though the universe today is a low energy universe, not a Planck energy universe.
Such a formulation is inevitably also a quantum theory of gravity. Arrived at, not by quantising gravity, but by removing classical gravity from quantum theory. We can also call such a formulation pure quantum theory, in which there are no classical elements: classical space-time has been removed from quantum theory. We also call it a pre-quantum, pre-spacetime theory.
What is meant by Planck scale, in this pre-theory?
Conventionally, a phenomenon is called Planck scale if: the time scale T of interest is of the order of the Planck time TP; and/or the length scale L of interest is of the order of the Planck length LP; and/or the energy scale E of interest is of the order of the Planck energy EP. According to this definition of Planck scale, a Planck scale phenomenon is quantum gravitational in nature.
Since the pre-theory is quantum gravitational, but not necessarily at the Planck energy scale, we must partially revise the above criterion, when going to the pre-theory: replace the criterion on energy E by a criterion on something else. This something else being the action of the system!
In the pre-theory, a phenomenon is called Planck scale if: the time scale T of interest is of the order of the Planck time TP; and/or the length scale L of interest is of the order of the Planck length LP; and/or the action S of interest is of the order of the Planck constant \hbar. According to this definition of Planck scale, a Planck scale phenomenon is quantum gravitational in nature.
Why does this latter criterion make sense? If every degree of freedom has an associated action of order \hbar, together the many degrees of freedom cannot give rise to a classical spacetime. Hence, even if the time scale T of interest and length scale L of interest are NOT Planck scale, the system is quantum gravitational in nature. The associated energy scale \hbar / T for each degree of freedom is much smaller than Planck scale energy EP. Hence in the pre-theory the criterion for a system to be quantum gravitational is DIFFERENT from conventional approaches to quantum gravity. And this makes all the difference to the formulation and interpretation of the theory. e.g. the low energy fine structure constant 1/137 is a Planck scale phenomenon [according to the new definition] because the square of the electric charge is order unity in the units \hbar c = \hbar LP / TP.
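The statement about the fine structure constant is just the observation that the square of the electric charge is order unity when \hbar c is the yardstick. With CODATA values in SI units:

```python
import math

# CODATA values, SI units
e    = 1.602176634e-19   # elementary charge, C
hbar = 1.054571817e-34   # reduced Planck constant, J s
c    = 2.99792458e8      # speed of light, m / s
eps0 = 8.8541878128e-12  # vacuum permittivity, F / m

# fine structure constant: e^2 measured in units of hbar*c
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(f"alpha = {alpha:.10f},  1/alpha = {1 / alpha:.3f}")
```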
In our pre-theory, there are three, and only three, fundamental constants: Planck length LP, Planck time TP and Planck action \hbar. Every other parameter, such as electric charge, Newton's gravitational constant, standard model coupling constants, and masses of elementary particles, are defined and derived in terms of these three constants: \hbar, LP and TP.
In the pre-theory the universe is an 8D octonionic universe, as shown in the attached figure: the octonion. The origin e_0 = 1 stands in for the real part of the octonion [coordinate time] and the other seven vertices stand in for the seven imaginary directions. A degree of freedom [i.e. a `particle' or an atom of space-time-matter (STM)] is described by a matrix q which resides on the octonionic space: q has eight coordinate components q_i, where each q_i is a matrix. We have replaced a four-vector in Minkowski space-time by an eight-matrix in octonionic space: this describes the particle / STM atom. The STM atom evolves in Connes time, this time being over and above the eight octonionic coordinates. Its action is that of a free particle: the Connes-time integral of the kinetic energy, the latter being the square of the velocity q-dot, where the dot denotes a derivative with respect to Connes time. Eight octonionic coordinates are equivalent to ten Minkowski coordinates, because SL(2,O) ~ Spin(9,1).
The symmetries of this space are the symmetries of the (complexified) octonionic algebra: they contain within them the symmetries of the standard model, including the Lorentz symmetry.
The classical 4D Minkowski universe is one of the three planes (quaternions) intersecting at the origin e_0 = 1. Incidentally, the three lines originating from e_0 represent complex numbers. The four imaginary directions not connected to the origin represent directions along which the standard model forces lie (internal symmetries). Classical systems live on the 4D quaternionic plane. Quantum systems (irrespective of whether they are at the Planck energy scale) live on the entire 8D octonion. Their dynamics is the sought for quantum theory without classical time. This dynamics is oblivious to what is happening on the 4D classical plane. QFT as we know it is this pre-theory projected to the 4D Minkowski space-time. The present universe has arisen as a result of a symmetry breaking in the 8D octonionic universe: the electroweak symmetry breaking, which in this theory is actually the color-electro -- weak-Lorentz symmetry breaking. Classical systems condense on to the 4D Minkowski plane as a result of spontaneous localisation, which precipitates the electroweak symmetry breaking in the first place. The fact that the weak interaction is part of weak-Lorentz should help explain why the weak interaction violates parity, whereas electro-color does not. Hopefully the theory will also shed some light on the strong-CP problem.
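For readers who want to play with the octonion algebra itself, the standard Cayley-Dickson construction (doubling the reals three times: reals to complex numbers to quaternions to octonions) takes only a few lines. This is textbook material, not specific to this post; it verifies that the quaternionic level is associative while the octonionic level is not:

```python
def zero(level):
    return 0.0 if level == 0 else (zero(level - 1), zero(level - 1))

def unit(i, level):
    """Basis element e_i of the 2**level-dimensional Cayley-Dickson algebra."""
    if level == 0:
        return 1.0
    half = 2 ** (level - 1)
    if i < half:
        return (unit(i, level - 1), zero(level - 1))
    return (zero(level - 1), unit(i - half, level - 1))

def add(x, y):
    return x + y if isinstance(x, float) else (add(x[0], y[0]), add(x[1], y[1]))

def neg(x):
    return -x if isinstance(x, float) else (neg(x[0]), neg(x[1]))

def conj(x):
    return x if isinstance(x, float) else (conj(x[0]), neg(x[1]))

def mul(x, y):
    # Cayley-Dickson product: (a, b)(c, d) = (ac - d*b, da + bc*)
    if isinstance(x, float):
        return x * y
    (a, b), (c, d) = x, y
    return (add(mul(a, c), neg(mul(conj(d), b))),
            add(mul(d, a), mul(b, conj(c))))

# octonions: level 3, basis e_0 ... e_7
e = [unit(i, 3) for i in range(8)]

# every imaginary unit squares to -1
assert all(mul(e[i], e[i]) == neg(e[0]) for i in range(1, 8))

# quaternions (level 2) are associative ...
h = [unit(i, 2) for i in range(4)]
assert all(mul(mul(h[i], h[j]), h[k]) == mul(h[i], mul(h[j], h[k]))
           for i in range(4) for j in range(4) for k in range(4))

# ... but the octonions are not
assert any(mul(mul(e[i], e[j]), e[k]) != mul(e[i], mul(e[j], e[k]))
           for i in range(8) for j in range(8) for k in range(8))
print("quaternions associative, octonions non-associative")
```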
[Figure: the octonion, drawn as the origin e_0 = 1 together with the seven imaginary directions e_1, ..., e_7]

Saturday, May 29, 2021

Why there must exist a formulation of quantum theory which does not refer to classical time: Towards quantum gravity and unification

May 29, 2021


Why there must exist a formulation of quantum theory which does not refer to classical time.
and
Why such a formulation must exist at all energy scales, not just at the Planck energy scale.
Classical time, on which quantum systems depend for a description of their evolution, is part of a classical space-time. Such a space-time - the manifold as well as the metric that overlies it - is produced by macroscopic bodies. These macroscopic bodies are a limiting case of quantum systems. In principle one can imagine a universe in which there are no macroscopic bodies, but only microscopic quantum systems. And this need not be just at the Planck energy scale.
As a thought experiment, consider an electron in a double slit interference experiment, having crossed the slits, and not yet reached the screen. It is in a superposed state, as if it has passed through both the slits. We want to know,
non-perturbatively, what is the spacetime geometry produced by the electron? Furthermore, we imagine that every macroscopic object in the universe is suddenly separated into its quantum, microscopic, elementary particle units. We have hence lost classical space-time! And yet we must be able to describe what gravitational effect the electron in the superposed state is producing. This is the sought for quantum theory without classical time! And the quantum system is at low non-Planckian energies, and is even non-relativistic.
This is the sought for formulation we have developed, assuming only three fundamental constants a priori: Planck length L_P, Planck time t_P, and Planck's constant \hbar. Every other dimensionful constant, e.g. electric charge, and particle masses, is expressed in terms of these three. This new theory is a pre-quantum, pre-spacetime theory, needed even at low energies.
A system will be said to be a Planck scale system if any dimensionful quantity describing the system and made from these three constants is of order unity. Thus if time scales of interest to the system are of order t_P = 10^-43 s, the system is Planckian. If length scales of interest are of order L_P = 10^-33 cm, the system is Planckian. If speeds of interest are of the order L_P/t_P = c = 3x10^10 cm/s, then the system is Planckian. If the energy of the system is of the order \hbar / t_P = 10^19 GeV, the system is Planckian. If the action of the system is of the order \hbar, the system is Planckian. If the charge-squared is of the order \hbar c, the system is Planckian. Thus in our concepts, the value 1/137 for the fine structure constant, being of order unity in the units \hbar c, is Planckian. This explains why this pre-quantum, pre-spacetime theory knows the low energy fine structure constant.
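The Planck-scale yardsticks quoted above follow from just G, \hbar and c (SI values below; the post's cgs figures correspond to L_P ~ 1.6e-35 m and t_P ~ 5.4e-44 s):

```python
import math

G    = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.054571817e-34   # reduced Planck constant, J s
c    = 2.99792458e8      # speed of light, m / s

l_p = math.sqrt(hbar * G / c**3)   # Planck length ~ 1.6e-35 m
t_p = math.sqrt(hbar * G / c**5)   # Planck time   ~ 5.4e-44 s
e_p = math.sqrt(hbar * c**5 / G)   # Planck energy ~ 2e9 J ~ 1.2e19 GeV

print(f"L_P = {l_p:.3e} m,  t_P = {t_p:.3e} s,  L_P/t_P = {l_p / t_p:.3e} m/s")
print(f"E_P = {e_p / 1.602176634e-10:.2e} GeV")   # convert J -> GeV
```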
A quantum system on a classical space-time background is hugely non-Planckian, because the classical space-time is being produced by macroscopic bodies, each of which has an action much larger than \hbar. The quantum system treated in isolation is Planckian, but that is, strictly speaking, a very approximate description. The spacetime background cannot be ignored - only when the background is removed from the description is the system Planckian. This is the pre-quantum, pre-spacetime theory.
It is generally assumed that the development of quantum mechanics, started by Planck in 1900, was completed in the 1920s, followed by its generalisation to relativistic quantum field theory. This assumption, that the development of quantum mechanics is complete, is not correct: quantisation is not complete until the last of the classical elements - this being classical space-time - has been removed from its formulation.
The pre-quantum, pre-spacetime theory achieves that, giving also the anticipated theory of quantum gravity. What was not anticipated was that removing classical space-time from quantum theory would also lead to unification of gravity with the standard model, and yield an understanding of where the standard model parameters come from. It is clear that the sought-after theory is not just a high energy BSM theory. It is needed even at currently accessible energies, so as to give a truly quantum formulation of quantum field theory: remove classical time from quantum theory, irrespective of the energy scale. Surprisingly, in doing so, we gain answers to unsolved aspects of the standard model and of gravitation.
The process of quantisation works very successfully for the non-gravitational interactions, because they are not concerned with space-time geometry. However, it is not correct to apply this quantisation process to spacetime geometry, because the rules of quantum theory have been written by assuming a priori that classical time exists. How then can we apply these quantisation rules to classical time itself? Doing so leads to the notorious problem of time in quantum gravity: time is lost, understandably.
We do not quantise gravity. We remove classical space-time / gravity from quantum [field] theory. Space-time and gravity emerge as approximations from the pre-theory, concurrent with the emergence of classical macroscopic bodies. In this emergent universe, those systems which have not become macroscopic are described by the beloved quantum theory we know, namely quantum theory on a classical spacetime background. This is an approximation to the pre-theory: in this approximation, the contribution of the said quantum system to the background spacetime is [justifiably] neglected.

Saturday, September 26, 2020

Towards unification of the four fundamental forces: The Aikyon Theory

SCEST21: Schrodinger's Cat, and Einstein's Space-time, in the 21st Century

A blogspot for discussing the connection between quantum foundations and quantum gravity

Managed by: Tejinder Pal Singh, Physicist, Tata Institute of Fundamental Research, Mumbai

If you are a professional researcher / student researching on these topics, and would like to post an article here with you as author, you are welcome to do so. Please e-mail your write-up to tpsingh@tifr.res.in and it will be uploaded here.


Keywords: Quantum foundations; Quantum gravity; Schrodinger's cat; Spontaneous collapse theory; 
Trace dynamics; Non-commutative geometry; Spontaneous quantum gravity; Classical general relativity; Black holes; Gyromagnetic ratio





https://arxiv.org/abs/2009.05574

https://www.youtube.com/watch?v=uxdvergYNrg&ab_channel=TejinderSingh





The Aikyon Theory


[The word Aikyon derives from `Aikya' in Sanskrit, which means `oneness': no distinction is made between space-time and matter.]


At the Planck scale, there is no distinction between space-time symmetry and internal symmetry. Physical space is an eight dimensional non-commutative octonionic space. One can picture it as a 2-D complex plane, where the real axis represents the 4-D to-be-spacetime, and the imaginary axis represents the 4-D to-be-internal symmetries. The aikyon is an elementary particle, say an electron, *along with* the fields it produces. We do not make a distinction between the particle and the fields it produces. This is evident from the form of the action for an aikyon, shown below: variables with subscript B stand for the four known forces, and those with subscript F for any of the 24 known fermions of the three generations of the standard model. The Lagrangian is unchanged if the B and F variables are interchanged. This is supersymmetry. And since the B-variables include both gravity and gauge fields, there is a gauge-gravity duality.


The aikyon evolves in this 8-D space in Connes time. The aikyon is a 2-D object, as if a membrane [a 2-brane]. Motion along the real axis is caused by gravity, motion along the imaginary axis by the electro-colour force, and motion from the real to the imaginary axis by the weak force. Or we can simply say that the aikyon moves in the 8-D space under the influence of the unified force, given by the B-variable in the action.


There is one such action term for every aikyon in this space. Different aikyons interact by `colliding' with each other. The coordinates of this 8-D space are the eight components of an octonion. Algebra automorphisms transform one coordinate system into another. These are the analogues of the general coordinate transformations of general relativity and of the internal gauge symmetries of gauge theories, and hence unify those concepts. The theory is invariant under 8-D algebra automorphisms. And because the laws of motion are those of trace dynamics, this is already a quantum theory.
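The non-commutativity of the octonionic coordinates can be made concrete. The following minimal sketch (an illustration only, not code from the paper) builds the octonions by the standard Cayley-Dickson construction, packing eight real coefficients into nested pairs and multiplying with the recursive rule (a, b)(c, d) = (ac - d*b, da + bc*), where * denotes conjugation. Multiplying basis units then exhibits the non-commutativity, and even non-associativity, of the algebra:

```python
def conj(x):
    """Conjugate: (a, b)* = (a*, -b); a real number is its own conjugate."""
    if isinstance(x, tuple):
        a, b = x
        return (conj(a), neg(b))
    return x

def neg(x):
    if isinstance(x, tuple):
        return (neg(x[0]), neg(x[1]))
    return -x

def add(x, y):
    if isinstance(x, tuple):
        return (add(x[0], y[0]), add(x[1], y[1]))
    return x + y

def mul(x, y):
    """Cayley-Dickson product: (a, b)(c, d) = (ac - d*b, da + bc*)."""
    if isinstance(x, tuple):
        a, b = x
        c, d = y
        return (add(mul(a, c), neg(mul(conj(d), b))),
                add(mul(d, a), mul(b, conj(c))))
    return x * y

def octonion(coeffs):
    """Pack a list of 8 real coefficients into nested pairs."""
    if len(coeffs) == 1:
        return coeffs[0]
    h = len(coeffs) // 2
    return (octonion(coeffs[:h]), octonion(coeffs[h:]))

def flatten(x):
    """Unpack nested pairs back into the 8 real components."""
    if isinstance(x, tuple):
        return flatten(x[0]) + flatten(x[1])
    return [x]

# three of the seven imaginary units
e1 = octonion([0, 1, 0, 0, 0, 0, 0, 0])
e2 = octonion([0, 0, 1, 0, 0, 0, 0, 0])
e4 = octonion([0, 0, 0, 0, 1, 0, 0, 0])

print(flatten(mul(e1, e2)))           # [0, 0, 0, 1, 0, 0, 0, 0]  -> e1 e2 = +e3
print(flatten(mul(e2, e1)))           # [0, 0, 0, -1, 0, 0, 0, 0] -> e2 e1 = -e3: non-commutative
print(flatten(mul(mul(e1, e2), e4)))  # (e1 e2) e4 = +e7
print(flatten(mul(e1, mul(e2, e4))))  # e1 (e2 e4) = -e7: non-associative
```

The same construction, applied one level down, reproduces the quaternions, and two levels down the complex numbers; non-commutativity first appears at the quaternion level and non-associativity only at the octonion level.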