The Grand Unification

The first universal law was developed in the late 1600s. Newton's law of gravitation, published in 1687, states that every object in the universe attracts every other object with a force proportional to the product of their masses and inversely proportional to the square of the distance between them. This made a great impression upon the intellectuals of the era, as it showed for the first time that a feature of the entire universe could be accurately modelled mathematically. In the 1900s the theory of gravity was expanded to an even greater universal scale when Einstein established his theory of General Relativity. Since then, other theories have been developed for the fundamental forces of nature. These include the theories of electricity and magnetism, formulated over the 18th and 19th centuries and unified in the 1860s by James Clerk Maxwell.

Source: Simulated Large Hadron Collider CMS particle detector data depicting a Higgs boson, produced by colliding protons, decaying into hadrons and electrons.
A few decades ago, electromagnetism was discovered to be unified with the weak nuclear force, but only at energy levels so high that they can be reached solely in particle accelerators. It is believed that the strong nuclear force can also be united with the electroweak theory, provided experiments on Earth can reach the extreme energy levels necessary. This combination of the strong nuclear, weak nuclear and electromagnetic forces is known as the grand unification. Some theories of everything go even further and bring gravity in as well. All these theories and their subsequent unifications suggest that there may be a model that intricately describes and explains the whole universe with only a few mathematical laws((Stephen Hawking & Leonard Mlodinow, The Grand Design, (Bantam Press pub. 2010), page 111. All online resources last accessed: 14/01/2017)). Finding a complete form of this model has been, and still is, the driving force for many physicists, theoretical and experimental alike; the huge infrastructure and technology present at the CERN laboratory near Geneva is a testament to the significance of achieving this goal.
From historic experiments and their evidence, it is reasonable to conclude that future discoveries will bring greater unification and may ultimately lead to a theory of everything. How did the above scientific milestones develop? Will the theories of modern physics eventually lead to a grand unification and the even greater theory of everything?((CERN, Unified Forces))

Electricity and Magnetism: One and the same
Since Greek times there has been an awareness of two main materials that exhibit electrical or magnetic properties. Amber, a fossilised tree resin, was known for its ability to attract an assortment of light objects, such as fur or straw, placed close to it. The other material was lodestone, a naturally magnetised piece of iron ore (also called magnetite), well known for its power to attract iron. Beyond this, there was no significant advancement in the study of these materials and their effects until the middle of the 1500s.
A number of natural philosophers (the equivalent of modern-day scientists) started to document the interactions of the materials. The first ideas of conductors, insulators and charge were born from this period of experimentation. However, electricity and magnetism were still two very different fields of study and there was no indication of them being linked((Graham Hall, Maxwell's electromagnetic theory and special relativity, (The Royal Society, published 28 May 2008), Section 2. Early days)).
Our modern theories of electromagnetism were developed from the late 1700s to the late 1800s. It wasn't until the end of the 1700s, however, that the study of electricity and magnetism entered a new age, thanks to the invention of devices that could produce continuous electrical currents. Hans Christian Ørsted, a professor in Copenhagen, was the first to realise there was a connection between electricity and magnetism, while preparing to give a university lecture. He noticed that a nearby compass needle was deflected whenever current flowed from a battery he was using. Through this he showed that if a compass needle was placed next to and aligned parallel to a wire, then when an electric current ran through the wire, the compass would align itself at a right angle to that wire. Evidently, an electric current flowing through a wire induces a magnetic field at right angles to the direction of the current. An electrical current is in most cases a flow of electrons, which means that moving electrical charge produces the effect of magnetism((John C. Taylor, Hidden Unity in Nature's Laws, (Cambridge Press pub. 2001), page 80)). After further experiments, a clear relationship emerged: the strength of the magnetic field decreases inversely with the distance from the wire. Hence, doubling the distance halves the field strength, and so forth. From the work of professors like Ørsted, people realised that moving electrical charges could induce magnetism. Some also wondered about the opposite: whether a moving magnet could cause an electrical current. This group included the British scientist Michael Faraday, and in 1831 he discovered, and subsequently demonstrated, that a changing magnetic field does indeed produce a flow of electric current in conductors.
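The inverse relationship between field strength and distance can be checked numerically. A minimal sketch, assuming the standard textbook formula B = μ₀I/(2πr) for a long straight wire; the current and distances chosen here are purely illustrative:

```python
import math

MU_0 = 4 * math.pi * 1e-7  # permeability of free space, T·m/A


def field_strength(current_amps: float, distance_m: float) -> float:
    """Magnetic field magnitude a given distance from a long straight wire."""
    return MU_0 * current_amps / (2 * math.pi * distance_m)


# Doubling the distance halves the field, the relationship found experimentally.
b_near = field_strength(1.0, 0.01)  # 1 A current, measured 1 cm from the wire
b_far = field_strength(1.0, 0.02)   # same current, measured 2 cm from the wire
print(b_far / b_near)  # → 0.5
```

Note this 1/r fall-off is distinct from the 1/r² law of gravity: the field of a line of current spreads over a cylinder rather than a sphere.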
A magnet moving through a coil of wire causes the magnet's field lines to be cut by the wire; over time this produces an electric field, which in turn drives a current in the wire. What is most impressive about Faraday is that he didn't have a formal education in science. He grew up in a poor family, taught himself science from the books in the shop where he worked, and later became a laboratory assistant for the chemist Sir Humphry Davy. As such, Faraday lacked any grounding in advanced mathematics and continued to struggle with it later in life. This should have made it difficult for him to develop a theoretical picture of the electromagnetic phenomena he observed during his experiments. Remarkably, he did so regardless, and this is where the ideas of force fields and field lines stem from. Faraday imagined thin tubes occupying the space between magnets and electrical charges that did the pushing and pulling observed in induction.

Source: Iron filings are magnetic and are forced to lie along the magnetic field lines emanating from the magnet.

A typical demonstration of these 'tubes' involves a magnet and iron filings. Provided there is some help to overcome friction, the iron filings move under their induced magnetism to align themselves along the magnetic field lines. Years later, in 1845, Faraday showed that light is also associated with electricity and magnetism: he demonstrated that intense magnetic fields affect polarised light, hinting that light is an electromagnetic wave.

This was the beginning of the unification of electromagnetism. It was far from refined, however, with eleven different theories existing for it, each flawed in some way. Then, throughout the 1860s, the Scottish physicist James Clerk Maxwell formulated Faraday's picture of field lines into a mathematical model. It accurately captured the relationship between electricity, magnetism and light that had previously mystified other physicists. Both electric and magnetic forces were now defined by a single set of equations, implying them to be different forms of the same thing: the electromagnetic field. Thus Maxwell had unified magnetism and electricity into one force. Even more significantly, he explained how electromagnetic fields could travel through space in the form of an electromagnetic wave, whose speed was governed by a value that came out of his equations. To Maxwell's surprise, the speed of the wave he calculated was equivalent to the speed of light, which at the time was known experimentally to an accuracy of about 1%. This implied that light itself was in fact an electromagnetic wave. The importance of Faraday's and Maxwell's work is often taken for granted in the everyday bustle. Maxwell's equations are responsible for the workings of everything from household appliances to computers, and they describe all the wavelengths of light on the EM spectrum((Stephen Hawking & Leonard Mlodinow, The Grand Design, (Bantam Press pub. 2010), pages 114-117)).
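The value Maxwell obtained can be reproduced in a few lines: the wave speed predicted by his equations is c = 1/√(μ₀ε₀), fixed entirely by the measured electric and magnetic constants of the vacuum. A quick sketch using the modern constant values:

```python
import math

MU_0 = 4 * math.pi * 1e-7      # permeability of free space, T·m/A
EPSILON_0 = 8.8541878128e-12   # permittivity of free space, F/m

# Maxwell's predicted wave speed, built only from electromagnetic constants
c = 1 / math.sqrt(MU_0 * EPSILON_0)
print(c)  # ≈ 2.998e8 m/s, matching the measured speed of light
```

That a number assembled purely from laboratory electricity-and-magnetism measurements lands on the speed of light is exactly the coincidence that convinced Maxwell light is an electromagnetic wave.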
The Electroweak force: Unification at higher energies
After its successful unification, electromagnetism became known as a single fundamental force. This left a total of four fundamental forces governing the entire universe: the electromagnetic, gravitational, strong nuclear and weak nuclear forces. The search for even greater unification, by identifying similarities between these four forces, continued. It was in the 1960s that a promising chance to unify the electromagnetic and weak nuclear forces emerged. The weak force is responsible for particle interactions where particles change from one form to another through processes such as decay, i.e. when high-energy particles decay into multiple lower-energy ones. The weak force acts over an extremely small range of about 10⁻¹⁸ metres, roughly 0.1% of the diameter of a proton. Its relative strength is also tiny: it is weaker than the strong force by a factor of 10⁻⁶ (compared with electromagnetism, whose strength is about 1/137 that of the strong force). At the energy levels of particle interactions we would experience on Earth, the differences between electromagnetism and the weak force are very pronounced. The weak force can change particles into others, such as when a neutron decays to a proton, an electron and an antineutrino, whereas the electromagnetic force cannot. Furthermore, the weak force has a very short range, as mentioned above, whereas the electromagnetic force acts over an infinite range, albeit dying off to negligible values at large distances.
Source: Simple diagram portraying the four fundamental forces
In the 1960s theoretical physicists proposed that the weak and electromagnetic forces were actually two parts of a more complete theory. The two main theorists were the American Steven Weinberg and the Pakistani Abdus Salam, who came up with the idea independently of each other. It was also suggested that the strength of the weak force would increase as the energy of the particles involved in an interaction increased. At energy levels approximately 100 times a proton's rest energy, the weak force becomes similar in strength to the electromagnetic one. This energy is around 10,000 times greater than the energy seen in beta decay interactions. The reason the electroweak unification depends on energy levels is that the weak force is short range. Both the electromagnetic force and gravity act over infinite range; they decrease in strength, but only by following an inverse square law. The strong and weak nuclear forces are short range, and consequently their strength drops off radically sharply (an exponential decrease) outside their respective ranges. Plans were put in place to test the predictions of the electroweak theories that had surfaced, but it wasn't until the 1980s that energies high enough to do so were achieved. Using the Super Proton Synchrotron at CERN, a high-energy beam of antiprotons was collided with a beam of protons travelling in the opposite direction. The mutual annihilation of the protons and antiprotons released such high amounts of energy that new particles were produced for the first time: the W and Z bosons((CERN, The Z Boson)). These particles were direct evidence supporting the electroweak theory that had predicted them. The W and Z bosons are the force carrier particles of the weak nuclear force and have masses of 86 and 97 times that of a proton, respectively.
For some of the lighter particle interactions (such as the decay of a muon to an electron), a carrier particle of such high mass as the W or Z simply shouldn't be feasible: that mass would have to be produced out of nowhere, since there wasn't enough energy to start with in the incident particle. Counterintuitively, this is possible according to quantum mechanics and Heisenberg's uncertainty principle: the extra mass can exist as long as it is only present for an extremely short time. This means the bosons only have a short time to travel before decaying, which explains why the weak force has such a short range.

It turns out that the weak force is intrinsically about as strong as the electromagnetic force, but it appears 'weak' because it has such a short range. This obstacle of limited distance is overcome at high enough energy levels, and under those conditions the electromagnetic and weak forces are unified as two different aspects of the same thing((John C. Taylor, Hidden Unity in Nature's Laws, (Cambridge Press pub. 2001), pages 339-346)).

A Grand Unification and The Theory of Everything: is it possible?
As with the electroweak unification, it is thought that even higher energies may enable the strong force to be added into the theory. As yet this is not confirmed, but more is being understood as technology improves, allowing particle accelerators to reach higher and higher energy levels. In fact, it was recently discovered that, for a very short amount of time, quarks can exist in a pentaquark structure. Quarks are the fundamental particles that make up baryons (which have a three-quark or three-antiquark structure, e.g. protons and neutrons) and mesons (made from a quark and an antiquark, e.g. kaons and pions). The strong force is the interaction responsible for this short-range but high-strength binding of quarks. It was previously believed that quarks could not be isolated as single lone quarks or in other arrangements; they were confined to their trios in baryons, or the pairs that make up mesons.

The reason for this is that tremendous amounts of energy need to be put into a proton to separate its quarks. However, that much energy is sufficient for pair production of a quark and an antiquark to occur, in accordance with the equation E = mc². This simply means that energy can be converted into mass, which is why very high energy collisions can produce many particles. Consequently, quarks cannot be isolated, because the high energy being used simply becomes new particles and thus does not contribute towards splitting the quark trios. Over the next few decades many more discoveries are likely to be made. Perhaps accelerators will soon operate at the energy levels required to provide evidence unifying the strong force with the electromagnetic and weak forces. The incorporation of the strong force into electroweak theory is the Grand Unification. This theory would be able to describe nearly everything that happens on the very small quantum scale. That would then leave the theories of everything with the much more challenging force to integrate: gravity.
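The energy-to-mass bookkeeping of E = mc² is easy to make concrete. A small sketch using the simplest pair-production example, an electron and a positron, since free quark masses are not directly observable; the constant values are the standard ones:

```python
C = 299_792_458.0           # speed of light, m/s
MEV_TO_J = 1.602176634e-13  # joules per MeV
M_ELECTRON_MEV = 0.511      # electron rest energy in MeV

# Minimum energy needed to pair-produce an electron and a positron,
# then converted back into the mass it creates via m = E / c^2.
threshold_J = 2 * M_ELECTRON_MEV * MEV_TO_J
mass_created_kg = threshold_J / C**2
print(mass_created_kg)  # ≈ 1.8e-30 kg, i.e. two electron masses
```

The same arithmetic, scaled up to collider energies of hundreds of GeV, is why slamming energy into a proton yields fresh quark-antiquark pairs instead of free quarks.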
Source: This graph crudely conveys how, at high energies, the theories predict the forces to merge and become different aspects of the same force, as was previously achieved with the electroweak theory.

Throughout the 1900s, physicists experimented and theorised to create many of the great theories we have now. These can all be categorised into two main areas. The first is Einstein's theories of general and special relativity. They describe the massive objects in our universe, such as stars and galaxies, and the force of gravity, which shapes nearly all of the cosmos.

Then, at the subatomic scale, is quantum mechanics, which applies to atoms, their constituents and all the forces responsible for the interactions between them. These two theories are incompatible: you can't use quantum physics to explain certain processes in the cosmos, such as some of the mysteries that surround black holes. Likewise, Einstein's relativity and gravity do not conform to the rules of quantum mechanics that govern particles. In the quantum realm, gravity acting on particles is so weak as to be negligible, and it isn't even considered. A theory that attempts to reconcile Einstein's theory of gravity with the grand unification of the other three fundamental forces is what is known as M-theory((Michael Duff, Theory of everything: The big questions in physics, New Scientist, 01/06/2011)). M-theory is rather ambitious and is not a single theory; it is an umbrella term for many different theories that use particular principles and ideas. The three main ideas utilised by M-theory are extra dimensions, supersymmetry (SUSY) and superstrings or membranes. Extra dimensions were introduced in the 1920s when the German physicist Theodor Kaluza merged Einstein's gravity with Maxwell's electromagnetism.

Kaluza revised Einstein's gravity into five dimensions to give the gravitational field extra components, which could be interpreted as the electromagnetic field. Surprisingly, these were shown to exactly match Maxwell's equations, provided you can accept a fifth dimension. This is somewhat confusing, as it is evident there are only four dimensions: three of space and one of time. In 1926 the Swede Oskar Klein came up with a solution by supposing that the fifth dimension is so small that it cannot be seen. A good analogy is to consider an ant on a straight piece of string. The ant is able to walk up and down the string, and is also able to go around the circumference of the string onto the underside and back to the top again. A human observer standing above, looking down, would only see the string as a straight one-dimensional line, unaware of the second dimension the ant can traverse. Klein proposed that the fifth dimension is much like the hidden one in this analogy. From this the Kaluza-Klein theory was formed, but it wasn't considered significant at the time and was only rejuvenated many years later by the arrival of supersymmetry (SUSY). SUSY is a take on the current standard model that predicts a whole new set of particles. The particles that make up matter are the fermions: the quarks, which bind together into hadrons, and the leptons, fundamental particles such as electrons and muons. The particles that transmit force are known as carrier particles; these are the bosons, such as the W and Z bosons and virtual photons. These particle types are very different from each other, so it was astonishing to theorists in the 1970s when it was shown that equations could be formed that were unchanged when the particle types were swapped, i.e. in SUSY the matter particles become the force carriers and the bosons become the matter particles. This implies there is a new type of symmetry in nature, hence the name supersymmetry.
Supersymmetry predicts that every particle has a supersymmetric partner particle. These partners are thought to be present only at incredibly high energy levels and, as a result, particle accelerators still haven't found any direct evidence of them existing((Michael Duff, Theory of everything: The road to unification, New Scientist, 01/06/2011)). Despite this, SUSY is still very widely implemented in M-theory. This is because its mathematics gives a relation between quantum particles and space-time, thus enabling gravity to be associated with it too. This amalgamation of SUSY and gravity is known as supergravity. This too has its own problems, namely to do with the eleven dimensions of space-time that it predicted. So a contrasting approach, called superstring theory, was turned to for more definite answers. In this theory, the fundamental particles of matter are not considered point masses.
Instead, they are thought of as one-dimensional strings that exist in not eleven but ten space-time dimensions. Just like an instrument's string, they can oscillate at various modes and frequencies corresponding to the particle they represent. To begin with, superstring theory was extremely promising because it avoided the problems of eleven-dimensional supergravity: the six extra dimensions could be curled up into hidden dimensions in the same way Klein suggested in 1926. But like the other components of M-theory, theorists had their doubts, as superstring theory presented five different mathematical approaches that all worked equally well and thus all competed to be the accepted version.
The current M-theory is a mixture of all the effort put into the supergravity and superstring theories. It was formed in 1995 by Edward Witten, a string theorist from the Institute for Advanced Study in Princeton, USA, who says that the M stands for membrane. Witten showed that the theories involved weren't competing but were in fact different features of M-theory. Thus the five different string theories and eleven-dimensional supergravity were brought under one distinct theory. It is important to remember that the mathematics of any single theory under the M-theory umbrella cannot explain all of the universal laws; it only succeeds in a particular area. If two theories can be used to explain the same phenomena at a certain point, then the theories overlap, and where their ranges overlap they can be said to agree and hence be parts of the same theory. This overlapping is essentially what M-theory is: the network of the many theories of everything, connected where certain parts of the individual theories correspond((Michael Duff, Theory of everything: Have we now got one?, New Scientist, 01/06/2011)).

The Grand Unification of the strong force with the electroweak force is the next step towards the more complex theories of everything, M-theory in particular. These ultimately aim to identify the set of fundamental laws that governs the whole universe. Such laws must be able to explain every possible process, from those on the microscopic quantum scale to the largest of black holes. This has not yet been achieved, and there is speculation over whether an all-unifying theory could actually exist. The evidence gives the impression that this goal is indeed possible. Many separate theories have been unified in the past: Maxwell's electromagnetism was the first significant advancement, and only a century later the electroweak theory was proposed and then proven with evidence from particle accelerators. The track record illustrates that as technology improves and more discoveries are made, science will come closer and closer to a unified theory. This may come from refining the current theories or from new, revolutionary ideas that negate the flaws in prior ones. On the contrary, it may be impossible. No matter how mathematically elegant and faithful to reality a theory may be, it remains a theory: a model that we perceive with human senses and process with human logic. Everything around us is interpreted by our brains, which then build our model of reality. The point being suggested is that any model describing the entire universe will only ever be a near-perfect approximation at best((Stephen Hawking & Leonard Mlodinow, The Grand Design, (Bantam Press pub. 2010), pages 57-63)). Similarly, there is a limit to what can be measured by technology, no matter how advanced it becomes. Technology requires humans to manufacture it, and then human operators who provide the input data and review the output results in one form or another.
Source: All the theories and principles devised over the past centuries link to form the Grand Unification and can go even further to a Theory of Everything

M-theory's purpose is eventually to provide that model or theory defining the universal laws, by considering all the relevant theories and observational evidence from the past, present and future; this makes M-theory the main candidate for producing the theory of everything. The progress and relative success of M-theory has resulted in a dilemma. Many years ago, Newton showed that mathematical equations could give surprisingly precise predictions for the interactions between objects. From this, many scientists supposed that, with enough mathematical processing, the entire past, present and future of the universe could be determined with the one ultimate theory. Then the hurdles of quantum chromodynamics, quantum uncertainty, curved space-time, tiny vibrating strings and extra dimensions announced themselves.
The effort of generations of physicists has the outcome of 10⁵⁰⁰ different possible universes, each with its own set of laws, where only one of them represents our actual universe((Stephen Hawking & Leonard Mlodinow, The Grand Design, (Bantam Press pub. 2010), pages 149-152. Other references: Frank Close, Particle Physics – a very short introduction, (Oxford Uni. Press pub. May 2004); James Clerk Maxwell, A dynamical theory of the electromagnetic field, (original paper pub. 1st January 1865); Matthew R. Francis, A GUT feeling about physics, (pub. 04/28/16))). Perhaps a consequence of this is that the idea of finding a single theory to explain everything has to be abandoned. If not, then the problem still remains: M-theory allows for many different possibilities, so how can it be narrowed down to the model that best suits our universe?

