The End of Physics?

Thirty years ago the political scientist Francis Fukuyama published The End of History and the Last Man, in which he argued that, with the ascendancy of Western liberal democracy after the Cold War and the dissolution of the Soviet Union, humanity had reached “… the end of history as such: the end-point of mankind’s ideological evolution and the universalization of Western liberal democracy as the final form of human government.” Last year Fukuyama published a new book, Identity: The Demand for Dignity and the Politics of Resentment. Here is an excellent review.

Fukuyama was influenced by the German political philosophers Hegel and Marx and by the principle of dialectic – thesis, antithesis, synthesis. Both took a teleological view: that there is a purpose underlying the unfolding of events and that the course followed by world history is a necessary, deterministic one. The dialectic principle certainly provides great insight into the evolution of ideas, but its proponents tend to use it post hoc to justify their prior beliefs: for Hegel, the superiority of the Prussian State; for Marx, the inevitable Dictatorship of the Proletariat; for Fukuyama, the perfection of Western Liberal Democracy.

If we substitute speculation, observation, theory for thesis, antithesis, synthesis, the dialectic principle applies well to the scientific method, but without the deterministic and post hoc aspects. Theory then gives rise to further speculation, which inspires further experiment and observation, and so the process continues.

Unlike other great philosophical schemata, science has existed for only the last few centuries. While speculations about the nature of reality have always abounded, the essential step of testing-by-observation has tended to be ignored or overlooked by most thinkers throughout history. Because of this, such speculations have tended to gel into rigid dogmas in which testing-by-observation becomes anathema. Science breaks down.

Unfortunately this is happening again. According to the former particle physicist Sabine Hossenfelder:

No one in physics dares say so, but the race to invent new particles is pointless. In private, many physicists admit they do not believe the particles they are paid to search for exist. They do it because their colleagues are doing it.

Since the 1980s, physicists have invented an entire particle zoo, whose inhabitants carry names like preons, sfermions, dyons, magnetic monopoles, simps, wimps, wimpzillas, axions, flaxions, erebons, accelerons, cornucopions, giant magnons, maximons, macros, wisps, fips, branons, skyrmions, chameleons, cuscutons, planckons and sterile neutrinos, to mention just a few.

There are many factors that have contributed to this sad decline of particle physics. Partly the problem is social: most people who work in the field genuinely believe that inventing particles is good procedure because it’s what they have learned, and what all their colleagues are doing. But I believe the biggest contributor to this trend is a misunderstanding of Karl Popper’s philosophy of science, which, to make a long story short, demands that a good scientific idea has to be falsifiable. Particle physicists seem to have misconstrued this to mean that any falsifiable idea is also good science. In the past, predictions for new particles were correct only when adding them solved a problem with the existing theories. For example, the currently accepted theory of elementary particles – the Standard Model – doesn’t require new particles; it works just fine the way it is. The Higgs boson, on the other hand, was required to solve a problem. The antiparticles that Paul Dirac predicted were likewise necessary to solve a problem, and so were the neutrinos that were predicted by Wolfgang Pauli. The modern new particles don’t solve any problems.

These “new particles” are not speculations about the nature of reality. They are the outcome of the routine application of a schema known as Particle Physics, for the purpose of kudos and funding.

The speculation by the Swedish chemist Svante Arrhenius in 1896 that increasing atmospheric concentrations of CO2 could lead to global warming was revived in the mid-20th century and now sits under the aegis of the United Nations’ Intergovernmental Panel on Climate Change (IPCC). Unfortunately the testing-by-observation aspect of a truly scientific enterprise was intentionally omitted by the IPCC. Its Third Assessment Report specifically dismissed the need for rigorous testing when it stated: “our evaluation process is not as clear cut as a simple search for ‘falsification’” (Section 8.2.2 on page 474). Effectively what they are saying is: proper scientific testing is too hard and we are not going to bother doing it. All of the funding and effort goes into ever more complex numerical models. Here too, testing-by-observation has become anathema. A recent paper showing that aspects of the modelled carbon cycle did not fit the observations was rejected on the grounds that “exceptional claims require exceptional evidence”. A more appropriate comment might have been: “The Emperor has New Clothes”.
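Arrhenius’s idea is nowadays usually summarised by the logarithmic forcing relation, written out here only to fix the terminology used below; the 5.35 coefficient is the commonly quoted modern fit (Myhre et al., 1998), not Arrhenius’s own number:

\[
\Delta F \;\approx\; 5.35\,\ln\!\left(\frac{C}{C_0}\right)\ \mathrm{W\,m^{-2}},
\qquad
\Delta T \;\approx\; \lambda\,\Delta F,
\]

where C is the CO2 concentration, C0 a reference concentration, and λ the climate sensitivity parameter whose local behaviour is discussed next.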

Climate Modelling has been creaming climate funding for the last half-century. It is an ongoing boondoggle owned by Applied Mathematics. Other disciplines don’t get a look in, particularly Statistics, which is seen as a threat. This is unfortunate because a synthesis of the two would provide useful insights. For example, regression modelling of local climate sensitivity due to increased CO2 shows that it varies widely, being weakest in the North Atlantic and strongest in northern Siberia and northern Canada. Increases in extreme weather events in some regions (such as NE NSW and SE Queensland) are indicated by significant increases in insurance claims for storm and bush-fire damage, whereas there has been no significant change in tropical cyclone frequencies. It seems likely that climate variance and extreme events are related to the spatial gradient of local climate sensitivity rather than to the sensitivity itself, but such an effect is statistical and may not show up in a deterministic climate model.
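To make the kind of regression concrete, here is a minimal sketch of how a per-grid-cell sensitivity could be estimated by ordinary least squares against log CO2. The grid, the CO2 series and the sensitivity field below are synthetic placeholders invented for illustration, not the analysis referred to above; a real study would use gridded observational temperature products and measured annual-mean CO2.

```python
# Minimal sketch: per-grid-cell "local climate sensitivity" estimated by
# ordinary least squares of annual temperature anomaly on log2(CO2).
# Everything below is a synthetic placeholder, not real climate data.
import numpy as np

rng = np.random.default_rng(0)

years = np.arange(1960, 2021)
co2 = 317.0 * 1.005 ** (years - 1960)            # invented CO2 series (ppm)
x = np.log2(co2 / co2[0])                        # CO2 doublings relative to 1960

n_lat, n_lon = 36, 72                            # coarse 5-degree grid (placeholder)
true_sens = rng.normal(2.0, 1.0, (n_lat, n_lon))                   # hypothetical K per doubling
temp_anom = (true_sens * x[:, None, None]
             + rng.normal(0.0, 0.3, (len(years), n_lat, n_lon)))   # synthetic anomalies (K)

# OLS slope for each cell: sensitivity = cov(x, T) / var(x)
x_c = x - x.mean()
t_c = temp_anom - temp_anom.mean(axis=0)
sensitivity = np.tensordot(x_c, t_c, axes=(0, 0)) / (x_c ** 2).sum()

print("median sensitivity (K per CO2 doubling):", round(float(np.median(sensitivity)), 2))
print("5th-95th percentile spread:", np.round(np.percentile(sensitivity, [5, 95]), 2))
```

The point is only that such a slope can be estimated cell by cell, after which its spatial pattern and its gradient can be examined statistically, which is exactly the kind of question a deterministic model is not built to answer.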

The insurance industry will need to step up with some research funding for statistical modelling of climate if it wants reliable answers to such questions.

There are other areas of physics which appear to be stagnating. One such area is Astrophysics and the concept of “Dark Matter”. Dark Matter plays a similar role to that of the luminiferous aether in the 19th Century; it is a blatant “fix-up”.

The existence of the aether was postulated because light forms interference fringes and therefore must be made up of waves. Waves require a medium to carry them, just as solids, liquids and gases carry sound waves. The aether was hypothesised solely as the medium that carries light. The Michelson-Morley experiment showed that this medium does not exist, and Maxwell’s field equations provided an excellent description of the behaviour of light as an electromagnetic wave.
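For reference, the step that made a carrier medium unnecessary can be written down in two lines. In vacuum, with no charges or currents, Maxwell’s equations give

\[
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t},
\qquad
\nabla \times \mathbf{B} = \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t},
\]

and taking the curl of the first and substituting the second (using \(\nabla \cdot \mathbf{E} = 0\)) yields a wave equation,

\[
\nabla^2 \mathbf{E} = \mu_0 \varepsilon_0 \frac{\partial^2 \mathbf{E}}{\partial t^2},
\qquad
c = \frac{1}{\sqrt{\mu_0 \varepsilon_0}},
\]

whose solutions propagate at c with nothing material doing the waving.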

In a similar way, Dark Matter has been postulated solely to account for the observed rotation of galaxies. The problem is that no one knows what it is, or how it comes to be distributed in just the right way to give rise to the observed galactic rotation. An explanation is more likely to be found in a reinterpretation of Einstein’s Field Equations. Whatever the explanation, it seems unlikely that the answer lies in the plethora of proposed new particles discussed above.
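The argument behind the postulate is a one-liner. For matter in a circular orbit of radius r about a galaxy whose mass interior to r is M(r), Newtonian gravity gives

\[
\frac{v^2(r)}{r} = \frac{G\,M(r)}{r^2}
\quad\Longrightarrow\quad
v(r) = \sqrt{\frac{G\,M(r)}{r}},
\]

yet observed rotation curves stay roughly flat at large r, which requires M(r) to keep growing roughly in proportion to r long after the visible stars and gas have petered out. Dark Matter is simply the name given to the shortfall.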

Once again from Sabine Hossenfelder:

How do black holes destroy information and why is that a problem?

She concludes:

As you have probably noticed, I didn’t say anything about information. That’s because really the reference to information in “black hole information loss” is entirely unnecessary and just causes confusion. The problem of black hole “information loss” really has nothing to do with just exactly what you mean by information. It’s just a term that loosely speaking says you can’t tell from the final state what was the exact initial state. There have been many, many attempts to solve this problem. Literally thousands of papers have been written about this.

Ay, there’s the rub – thousands of papers.

How many experiments?

Zero!