
Energy, Mass, and Light

by Mathias Hüfner

I always felt that science as the domain of people from Oxbridge or Ivy League universities — and not for mere mortals — was a very bad idea.

— Benoit Mandelbrot

A Brief History of Physics

Society's demand on physics is to study nature's supply of energy so that technical processes can be derived from it. At least, that was the demand placed on physics at the time of my studies 50 years ago. Modern physics nowadays seems very far removed from this.

To put it bluntly: energy resides in moving masses, and forces set masses in motion, as Isaac Newton derived from the movements of the heavenly bodies. The theory of the movement of masses is summarized under the concept of dynamics. Now an artificial sun called SAFIRE1 has risen in the laboratory sky and has finally shattered today's world view, split into the macrocosm and microcosm of physics. The old thought pattern of the symmetry of an expanding, closed world that can be described by a single equation has had its day. A new way of thinking, a paradigm shift in Thomas Kuhn's sense, is necessary. The world is to be understood as a dynamic process in an open system.2 In it, dynamic structures over a large range of scales are not symmetrical but self-similar.

If one studies historical sources on dynamics in physics, one can recognize different phases after the beginning of the Enlightenment. The 18th century was the century of mechanics. This epoch benefited from the infinitesimal calculus of continuous functions introduced by Newton. The 19th century was the century of research into electricity. As early as 1836, the Italian Fabrizio Mossotti3 developed the idea of an electromagnetic world view, which inspired Michael Faraday. This was strengthened by the work of Joseph Larmor (1897) and Wilhelm Wien (1900). Even then it was recognized that masses carry electrical charges and that all forces are of electromagnetic origin. However, the theory of electricity developed by James Clerk Maxwell divided physicists into adherents of an electromagnetic ether theory and atomists, who favored long-distance gravitational forces, since at that time one could not yet imagine the difference between bound and free shieldable charge. The 1st Solvay Conference of 1911 in Brussels, initiated by Walther Nernst, on the "Theory of Radiation and Quanta" was intended to find a compromise between the ether advocates, represented by Hendrik Antoon Lorentz, who wanted to further develop Maxwell's theory, and the atomists, represented by Max Planck and Albert Einstein.

. . . the fundamental and fruitful ideas of Planck and Einstein should serve as the basis of our discussions, we can modify or improve them, but we cannot ignore them . . .

— Walther Nernst

It was not recognized at the time that Einstein’s idea was less fruitful than assumed, since it went against Maxwell’s dynamics, and Planck’s constant referred only to the action of an electron and was not a natural constant.

Compromises are good in the field of politics, because there it is a question of balancing interests. Science, however, is about knowledge, and its logic is based on a two-valued algebra with the values true and false, from which higher mathematics is also derived. The proportional relationships of natural phenomena are particularly interesting for physics, because one would like to derive natural constants from them on which to base the system of units of measurement. This gives the impression that the units of measurement are not social agreements but have a universally valid, objective character. This applies in particular to the speed of light, Planck's constant, the elementary charge, the Boltzmann constant and the gravitational constant. However, this is highly problematic, since these constants are obtained in a measurement process that is only valid within a certain measurement range; beyond this range there are no longer any guarantees. In the field of optics, the constancy of the speed of light no longer applies, because a change in direction of an optical wavefront is only possible as a result of differences in transit time between inside and outside. The Doppler effect would also not exist without transit-time differences. The proton has a different effect due to its 1836-times greater mass. In general, modern physics has a problem with understanding mass: it is confused with the concept of matter or force.

The concept of matter is not a physical concept but a philosophical category. Matter includes everything that exists outside of our consciousness. We reflect matter in our consciousness through sense perceptions. We can perceive forces with our senses, so by definition they are material. Anyone who cannot distinguish between these terms will never be able to understand physics.

Mass, on the other hand, means nothing more than an uncountable quantity. When a farmer sells grain, nobody thinks of buying it by the number of grain kernels. As early as ancient Egypt, countable equivalents were introduced for the subdivision of the mass of grain kernels for comparison using the beam balance. To speak of an increase in mass as a result of acceleration is therefore absurd given the definition of mass. The apparent increase in mass that results from the manipulation of Einstein’s energy formula is nothing other than the mechanical resistance in the direction of movement, which the ether or, better, the electromagnetic force field, opposes to the moving body.

On the threshold of the 20th century, another factor influencing physics was added. As a result of industrialization, workers organized themselves into non-church unions. Thus the Catholic Church saw its power dwindle. Pope Pius X lamented in his 1907 encyclical that science was no longer serving as the handmaid of theology.

Lemaître's work aimed at reconciling science and faith, as he put it. He did this by uniting leading scientists of the 20th century in the Pontifical Academy of Sciences, thus bringing them back under the control of the Church. So the world model he designed from a giant atom was replaced by the Big Bang and further developed into the standard models of the macrocosm and microcosm, which modern physics split into the theory of relativity and quantum mechanics. Albert Einstein defined energy as the product of mass and the square of the speed of light, which he believed to be constant. Since, in his theory, a ray of light ran like a "rigid axle of a railroad car", the imaginary world had to bend like railroad tracks. Max Planck, on the other hand, defined energy as the product of the quantum of action and frequency, whereby he also declared the quantum of action to be a natural constant. However, a household hammer drill has a much greater action than a single electron, and what effect a tsunami wave can have is certainly still remembered by many people. In doing so, these two "titans" of physics had cemented the split in physics.

For if energy in the macrocosm were equal to energy in the microcosm, then it would follow that mass would be proportional to frequency, which contradicts experience. Strangely enough, the majority of the physicist community accepted this contradiction without complaint and sought to unify both theories in a theory of quantum gravity up until the 21st century.4

It would be easier if we dropped the scholastic idea of the God-given constancy of nature and took into account the degree of its dynamics on the various scales. The mathematician Benoît Mandelbrot made a major contribution to this. He brought back to our consciousness the ancient knowledge that nature is fractally divided into four phases. Until now, the continuity of the infinitesimal calculus has had priority over the physical effects at the phase boundaries, where the mathematical description fails because the curves are not rectifiable.

But at least in the 20th century, the relativistic view of nature settled the age-old dispute as to whether the sun revolves around the earth or the earth around the sun. Statements by Einstein testify to this, saying that it does not matter whether the train or the platform is moving. So Galileo was rehabilitated by Pope John Paul II in 1992, after 359 years. The thought experiment of Schrödinger's cat, which characterizes the properties of quantum experiments, follows the same line of thought. It describes a cat that vegetates as a zombie in a box, and only the observer decides, by opening the box, whether it is alive or dead.

It is understandable that such and similar statements are disturbing for logically thinking people on the one hand, and on the other hand have magically attracted esotericists. This was intended, because the old mystery of the immaculate conception of the virgin had finally had its day. Well, that is still the state of academic science these days.

We can state that Lemaître's dream of reconciling science and faith has been fulfilled, at least in theoretical physics. Pope Pius X's commandment, written in his 1907 encyclical5, was fulfilled, and those who wish to pursue a career adhere to the dogmas of academic teaching. The Pontifical Academy of Sciences monitors the purity of this teaching with an interest-driven peer-review system, which now enjoys worldwide influence but has increasingly come under criticism.

In the 1980s, the 30-year black hole war broke out between Leonard Susskind, the advocate of quantum theory, and Stephen Hawking, the advocate of the theory of relativity. Hawking ended it with the acknowledgment that black holes, according to the theory, do not exist, but that the term could be given a new meaning.

Meanwhile, the army of doubters is growing unnoticed. The list of known dissident scientists has grown to over 10,000, and their ideas are becoming more and more diverse, since they mostly act as lone fighters6, which of course severely limits their options. Undeterred by this, leading academic theorists continue to carry out expensive experiments with dubious outcomes and, in accordance with church guidelines, are even rewarded with Nobel Prizes, although these are diametrically opposed to the idea of Nobel, who decreed that the prize should go to whoever has brought the greatest benefit to society.

The decisive experiment based on the tokamak principle, in which a high-temperature plasma is supposed to ignite self-sustaining nuclear fusion, has been falsifying the theories of particle physics with constant regularity for the past 70 years, and at temperatures far above the temperature of the sun. We want to ignite the solar fire on earth, but our academic science does not understand the cosmos. So let us forget about relativity and quantum mechanics for a while and start our thinking at the end of the 19th century with the ether-drift experiments. The decisive question is whether light is distributed isotropically or anisotropically as a wave; in other words, whether the Doppler effect is applicable to light.

Einstein's theory of relativity claimed the opposite. His theory calls for a constant overall speed, but in the Doppler effect the resulting overall speed decreases with wavelength. The Michelson-Morley interferometer experiment was declared negative, although Dayton Miller later confirmed the inequality at Mount Wilson and fairly accurately determined the velocity v of the Earth travelling with the Sun around the center of the Milky Way. Only nobody could do anything with the measured value at the time, since it did not correspond to the expected orbital speed of the earth around the sun.

Einstein should have withdrawn his theory. Wanting to avoid this, he claimed that temperature effects were the cause of Miller's finding, although, as Miller told the Cleveland Plain Dealer newspaper on January 27, 1926, Einstein had never actually looked at Miller's work. Thus Einstein's authority triumphed over reality. But the assertion of the expansion of the universe according to Lemaître, while maintaining the theory of relativity, is based precisely on the above inequality. A scientist should have noticed such a serious contradiction in Einstein's theory.

Was it about science at all, or was it about the reconciliation of science with faith, as Lemaître formulated Pope Pius X's demand, namely that science should serve theology? Finally, in 1951, Pope Pius XII7 enthusiastically proclaimed Lemaître's theory in a speech to the Pontifical Academy as scientific progress, especially the cosmology of the Big Bang, which for him was, so to speak, proof of creation, without naming the originator of the hypothesis. Even more: would not the birth of the world from a primeval beginning be a proof of God? With that, to the horror of Lemaître, who saw science and faith standing side by side, the germ of what we now call modern physics, namely the merging of faith and science, was born into the world.

Fiat lux, — let there be light!

At least in one case m_ec² = h·ν applies, when m_e is the mass of an electron, whereas in the case of a stable elementary particle the term mass loses its meaning. But then c and h cannot both be natural constants at the same time. That would mean that mass is proportional to frequency, which contradicts experience, because the deep tones are produced by the large organ pipes and the high tones come from the small pipes. The same applies to antennas: the smaller the antenna, the higher the frequency. From my work in precision mechanical optical device construction, I know that c is to be regarded as a material constant. So if, according to God's representative, there should be light, this is not possible without the electromagnetic properties of the propagation medium, because Wilhelm Weber and Friedrich Kohlrausch discovered the relationship c = 1/√(ε₀·μ₀)

in 1855. The propagation speed of light is not a vector. On the other hand, particles are already moving in a certain direction. The electron has a translational speed and a rotational speed. Adding the two together, the result should not exceed the speed of the light pulse radiated in all directions.

Then, for the projection into the translation plane, one can write ½m_ev²_trans + ½m_ev²_rot ≤ m_ec², and by transforming this expression we obtain for the momentum of a light quantum triggered by an electron:

This means that a light quantum is spatially distributed within a medium depending on its electromagnetic density and spectrally shifted in the direction of ever longer waves, and that Maxwell's equations hold at all scales. While Planck's constant h refers to the electron as the smallest quantum of action, each massive body has its own quantum of action. If electrons oscillate in the nanometer range, then ions oscillate in the micrometer range. Radiant energy occurs from gamma rays down to thermal radiation and is the driving force of the dynamics. We only perceive a small part of this spectrum as visible light. When we perceive radiation, it is always bound to moving charges, which in turn are bound to masses. So light cannot come out of nothing, as advocates of the big bang hypothesis claim, and space cannot expand, because the number line of real numbers is dense: there are no new numbers in between. This fact should be obvious to any logically thinking person.
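The Weber–Kohlrausch relation mentioned above can be checked numerically. The sketch below is only an illustration, assuming the CODATA values for the vacuum permittivity and permeability:

```python
import math

EPSILON_0 = 8.8541878128e-12   # vacuum permittivity, F/m (CODATA value)
MU_0 = 1.25663706212e-6        # vacuum permeability, H/m (CODATA value)

# Weber-Kohlrausch relation: the propagation speed of an
# electromagnetic wave follows from the constants of the medium.
c = 1.0 / math.sqrt(EPSILON_0 * MU_0)

print(f"c = {c:,.0f} m/s")  # close to the defined value 299,792,458 m/s
```

The calculation recovers the measured speed of light from purely electromagnetic constants, which is the point Weber and Kohlrausch's 1855 measurement first made.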

When I look at pictures of volcanic eruptions, it amazes me why dynamics is separated into mechanics (and this into point and fluid mechanics), electrodynamics and thermodynamics, and treated in different academic principalities. Why aren't astrophysicists trained in the basics of electrodynamics and thermodynamics? Why do particle physicists perhaps understand something of their mathematical tools but cannot convert their results into physically understandable knowledge? I recall that they claimed to have found the "god particle", the Higgs boson, which "would transfer mass" to other particles. Masses are not transferable, only momenta.

Although physicists constantly operate with the terms matter and mass, they cannot distinguish between them. So they confuse masses with forces. They divide forces into four types, although forces can only be distinguished by magnitude and direction. Relativists among physicists cannot distinguish the concept of space from the concept of surface, which describe two different qualities.

As already mentioned, mass means nothing more than an innumerable quantity of elementary particles, whereby what counts as elementary depends on the scale of the observer. Needless to say, the scale for grain kernels is different from that for protons and electrons. Looking at Wikipedia, few people seem to know what mass really is these days; there is too much text to understand the term itself. The term is even used in particle physics, where quanta can be counted, but mass is confused with force, although moving masses act as forces. The source of the forces that move masses are the distributions of electric charges on protons and electrons, which after many years of research are still the only stable particles ever found. However, they do not appear to be spherical in shape. We must think of them more as vortices8 if we want to remain compliant with electrodynamics.

As Maxwell and Helmholtz perhaps suspected, all properties of matter arise from the structural variation of electric and magnetic vortices, and atomic decay, as well as the fusion of atoms are electromagnetic phenomena. All structures and distributions of mass and kinetic energy are derived from this and are expressed in their fractal nature. Nature lives on dissipated energy. But physics does not yet have a unit of measure for fractality.

Radiant energy is dissipative energy

There is a rumor that energy can be converted into mass and mass into energy. One even claims to have observed a mass defect. In 1960 the DEFA film The Silent Star addressed this rumor. Of course, that is nonsense. Energy needs a carrier, and that carrier is the mass of particles. Radiation has no mass. It is momentum that the electromagnetic field transmits.

The relation m_ec² must therefore be interpreted as follows. When an electron exerts an effect on the electromagnetic field with frequency ν, it spreads unnoticed throughout the whole volume at the speed of light up to the phase boundary. The effect only becomes noticeable at the phase boundary. This is the tsunami effect, not the dual character of light. Conversely, of course, the energy of the electromagnetic field can also push an electron into a higher energetic state, provided the environment has a higher potential than the electron. The emission of electromagnetic waves, like light, involves momenta that are distributed through a force-coupled medium. We call this physical medium ether or vacuum. It is just not an empty mathematical space, and therefore you cannot equate matter with mass, because a volume in a state of motion has a mass density that we determine via electromagnetic radiation but identify as temperature. This is distributed energy, i.e. dissipative energy. However, the distribution of energy is not homogeneous. We must therefore distinguish it from homogeneous usable energy.

In physics, dissipative energy is evaluated with the term entropy. It is the part of the energy that is distributed to the environment during the conversion process into usable energy. Nicolas Léonard Sadi Carnot found out in 1824 that this part makes up about 2/3 of the total energy, and we mostly forget that we are heating up our environment with this part. But in 1877 Ludwig Boltzmann was the first to come up with a brilliant idea for explaining entropy. Boltzmann thought of energy as divided into small "energy portions" and the possible microscopic states as "boxes" in which these portions are stored. There are four ways of distributing three "portions of energy" between two boxes. If we double the volume of the container, i.e. provide four boxes, there are already 20 ways of distributing the "portions of energy". If we assume that the energy is distributed evenly over the boxes, these portions of energy will be distributed with the logarithm of their probability W as the number of boxes increases. So he got:

S = k · ln W,

where k is called the Boltzmann constant. This gives us a measure of the energy distribution. In most cases, however, only the change in entropy is of interest. In a closed system, this change is always positive. That is what the 2nd Law of Thermodynamics says. For example, ice melts in a cup of warm water: the crystal order dissolves. However, classical thermodynamics is incomplete. There must also be the opposite, otherwise there would be no ice. When I dissipate the heat, the structural order is restored. In this case, the change in entropy is negative. For this, however, the system must be open and the environment must be such that it can absorb entropy.
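Boltzmann's counting of "portions in boxes" can be verified with a short sketch; the number of distributions is the standard stars-and-bars formula C(n+k-1, n), which reproduces the 4 and 20 quoted above:

```python
from math import comb

def distributions(portions: int, boxes: int) -> int:
    # Ways to place indistinguishable energy portions into
    # distinguishable boxes (stars and bars): C(portions+boxes-1, portions)
    return comb(portions + boxes - 1, portions)

print(distributions(3, 2))  # 4: three portions, two boxes
print(distributions(3, 4))  # 20: same portions, volume doubled to four boxes
```

As the number of boxes grows, the count of distributions grows combinatorially, which is why Boltzmann's measure involves the logarithm of the number of microstates.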

This is actually an ancient Indian wisdom that Ilya Prigogine9 clothed in a modern formula in the 1960s. It is the wisdom of the creation, preservation and decay of structures called Trimurti and symbolized by the three main gods of Hinduism.

Creation is synonymous with bringing order out of chaos. This means that the Creator must declutter. The organization of the Creator remains irrelevant in this consideration. Self-organization is often spoken of here, but this is misleading, since an organization pursues a goal that cannot be recognized on the physical level.

In physical language this means: the entropy change of the system must become negative, dS_system < 0, which is only possible in an open system. In a closed system this is impossible because of the second law of thermodynamics. (It does not mean that something comes out of nothing. If you multiply something by zero, the result is always zero. Even gods have to bow to that law.)

In order for regularity to develop internally, the external entropy change must be negative, and more entropy must be dissipated than is produced and added internally. This can be expressed by the three relations dS_int ≥ 0, dS_ext < 0 and |dS_ext| > dS_int, so that dS_system = dS_int + dS_ext < 0.

Now it is a misconception that only energy with frequencies in the range of thermal radiation is distributed. No upper frequency limit has been found for energy dissipation. The thermodynamic laws also apply in electrodynamics and mechanics, where they are usually neglected, with the fatal conclusion that energy can be converted into mass and, vice versa, mass into energy.
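The entropy bookkeeping described above can be put into a minimal numerical sketch. The sign convention follows Prigogine's decomposition dS_system = dS_int + dS_ext; the numbers themselves are purely illustrative:

```python
# Entropy bookkeeping for an open system (illustrative values):
# internal entropy production is non-negative (2nd law),
# exported entropy is negative when the environment absorbs entropy.
d_s_int = 0.4    # entropy produced inside the system
d_s_ext = -1.0   # entropy exported to the environment

d_s_system = d_s_int + d_s_ext

assert d_s_int >= 0            # 2nd law for the internal production
assert abs(d_s_ext) > d_s_int  # more entropy exported than produced
print(f"dS_system = {d_s_system:+.1f}")  # negative: internal order can grow
```

Only when the export term outweighs the internal production can the system's total entropy fall, which is the condition for structure formation stated in the text.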

What fusion research has not considered

For more than 70 years, many physicists have dreamed of generating energy based on the principle of the sun, but all experiments show that you have to put more energy into it than you get as radiation. Obviously the principle by which the sun works is not understood, because it is believed that the sun’s body consists of hydrogen, which under high pressure would fuse to form helium, whereby a loss of mass would result that would be converted into radiant energy. Mass loss? This contradicts the conservation laws.

The sun moves in a huge stream of plasma around the center of the Milky Way and does not get its fuel, hydrogen, from inside but from outside. Hydrogen is the most common element in all active galaxies, as spectroscopic investigations show. Consequently, the nuclear fusion that produces the massive amount of radiant energy cannot take place inside the sun, because then it would have exploded long ago like a hydrogen bomb. The radiant energy is released at the transition from the corona to the chromosphere, where there is a temperature gradient of three orders of magnitude over a few kilometers, and where the hydrogen is supplied to the sun from the outside. According to its spectral class, the solar body should consist of a Ca-ion melt on the surface, which carries a positive charge, which is why the electron shells are torn from the hydrogen atoms and the protons are decelerated sharply, leading to a traffic jam with mass collisions. The non-fused protons are thrown back toward the heliopause as the solar wind, which ionizes further gas atoms near the earth; these form the negative and positive Van Allen radiation belts in the earth's magnetic field.

Physicists attempt to impart very high thermal energy to the light atomic nuclei to be fused, in order to overcome the repulsive electrostatic forces between the atoms. They think they need 300 million to 1 billion degrees. However, this means that the internal entropy is greatly increased compared to the external entropy, so |dS_ext| ≪ dS_int. But this completely contradicts the requirement for building an internal order, which is what nuclear fusion at the atomic level amounts to. So if nuclear fusion is to succeed, one has to ensure that most of the electrons of the atomic shells are sucked away, and then the atoms have to be slowed down so that a mass collision occurs. It is not the level of temperature that matters, but the temperature gradient and the electric field, so that protons can combine with the few remaining electrons to form larger ions. The electron vortices shrink by three orders of magnitude compared to the electron shell in order to give off the enormous radiant energy and thus become core electrons, which we always identify in the beta decay of the nucleus. This has nothing to do with a mystical conversion of mass into energy.

This shows that the laws of dynamics hold over many orders of magnitude. The problem lies in the division of academic physics into chairs that act like independent principalities, distinguish themselves from each other and fight for their autonomy. In doing so, they serve not society but a scholastic belief.


1 Montgomery Childs – The SAFIRE Project ,

2 Mathias Hüfner – Der Kosmos im Lichte der Systemtheorie; system.pdf

3 Ottaviano Fabrizio Mossotti: Sur les forces qui régissent la constitution intérieure des corps, aperçu pour servir à la determination de la cause et des lois de l’action moléculaire. Turin 1836.

4 L. Smolin – Three Roads To Quantum Gravity;

5 Pope Pius X – Encyclical a Pascendi Dominici gregis; Vatican September 8, 1907

6 Jean de Climont – The Worldwide List of Alternative Theories and Critics ; id=KnzBDjnGIgYC&redir_esc=y

7 23_papstpiusxiiakceptierturbang as creationnovember1951_wdr5.mp3

8 M. Hüfner – Modern astrophysics meets engineering sciences; meet-auf-ingenieurwissenschaften-mathias-huefner-9783752628067

9 Ilya Prigogine and Isabelle Stenger – Order out of Chaos; -org.pdf

Dr. Mathias Hüfner is a German volunteer translator for The Thunderbolts Project. He studied physics from 1964 until 1970 in Leipzig, Germany, specializing in analytical measurement technology for radioactive isotopes. He then worked at Carl Zeiss Jena until 1978 on the development of laser microscope spectral analysis, where he was responsible for software development for the evaluation of the spectral data. Later he did his doctorate at the Friedrich Schiller University in the field of engineering and worked there for 15 years as a scientific assistant. Some years after the political changes in East Germany, he worked as a freelance computer science teacher for the last few years before his retirement.

Since 2015, Mathias has run a German website of The Thunderbolts Project. His latest book is entitled Dynamic Structures in an Open Cosmos.

The ideas expressed in Thunderblogs do not necessarily express the views of T-Bolts Group Inc. or The Thunderbolts Project.


