Monday, October 04, 2021

How Much Does Your Software Weigh? - the Equivalence of Mass, Energy and Information

Is software real? By that, I mean: is software a real tangible thing subject to physical laws? Coming to grips with that idea is one of the hurdles that most IT professionals must face when first introduced to softwarephysics. In Is Information Real?, I explained that all forms of Information are real tangible things because Einstein's Special Theory of Relativity (1905) requires that Information cannot travel faster than the speed of light through spacetime in order to preserve causality. This is the same limitation that the Special Theory of Relativity places on all forms of matter and energy. Since software is just a form of Information, software must be a real thing too - just as real as matter and energy.

However, in this posting, I would like to cover some breakthrough work by Dr. Melvin Vopson of the University of Portsmouth that further supports the reality of Information and software. In the paper below, Dr. Vopson proposes that all forms of Information have mass and store an equivalent amount of energy.

The mass-energy-information equivalence principle
https://aip.scitation.org/doi/10.1063/1.5123794

In 1905, Albert Einstein established the equivalence of mass and energy in his Special Theory of Relativity with the famous equation:

E = m c²

where:
E = the energy in Joules
m = the mass in kilograms
c = the speed of light = 3 x 10⁸ meters/second
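
If you would like to check the numbers yourself, here is a minimal Python sketch (the variable names are my own) that evaluates Einstein's equation for one kilogram of mass:

c = 3.0e8        # the speed of light in meters/second
m = 1.0          # one kilogram of mass
E = m * c**2     # the equivalent energy in Joules
print(f"E = {E:.2e} Joules")   # prints E = 9.00e+16 Joules

So one kilogram of mass is equivalent to about 9 x 10¹⁶ Joules of energy - roughly the energy released by a very large thermonuclear bomb.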

You can read an English translation of Einstein's famous On the Electrodynamics of Moving Bodies in which he first introduced the world to the Special Theory of Relativity at:

http://www.fourmilab.ch/etexts/einstein/specrel/www/

The first few sections are very enlightening and not that difficult.

Similarly, in the above paper Dr. Melvin Vopson extends Einstein's equivalence of mass and energy with an equivalence of mass, energy and Information! In the paper, Dr. Melvin Vopson derives the mass of a single bit of Information as:

m_bit = k_b T ln(2) / c²

where:
m_bit = the mass of one bit of Information in kilograms
k_b = Boltzmann's constant = 1.38064852 × 10⁻²³ Joules/K
T = the absolute temperature of your software in K = 300 K for a room temperature of about 80 °F
ln(2) = 0.6931471806
c = the speed of light = 3 x 10⁸ meters/second

This comes to a value of 3.19 x 10⁻³⁸ kilograms/bit at a room temperature of 300 K (about 80 °F). For example, since software is a form of Information, let's calculate the mass of a Chrome or Edge browser viewing this posting. The Windows Task Manager says that it takes about 500 MB of memory to do that, so let's see how much 500 MB of software weighs. First, we must convert the 500 MB to bytes and then to bits:

500 MB = 500 x 1024 x 1024 bytes x 8 bits/byte = 4,194,304,000 bits

Then we can convert the 4,194,304,000 bits to kilograms by multiplying by 3.19 x 10⁻³⁸ kilograms/bit to get a value of 1.34 x 10⁻²⁸ kilograms. Since the mass of a single electron is 9.10938356 × 10⁻³¹ kilograms, the mass of 500 MB of software is equal to about the mass of 147 electrons! This has nothing to do with the physical mass of the medium storing the software. The mass equal to 147 electrons comes from the Information itself and adds to the mass of the medium storing the running 500 MB of software. In fact, in the above paper, Dr. Vopson proposes that his mass-energy-information equivalence principle could be tested by comparing the mass of a blank 1 TB disk drive with that of a 1 TB disk drive completely full of data. The mass of the full drive should be greater by about 2.5 x 10⁻²⁵ kilograms. Measuring such a small increase in mass would be challenging, but Dr. Vopson suggests that an apparatus using a very sensitive interferometer, like the LIGO device that measures gravitational waves, might do the trick.
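
Here is a short Python sketch that reproduces the above arithmetic using the constants defined earlier (I have assumed the decimal convention of 1 TB = 8 x 10¹² bits for the disk drive estimate):

import math

k_b = 1.38064852e-23    # Boltzmann's constant in Joules/K
T = 300.0               # room temperature in K
c = 3.0e8               # the speed of light in meters/second

# Dr. Vopson's mass of a single bit of Information at temperature T
m_bit = k_b * T * math.log(2) / c**2
print(f"m_bit = {m_bit:.3e} kilograms/bit")      # about 3.19e-38 kg/bit

# The mass of a 500 MB browser process
bits = 500 * 1024 * 1024 * 8                     # 4,194,304,000 bits
m_browser = bits * m_bit
m_electron = 9.10938356e-31                      # the mass of an electron in kg
print(f"500 MB of software = {m_browser:.3e} kg "
      f"= about {m_browser / m_electron:.0f} electrons")

# The mass increase of a completely full 1 TB disk drive
m_disk = 8.0e12 * m_bit
print(f"1 TB of data = {m_disk:.2e} kg")         # about 2.5e-25 kg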

Could Dark Matter Simply Be Made of Information?
The term "Dark Matter" was first coined by Swiss astronomer Fritz Zwicky in 1933. Zwicky was observing the galaxies at the periphery of the Coma galactic cluster and calculated that the galaxies on the edge of the cluster were orbiting the cluster much too quickly. Zwicky suggested that the mass inside of the Coma Cluster had to be much greater than the mass of the galaxies found within the Coma Cluster. He called it Dunkle Materie or Dark Matter. Similarly, in the 1970s Vera Rubin made a similar observation by observing galactic rotation curves.

Figure 1 – In the 1970s, Vera Rubin observed that the stars near the edges of galaxies were orbiting the centers of their galaxies much too quickly. The orbital velocity of the stars orbiting a galaxy should drop off like the orbital velocities of the planets about our Sun. But Vera Rubin found that the orbital velocities of galactic stars remained fairly constant as she moved away from the center of the galaxy. It seemed that the galactic stars were embedded in a much larger body of Dark Matter.

Figure 2 – Additional evidence for Dark Matter comes from observations of gravitational lensing of galaxies behind massive galactic clusters. For example, above we see the gravitational lensing of several galaxies that are behind the Abell 2218 galactic cluster. The lensed galaxies behind the cluster appear as curved streaks in the above photograph. Given the curvature of the lensed galaxies, Einstein's General Theory of Relativity (1915) predicts that the Abell 2218 galactic cluster should contain much more mass than can be seen in the cluster. The extra mass must come from Dark Matter.

Additionally, researchers who are trying to simulate the early evolution of our Universe with software seem to require Dark Matter in order to produce computer simulations that look like the Universe does today.

Figure 3 – The estimates vary, but today most cosmologists think that our Universe is composed of about 74% vacuum Dark Energy, 21% Dark Matter and only 5% of the Normal Matter we are familiar with - atoms composed of protons, neutrons and electrons bathed in a sea of very light neutrinos.

People have been searching for this elusive Dark Matter for over 40 years to no avail. Huge detectors filled with all sorts of liquids have been built in hopes of capturing Dark Matter particles as they pass through, and physicists have searched for the creation of Dark Matter particles at the LHC at CERN. But none have been found.

In the daring preprint paper below, Dr. Melvin Vopson proposes that the missing Dark Matter responsible for about 80% of the matter in our Universe might just arise from the Information stored by the other 20% - the Normal Matter composed of atoms containing protons, neutrons and electrons, all bathed in a sea of very light neutrinos!

The information content of the Universe and the implications for the missing Dark Matter
https://www.researchgate.net/publication/333751969_The_information_content_of_the_Universe_and_the_implications_for_the_missing_Dark_Matter

In the abstract for this paper Dr. Melvin Vopson proposes:

In this framework, it is shown that all the informational content of the baryonic matter in the Universe could be stored in bits of information, each having a mass of 2.91 x 10⁻⁴⁰ kg at the average 2.73 K temperature of the Universe. It is estimated that around 52 x 10⁹³ bits would be sufficient to account for all the missing Dark Matter in the visible Universe.

Now 52 x 10⁹³ bits of Information is a lot of Information, but in Is the Universe a Quantum Computer? we saw that some researchers, like Seth Lloyd at MIT, think of our Universe as a vast quantum computer calculating how to behave, and that would certainly require a good deal of Information. This is a fascinating paper that all IT professionals should read. Just imagine: as you produce software on a daily basis, you just might also be creating additional Dark Matter to help run our Universe.
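
As a quick sanity check on the numbers quoted in the abstract, here is a small Python sketch that recomputes the mass of one bit at the 2.73 K average temperature of the Universe and the total mass implied by 52 x 10⁹³ bits:

import math

k_b = 1.38064852e-23    # Boltzmann's constant in Joules/K
c = 3.0e8               # the speed of light in meters/second
T_cmb = 2.73            # the average temperature of the Universe in K

# The mass of one bit of Information at 2.73 K
m_bit_cmb = k_b * T_cmb * math.log(2) / c**2
print(f"m_bit at 2.73 K = {m_bit_cmb:.2e} kg")   # about 2.9e-40 kg

# The total Dark Matter mass implied by 52 x 10**93 bits
bits = 52e93
print(f"implied Dark Matter mass = {bits * m_bit_cmb:.2e} kg")

The tiny difference from the 2.91 x 10⁻⁴⁰ kg quoted in the abstract comes from the rounded value of c used here.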

The Derivation of Dr. Melvin Vopson's Equation
If proven true, Dr. Melvin Vopson's equation for the mass of Information may one day be as famous as Einstein's E = m c², so you might be interested in how he derived it. It all started back in 1961 when Rolf Landauer was working at IBM on the minimum amount of energy required to do computations. Surprisingly, he discovered that the problem was not with creating bits of information; the problem was with erasing them, because erasing bits of information in the real world is an irreversible process. Recall that the second law of thermodynamics requires that the total entropy S_tot of the Universe must always increase whenever an irreversible process is performed. For more on the second law of thermodynamics and irreversible processes see Entropy - the Bane of Programmers and The Demon of Software. Rolf Landauer showed that whenever a bit of Information is erased, the following amount of energy must be released in the form of heat:

ΔQ = k_b T ln(2)

where:
k_b = Boltzmann's constant = 1.38064852 × 10⁻²³ Joules/K
T = the absolute temperature of the erased bit in K
ln(2) = 0.6931471806
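
Plugging in a room temperature of 300 K gives the Landauer limit referenced later in this posting - a quick Python check:

import math

k_b = 1.38064852e-23    # Boltzmann's constant in Joules/K
T = 300.0               # room temperature in K

# Landauer's minimum heat released when one bit of Information is erased
dQ = k_b * T * math.log(2)
print(f"heat released per erased bit = {dQ:.3e} Joules")   # about 2.87e-21 J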

This result implied that Information is a real tangible thing and has become known as Landauer's principle. For more on that see the Wikipedia article:

Landauer's principle
https://en.wikipedia.org/wiki/Landauer%27s_principle

Rolf Landauer used the concepts of Information and entropy developed by Claude Shannon in 1948 to do this. For a more detailed explanation of Claude Shannon's concepts of Information and entropy see Some More Information About Information. Recall that in Quantum Computing and the Many-Worlds Interpretation of Quantum Mechanics I explained how Hugh Everett used ideas borrowed from Claude Shannon's concepts of Information and entropy in his original 137-page Jan 1956 draft Ph.D. thesis to develop the Many-Worlds Interpretation of quantum mechanics that is now so popular with the people working on quantum computers.

Simply stated, Claude Shannon suggested that the amount of Information or entropy S_info in a message is equal to the amount of "surprise" in the message. For example, a digital message containing all 0s or all 1s has no surprise in it at all and thus carries no Information or entropy S_info at all. However, a digital message containing a nice mixture of 0s and 1s does contain some surprise because you never can tell what the next bit will bring. Such a message contains Information, and the amount of Information in the message can actually be calculated if you know the probabilities of finding a 0 or a 1 in the message. In general, the numbers of 0s and 1s in a long digital message consisting of many binary bits will be about the same. That means that the probabilities that a particular bit will be a 0 or a 1 are both about 50%. In such a case, Claude Shannon calculated that the entropy of the message, expressed in thermodynamic units, would be:

S_info = N k_b ln(2)

where:
N = the number of binary bits in the message
k_b = Boltzmann's constant = 1.38064852 × 10⁻²³ Joules/K
ln(2) = 0.6931471806

When we write a single bit, N = 1, and we obtain:

ΔS_bit = k_b ln(2)
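
To make Claude Shannon's idea of "surprise" concrete, here is a small Python sketch (the function name is my own) that computes the Shannon entropy of a binary message and then converts it to thermodynamic units with the formula above:

import math
from collections import Counter

k_b = 1.38064852e-23    # Boltzmann's constant in Joules/K

def shannon_entropy(message):
    """The Shannon entropy of a message in bits per symbol."""
    n = len(message)
    counts = Counter(message)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# A message of all 0s carries no surprise and no Information
print(shannon_entropy("00000000"))    # 0.0 bits per symbol

# A 50/50 mixture of 0s and 1s carries one full bit of surprise per symbol
message = "01101001"
print(shannon_entropy(message))       # 1.0 bits per symbol

# Expressed in thermodynamic units for the 50/50 case: S_info = N k_b ln(2)
S_info = len(message) * k_b * math.log(2)
print(f"S_info = {S_info:.2e} Joules/K")   # 7.66e-23 Joules/K for N = 8 bits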

Figure 4 – This figure is from Dr. Melvin Vopson's paper and shows that when you erase the byte for the letter "i" as shown in a), by setting all of the bits to 1 as shown in b) or to 0 as shown in d), you set the Information content, also known as the information entropy S_info of the byte, to zero because there no longer is any "surprise" in it. The same thing is achieved if you physically destroy the byte by burning it in a hot fire that demagnetizes the storage device so that it no longer contains any 1s or 0s at all, as shown in c).

So when you apply an irreversible erase process to a bit of Information, you reduce the information entropy S_info of the bit to zero:

S_info = 0

Consequently, the irreversible erase process causes a change ΔS_info, which is the difference between the final and initial entropies:

ΔS_info = S_info(final) - S_info(initial)

But S_info(final) = 0 after we erase the bit, so we end up with:

ΔS_info = 0 - S_info(initial) = - S_info(initial)

Thus, the irreversible erase process produces a negative change in information entropy of - S_info(initial). But the second law of thermodynamics demands that the total entropy S_tot of the Universe can never decrease for an irreversible process, so ΔS_tot must be ≥ 0. That means that the change in the physical entropy ΔS_physical of the device storing the bit must be ≥ 0:

ΔS_tot = ΔS_physical + ΔS_info ≥ 0
ΔS_tot = ΔS_physical - S_info(initial) ≥ 0
ΔS_physical = ΔS_tot + S_info(initial) ≥ 0
Thus ΔS_physical ≥ 0

Thermodynamics also relates a change in entropy to a flow of heat:

ΔS = ΔQ / T

which means that the change in entropy ΔS of a system is equal to the heat flow ΔQ into or out of the system divided by the absolute temperature T of the system. Rearranging terms we then have:

ΔQ = ΔS T

Above we showed that when we erase a bit of Information the physical entropy of the storage device must increase: ΔS_physical ≥ 0. Substituting that fact into the formula above, we see that ΔQ / T ≥ 0, which means that ΔQ ≥ 0 and that some heat energy must leave the physical recording device when the bit of Information is erased. Using the above equations:

ΔQ = ΔS T
ΔS_bit = k_b ln(2)

We finally obtain Rolf Landauer's famous equation for the heat energy released when a binary bit is erased:

ΔQ = k_b T ln(2)

Initially, there was some resistance to Landauer's principle and his contention that Information was a physically real thing. But amazingly, beginning in 2012 and continuing over the years, several research teams have actually measured the amount of heat energy that is released when a single bit of Information is erased and found that Landauer's principle holds. This means that the concept of Information as an actual real physical thing has merit and is not simply something that we made up. Erasing one bit of Information at a room temperature of 300 K (about 80 °F) yields 2.87 x 10⁻²¹ Joules of heat energy.

But in Dr. Melvin Vopson's paper, he wonders how this 2.87 x 10⁻²¹ Joules of energy is stored when the bit of Information is first created. Since the energy can persist indefinitely until the bit of Information is erased, it has to be somewhere. Dr. Vopson proposes that the energy must be stored as an increase in mass-energy:

ΔE_m = ΔQ = k_b T ln(2)

From Einstein's famous equation we have:

ΔE = Δm c²
Δm = ΔE / c²

Substituting the mass-energy of the written bit, ΔE = k_b T ln(2), into the above equation, we finally get Dr. Melvin Vopson's equation:

Δm = k_b T ln(2) / c²
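
If you have Python's sympy library installed, you can have it perform the same substitution symbolically. Here is a minimal sketch of the derivation above:

import sympy as sp

k_b, T, c, dQ, dm = sp.symbols('k_b T c Delta_Q Delta_m', positive=True)

# Landauer: the heat released when one bit of Information is erased
landauer = sp.Eq(dQ, k_b * T * sp.log(2))

# Einstein: Delta_E = Delta_m * c**2, with Delta_E = Delta_Q
einstein = sp.Eq(dQ, dm * c**2)

# Solve the two equations for the mass of one bit
solution = sp.solve([landauer, einstein], [dQ, dm])
print(solution[dm])    # the mass of one bit: k_b*T*log(2)/c**2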

Figure 5 – This figure is also from Dr. Melvin Vopson's paper and is a pictorial explanation of the above. It shows that when a bit is erased, its mass m_bit goes to zero and ΔQ = k_b T ln(2) of heat energy is released. When either a 0 or a 1 is written to the bit, its mass becomes m_bit = k_b T ln(2) / c².

Figure 6 – This figure is also from Dr. Melvin Vopson's paper and shows how Einstein's mass-energy equivalence found in E = m c² can be extended to a mass-energy-information triad equivalence relationship.

Conclusion
Dr. Melvin Vopson's mass-energy-information equivalence principle is a very exciting proposition. I certainly hope that one day it can be experimentally validated. One can only wonder if it also applies to the Dark Energy that makes up most of our Universe today. Could the Dark Energy just be the Dark Information of the vacuum energy of the Universe? For me, such thoughts add weight to my hunch that the Universe and Multiverse are forms of self-replicating mathematical information as I outlined in What’s It All About? and What's It All About Again?.

Comments are welcome at scj333@sbcglobal.net

To see all posts on softwarephysics in reverse order go to:
https://softwarephysics.blogspot.com/

Regards,
Steve Johnston
