Saturday, October 30, 2021

Can We Build A Machine-Based Intelligence That Is Not Self-Destructive?

In The Need to Cultivate a Machine-Based Morality, I suggested that we needed to imbue Advanced AI hardware and software with a sense of morality based on the fruits of the 18th-century Enlightenment and the 17th-century Scientific Revolution that have freed human beings from many of the very nasty behaviors of the past. But is that really possible? Softwarephysics is all about the power of self-replicating information to have agency, the ability to do things, but without a Mind to make moral judgments about the things it does. In A Brief History of Self-Replicating Information and many other postings, I explained that all forms of self-replicating information have to be a little bit nasty in order to survive in a Universe dominated by the second law of thermodynamics and nonlinearity. The need to be just a little bit nasty seems to arise naturally from the Darwinian processes of inheritance, innovation and natural selection at work. All of our current Machine Learning methodologies repeatedly apply mutations to inherited models that try to explain large quantities of data and then test to see which mutated models do better. Since these methodologies generate algorithms by means of the same Darwinian processes of inheritance, innovation and natural selection, can we possibly build an advanced machine-based Intelligence that is not nasty in nature too? In Is Self-Replicating Information Inherently Self-Destructive?, we saw that the urge to self-replicate at all costs necessarily leads to forms of self-replicating information that outstrip their resource base through positive feedback loops until none is left. Will advanced machine-based Intelligences do the same?
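To make the Darwinian loop at the heart of these Machine Learning methodologies a little more concrete, below is a minimal sketch in Python of an evolutionary search. It is not any particular production Machine Learning algorithm; the toy data, the simple (slope, intercept) models and all of the parameters are made up for illustration. But it does show inheritance (the surviving models are copied), innovation (the copies are mutated) and natural selection (only the models that best explain the data survive) working together.

import random

# Toy data the "models" must explain: y = 3x + 7 with a little noise.
data = [(x, 3.0 * x + 7.0 + random.gauss(0, 0.1)) for x in range(20)]

def fitness(model):
    """Lower is better: sum of squared errors of a (slope, intercept) model."""
    slope, intercept = model
    return sum((slope * x + intercept - y) ** 2 for x, y in data)

def mutate(model, rate=0.1):
    """Innovation: copy the parent model and apply small random mutations."""
    return tuple(gene + random.gauss(0, rate) for gene in model)

# Start with a population of random models that know nothing about the data.
population = [(random.uniform(-10, 10), random.uniform(-10, 10)) for _ in range(50)]

for generation in range(200):
    # Natural selection: keep the half of the population that best explains the data.
    population.sort(key=fitness)
    survivors = population[:25]
    # Inheritance with innovation: the survivors replicate with mutations.
    population = survivors + [mutate(random.choice(survivors)) for _ in range(25)]

print("Best surviving model (slope, intercept):", population[0])

After a couple of hundred generations, the best surviving model typically converges on values close to the slope of 3 and intercept of 7 hidden in the toy data, without anybody ever telling the models what those values were.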

Additionally, in The Role of Multilevel Selection in the Evolution of Software we discussed Edward O. Wilson's theory that eusocial behavior in a species arises through a combination of individual and group selection. Wilson is the world's expert on myrmecology, the study of ants, and he became one of the founding fathers of sociobiology, the explanation of social behaviors in terms of evolutionary biological thought, when he published his book Sociobiology: The New Synthesis (1975). In The Social Conquest of Earth (2012), Wilson presents a new theory developed by Martin Nowak, Corina Tarnita and himself for the rise of eusocial behavior in species by means of a multilevel selection process that operates on both individuals and entire groups of individuals in a manner that promotes social behavior. Wilson also contends that humans are loosely eusocial in nature because they usually form oligarchical societies based on a hierarchical organization. These human oligarchical societies are also very tribal in nature, and it is that tribal nature that has caused most of our troubles over the course of human history. If you look at the world today, you will see that nearly all human conflict arises from the tribal thoughts and behaviors of the participants. It has always been about the Good Guys versus the Bad Guys, and we all seem to see ourselves in the Good Guys tribe. For more on eusocial behavior see:

Eusociality
https://en.wikipedia.org/wiki/Eusociality

The article begins with:
Eusociality (from Greek eu "good" and social), the highest level of organization of sociality, is defined by the following characteristics: cooperative brood care (including care of offspring from other individuals), overlapping generations within a colony of adults, and a division of labor into reproductive and non-reproductive groups. The division of labor creates specialized behavioral groups within an animal society which are sometimes referred to as 'castes'. Eusociality is distinguished from all other social systems because individuals of at least one caste usually lose the ability to perform at least one behavior characteristic of individuals in another caste.

which seems to be an apt description of the human condition, except for the part about some members of the species losing the ability to reproduce. For more on that see Oligarchiology and the Rise of Software to Predominance in the 21st Century. The question then becomes: do all forms of Intelligence, including Machine Intelligences, also naturally tend to adopt a form of eusocial behavior that unfortunately leads to the undesirable side effect of tribal conflict? Could machine-based tribal conflict also lead to self-extinction? Could this be another factor in explaining Fermi's Paradox?

Fermi’s Paradox - If the universe is just chock full of intelligent beings, why do we not see any evidence of their existence?

In Why Do Carbon-Based Intelligences Always Seem to Snuff Themselves Out?, I suggested that the Darwinian processes of inheritance, innovation and natural selection require several billion years of theft and murder to bring forth a carbon-based form of Intelligence and that carbon-based Intelligences do not seem to be able to turn off the theft and murder in time to save themselves from self-extinction. The tribal conflict that arises from the eusocial behavior of carbon-based Intelligences certainly adds to this tendency for self-destruction. Could the same happen to eusocial machine-based Intelligences too? In Cloud Computing and the Coming Software Mass Extinction and The Origin and Evolution of Cloud Computing - Software Moves From the Sea to the Land and Back Again, we saw that the SaaS (Software as a Service) level of the Cloud Computing Platform is currently producing applications that are composed of a large collection of Cloud Microservices that team together like the workers in a eusocial ant colony.

Figure 17 – The Cloud Microservices in the SaaS (Software as a Service) level of the Cloud Computing Platform already team together like the workers in a eusocial ant colony to produce modern applications. Could this eusocial behavior of software be a forewarning of eusocial tribal behavior by future machine-based Intelligences?

That would certainly explain why we do not currently find ourselves knee-deep in self-replicating von Neumann probes stuffed with alien software.

Comments are welcome at scj333@sbcglobal.net

To see all posts on softwarephysics in reverse order go to:
https://softwarephysics.blogspot.com/

Regards,
Steve Johnston

Monday, October 04, 2021

How Much Does Your Software Weigh? - the Equivalence of Mass, Energy and Information

Is software real? By that, I mean: is software a real, tangible thing subject to physical laws? Coming to grips with the idea that software really is a real thing subject to physical law is one of the hurdles that most IT professionals must face when first introduced to softwarephysics. In Is Information Real?, I explained that all forms of Information are real, tangible things because Einstein's Special Theory of Relativity (1905) requires that Information cannot travel faster than the speed of light through spacetime in order to preserve causality. This is the same limitation that the Special Theory of Relativity places on all forms of matter and energy. Since software is just a form of Information, software must be a real thing too, just as real as matter and energy.

However, in this posting, I would like to cover some breakthrough work by Dr. Melvin Vopson of the University of Portsmouth that further supports the reality of Information and software. In the paper below, Dr. Melvin Vopson proposes that all forms of Information have mass and also store an amount of energy equivalent to that mass.

The mass-energy-information equivalence principle
https://aip.scitation.org/doi/10.1063/1.5123794

In 1905, Albert Einstein established the equivalence of mass and energy in his Special Theory of Relativity with the famous equation:

E = m c²

where:
E = energy in Joules
m = the mass in kilograms
c = the speed of light ≈ 3 × 10⁸ meters/second

You can read an English translation of Einstein's famous On the Electrodynamics of Moving Bodies in which he first introduced the world to the Special Theory of Relativity at:

http://www.fourmilab.ch/etexts/einstein/specrel/www/

The first few sections are very enlightening and not that difficult.

In the above paper, Dr. Melvin Vopson extends Einstein's equivalence of mass and energy to an equivalence of mass, energy and Information! He derives the mass of a single bit of Information as:

mbit = kb T ln(2) / c²

where:
mbit = the mass of one bit of Information in kilograms
kb = Boltzmann's constant = 1.38064852 × 10⁻²³ Joules/K
T = the temperature of your software in absolute K = 300 K for a room temperature of 80 °F
ln(2) = 0.6931471806
c = the speed of light ≈ 3 × 10⁸ meters/second

This comes to a value of 3.19 × 10⁻³⁸ kilograms/bit at a room temperature of 300 K (80 °F). For example, since software is a form of Information, let's calculate the mass of a Chrome or Edge browser viewing this posting. The Windows Task Manager says that it takes about 500 MB of memory to do that, so let's see how much 500 MB of software weighs. First, we must convert the 500 MB to bytes and then to bits:

500 MB = 500 x 1024 x 1024 x 8 = 4,194,304,000 bits

Then we can convert the 4,194,304,000 bits to kilograms by multiplying by 3.19 × 10⁻³⁸ kilograms/bit to get a value of 1.34 × 10⁻²⁸ kilograms. Since the mass of a single electron is 9.10938356 × 10⁻³¹ kilograms, that means that the mass of 500 MB of software is equal to about the mass of 147 electrons! Now the mass of 500 MB of software being equal to the mass of 147 electrons has nothing to do with the physical mass of the medium storing the software. The mass equal to 147 electrons comes from the mass of the Information itself and adds to the mass of the medium storing the running 500 MB of software. In fact, in the above paper, Dr. Melvin Vopson proposes that his mass-energy-information equivalence principle could be tested by comparing the mass of a blank 1 TB disk drive and the mass of a 1 TB disk drive that is completely full of data. The mass of the 1 TB disk drive should increase by about 2.5 × 10⁻²⁵ kilograms. Measuring such a small increase in mass would be challenging, but Dr. Melvin Vopson suggests that an apparatus using a very sensitive interferometer like the LIGO device that measures gravitational waves might do the trick.
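If you would like to check this arithmetic for yourself, here is a minimal Python sketch of the calculation. It only reproduces the back-of-the-envelope numbers above; the 500 MB figure is just the rough Task Manager reading, and the 1 TB drive is assumed to hold 10¹² bytes.

import math

k_b = 1.38064852e-23      # Boltzmann's constant in Joules/K
T = 300.0                 # room temperature in K
c = 2.998e8               # speed of light in meters/second

# Dr. Melvin Vopson's mass of one bit of Information at temperature T.
m_bit = k_b * T * math.log(2) / c**2
print(f"Mass of one bit at {T} K: {m_bit:.2e} kg")           # about 3.19e-38 kg

# Mass of a 500 MB browser process.
bits_500MB = 500 * 1024 * 1024 * 8
m_electron = 9.10938356e-31                                   # mass of an electron in kg
m_browser = bits_500MB * m_bit
print(f"500 MB of software: {m_browser:.2e} kg, or about "
      f"{m_browser / m_electron:.0f} electrons")              # about 147 electrons

# Mass gained by a 1 TB disk drive that is completely full of data.
bits_1TB = 1e12 * 8
print(f"1 TB of data: {bits_1TB * m_bit:.2e} kg")             # roughly 2.5e-25 kg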

Could Dark Matter Simply Be Made of Information?
The term "Dark Matter" was first coined by the Swiss astronomer Fritz Zwicky in 1933. Zwicky observed the galaxies at the periphery of the Coma galactic cluster and calculated that the galaxies on the edge of the cluster were orbiting the cluster much too quickly. Zwicky suggested that the mass inside of the Coma Cluster had to be much greater than the mass of the galaxies found within it. He called the missing mass Dunkle Materie, or Dark Matter. In the 1970s, Vera Rubin made a similar finding by measuring galactic rotation curves.

Figure 1 – In the 1970s, Vera Rubin observed that the stars near the edges of galaxies were orbiting the centers of their galaxies much too quickly. The orbital velocities of the stars should drop off with distance from the galactic center, like the orbital velocities of the planets about our Sun. But Vera Rubin found that the orbital velocities of galactic stars remained fairly constant as she moved away from the center of a galaxy. It seemed that the galactic stars were embedded in a much larger body of Dark Matter.

Figure 2 – Additional evidence for Dark Matter comes from observations of gravitational lensing of galaxies behind massive galactic clusters. For example, above we see the gravitational lensing of several galaxies that are behind the Abell 2218 galactic cluster. The lensed galaxies behind the cluster appear as curved streaks in the above photograph. Given the curvature of the lensed galaxies, Einstein's General Theory of Relativity (1915) predicts that the Abell 2218 galactic cluster should contain much more mass than can be seen in the cluster. The extra mass must come from Dark Matter.

Additionally, researchers who are trying to simulate the early evolution of our Universe with software seem to require Dark Matter in order to produce computer simulations that look like the Universe does today.

Figure 3 – The estimates vary, but today most cosmologists think that our Universe is composed of about 74% vacuum Dark Energy, 21% Dark Matter and only about 5% Normal Matter of the kind we are familiar with: atoms composed of protons, neutrons and electrons bathed in a sea of very light neutrinos.

Researchers have been searching for this elusive Dark Matter for over 40 years to no avail. They have built huge detectors filled with all sorts of liquids in hopes of capturing Dark Matter particles as they pass through, and they have looked for the creation of Dark Matter particles at the LHC at CERN. But none have been found.

In the daring preprint paper below, Dr. Melvin Vopson proposes that the missing Dark Matter that is responsible for about 80% of the matter in our Universe might just arise from the Information being stored by the 20% of matter that we call Normal Matter composed of atoms containing protons, neutrons and electrons that are all bathed in a sea of very light neutrinos!

The information content of the Universe and the implications for the missing Dark Matter
https://www.researchgate.net/publication/333751969_The_information_content_of_the_Universe_and_the_implications_for_the_missing_Dark_Matter

In the abstract for this paper Dr. Melvin Vopson proposes:

In this framework, it is shown that all the informational content of the baryonic matter in the Universe could be stored in bits of information, each having a mass of 2.91 × 10⁻⁴⁰ kg at the average 2.73 K temperature of the Universe. It is estimated that around 52 × 10⁹³ bits would be sufficient to account for all the missing Dark Matter in the visible Universe.

Now 52 × 10⁹³ bits of Information is a lot of Information, but in Is the Universe a Quantum Computer? we saw that some researchers, like Seth Lloyd at MIT, think of our Universe as a vast quantum computer calculating how to behave, and that would certainly require a good deal of Information. This is a fascinating paper that all IT professionals should read. Just imagine: as you produce software on a daily basis, you just might also be creating additional Dark Matter to help run our Universe.
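As a quick sanity check on the numbers quoted from the abstract, the short Python sketch below recomputes the mass of a single bit of Information at the 2.73 K average temperature of the Universe and then multiplies it by the proposed 52 × 10⁹³ bits to see how much mass all of that Information would carry. The bit count is simply the figure quoted from Dr. Melvin Vopson's preprint.

import math

k_b = 1.38064852e-23   # Boltzmann's constant in Joules/K
c = 2.998e8            # speed of light in meters/second
T_universe = 2.73      # average temperature of the Universe in K

# Mass of one bit of Information at the average temperature of the Universe.
m_bit_cold = k_b * T_universe * math.log(2) / c**2
print(f"Mass of one bit at {T_universe} K: {m_bit_cold:.2e} kg")   # about 2.91e-40 kg

# Total mass implied by the 52 x 10^93 bits proposed in the preprint.
n_bits = 52e93
print(f"Implied mass of all that Information: {n_bits * m_bit_cold:.1e} kg")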

The Derivation of Dr. Melvin Vopson's Equation
If proven true, Dr. Melvin Vopson's equation for the mass of Information may one day be as famous as Einstein's E = m c², so you might be interested in how he derived it. It all started back in 1961 when Rolf Landauer was working at IBM on the minimum amount of energy required to do computations. Surprisingly, he discovered that the problem was not with creating bits of information but with erasing them, because erasing a bit of information in the real world is an irreversible process. Recall that the second law of thermodynamics requires that the total entropy Stot of the Universe must always increase whenever an irreversible process is performed. For more on the second law of thermodynamics and irreversible processes see Entropy - the Bane of Programmers and The Demon of Software. Rolf Landauer showed that whenever a bit of Information is erased, the following amount of energy has to be released in the form of heat:

ΔQ = kb T ln(2)

where:
kb = Boltzmann's constant = 1.38064852 × 10⁻²³ Joules/K
T = the temperature of the bit that is erased in absolute K
ln(2) = 0.6931471806

This implied that Information was a real tangible thing and has become known as Landauer's principle. For more on that see the Wikipedia article:

Landauer's principle
https://en.wikipedia.org/wiki/Landauer%27s_principle

Rolf Landauer used the concepts of Information and entropy developed by Claude Shannon in 1948 to do this. For a more detailed explanation of Claude Shannon's concepts of Information and entropy see Some More Information About Information. Recall that in Quantum Computing and the Many-Worlds Interpretation of Quantum Mechanics I explained how Hugh Everett used similar ideas borrowed from Claude Shannon's concepts of Information and entropy in his original 137-page January 1956 draft Ph.D. thesis to develop the Many-Worlds Interpretation of quantum mechanics that is now so popular with the people working on developing quantum computers. Simply stated, Claude Shannon suggested that the amount of Information or entropy Sinfo in a message is equal to the amount of "surprise" in the message. For example, a digital message containing all 0s or all 1s has no surprise in it at all and thus carries no Information or entropy Sinfo. However, a digital message containing a nice mixture of 0s and 1s does contain some surprise because you never can tell what the next bit will bring. Such a message contains Information, and the amount of Information in the message can actually be calculated if you know the probabilities of finding a 0 or a 1 in the message. In general, the number of 0s and 1s in a long digital message consisting of many binary bits will be about the same. That means that the probabilities that a particular bit will be a 0 or a 1 are both about 50%. In such a case, Claude Shannon calculated that the entropy of the message would be:

Sinfo = N kb ln(2)

where:
N = the number of binary bits in the message
kb = Boltzmann's constant = 1.38064852 × 10⁻²³ Joules/K
ln(2) = 0.6931471806

When we write a single bit, N = 1, and we then obtain:

ΔSbit = kb ln(2)
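To make Claude Shannon's notion of "surprise" a little more concrete, here is a minimal Python sketch that computes the Shannon Information of a short message of bits and then converts it into thermodynamic units using the factor kb ln(2). The sample messages are made up for illustration; the point is simply that a message of all 0s carries no Information at all, while a well-mixed message of 0s and 1s carries about one bit of Information per binary digit.

import math
from collections import Counter

k_b = 1.38064852e-23   # Boltzmann's constant in Joules/K

def shannon_bits(message):
    """Total Shannon Information of a bit string in bits: N * sum(p * log2(1/p))."""
    counts = Counter(message)
    n = len(message)
    return n * sum((count / n) * math.log2(n / count) for count in counts.values())

for message in ["00000000", "01101001"]:
    bits = shannon_bits(message)
    s_info = bits * k_b * math.log(2)   # convert bits of surprise to Joules/K
    print(f"{message}: {bits:.1f} bits of surprise, Sinfo = {s_info:.2e} Joules/K")

which prints 0.0 bits of surprise for the message of all 0s and 8.0 bits of surprise for the well-mixed message of eight binary digits.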

Figure 4 – This figure is from Dr. Melvin Vopson's paper and shows that when you erase the byte for the letter "i" shown in a), either by setting all of the bits to 1 as shown in b) or to 0 as shown in d), you set the Information content of the byte, also known as its information entropy Sinfo, to zero because there no longer is any "surprise" in the byte. The same thing is achieved if you physically destroy the byte by burning it in a hot fire that demagnetizes the device storing it, so that the byte no longer contains any 1s or 0s at all, as shown in c).

So when you apply an irreversible erase process to a bit of Information, you reduce the information entropy Sinfo of the bit to zero:

Sinfo = 0

Consequently, the irreversible erase process causes a change ΔSinfo, which is the difference between the final and initial information entropies:

ΔSinfo = Sinfo(final) - Sinfo(initial)

But Sinfo(final) = 0 after we erase the bit, so we end up with:

ΔSinfo = 0 - Sinfo(initial) = - Sinfo(initial)

Thus, the irreversible erase process produces a negative change in information entropy of - Sinfo(initial). But the second law of thermodynamics demands that the total entropy Stot of the Universe can never decrease for an irreversible process, so ΔStot must be ≥ 0. That means that the change in the physical entropy ΔSphysical of the device storing the bit must also be ≥ 0:

ΔStot = ΔSphysical + ΔSinfo ≥ 0
ΔStot = ΔSphysical - Sinfo(initial) ≥ 0
ΔSphysical = ΔStot + Sinfo(initial) ≥ 0
Thus ΔSphysical ≥ 0

The second law of thermodynamics states that:

ΔS = ΔQ / T

which means that the change in entropy ΔS of a system is equal to the heat ΔQ flowing into or out of the system divided by the absolute temperature T of the system. Rearranging terms, we then have:

ΔQ = ΔS T

Above we showed that when we erase a bit of Information the physical entropy of the storage device must increase: ΔSphysical ≥ 0. Substituting that fact into the formula for the second law of thermodynamics, we see that ΔQ / T ≥ 0, which means that ΔQ ≥ 0 and that some heat energy must leave the physical recording device when the bit of Information is erased. Using the above equations:

ΔQ = ΔS T
ΔSbit = kb ln(2)

We finally obtain Rolf Landauer's famous equation for the heat energy released when a binary bit is erased:

ΔQ = kb T ln(2)

Initially, there was some resistance to Landauer's principle and his contention that Information was a physically real thing. But amazingly, beginning in 2012 and continuing over the years, several research teams have actually measured the amount of heat energy that is released when a single bit of Information is erased and found that Landauer's principle holds. This means that the concept of Information as an actual real physical thing has merit and that it is not simply something that we made up. Erasing one bit of Information at a room temperature of 300 K (80 °F) yields about 2.87 × 10⁻²¹ Joules of heat energy.
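Here is a quick Python check of that figure, assuming the same room temperature of 300 K used above:

import math

k_b = 1.38064852e-23   # Boltzmann's constant in Joules/K
T = 300.0              # room temperature in K

# Landauer's limit: the minimum heat released when one bit of Information is erased.
delta_Q = k_b * T * math.log(2)
print(f"Heat released per erased bit at {T} K: {delta_Q:.2e} Joules")   # about 2.87e-21 Joules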

But in Dr. Melvin Vopson's paper, he wonders how this 2.87 × 10⁻²¹ Joules of energy is stored when the bit of Information is first created. Since the 2.87 × 10⁻²¹ Joules of energy can persist indefinitely until the bit of Information is erased, it has to be somewhere. Dr. Melvin Vopson proposes that the energy must be stored as an increase in mass-energy.

ΔEm = ΔQ = kb T ln(2)

From Einstein's famous equation we have:

ΔE = Δm c²
Δm = ΔE / c²

Substituting the mass-energy of the written bit ΔE = kb T ln(2) into the above equation we finally get Dr. Melvin Vopson's equation:

Δm = kb T ln(2) / c²

Figure 5 – This figure is also from Dr. Melvin Vopson's paper and is a pictorial explanation of the above. It shows that when a bit is erased its mass mbit goes to zero and ΔQ = kb T ln(2) of heat energy is released. When either a 0 or a 1 is written to the bit, its mass becomes mbit = kb T ln(2) / c².

Figure 6 – This figure is also from Dr. Melvin Vopson's paper and shows how Einstein's mass-energy equivalence found in E = m c² can be replaced by a mass-energy-information triad equivalence relationship.

Conclusion
Dr. Melvin Vopson's mass-energy-information equivalence principle is a very exciting proposition. I certainly hope that one day it can be experimentally validated. One can only wonder if it also applies to the Dark Energy that makes up most of our Universe today. Could the Dark Energy just be the Dark Information of the vacuum energy of the Universe? For me, such thoughts add weight to my hunch that the Universe and Multiverse are forms of self-replicating mathematical information as I outlined in What’s It All About? and What's It All About Again?.

Comments are welcome at scj333@sbcglobal.net

To see all posts on softwarephysics in reverse order go to:
https://softwarephysics.blogspot.com/

Regards,
Steve Johnston