Wednesday, May 07, 2008

The Fundamental Problem of Software

This posting will be relatively short because we have already done most of the heavy lifting in previous posts. By now we have amassed all the necessary softwarephysics to finally frame the fundamental problem of software in the form of the three laws of software mayhem.

The Three Laws of Software Mayhem

1. The second law of thermodynamics tends to introduce small bugs into software that are never detected through testing.

2. Because software is inherently nonlinear, these small bugs cause general havoc when they reach production.

3. But even software that is absolutely bug-free can reach a critical tipping point and cross over from linear to nonlinear behavior, with disastrous and unpredictable results, as the load on software is increased.

Again, the three laws of software mayhem are an effective theory of software behavior: they are just an approximation of reality over an effective range of software conditions and provide only a limited level of insight into the true nature of software behavior. But they do form a good starting point for a discussion of what can be done about the fundamental problem of software.

In Entropy - the Bane of Programmers we described the second law of thermodynamics as the propensity of isolated macroscopic systems to run down or depreciate with time, as proposed by Rudolf Clausius in 1850. Clausius observed that the Universe is constantly smoothing out differences. His second law of thermodynamics proposed that spontaneous changes tend to smooth out differences in temperature, pressure, and density. Hot objects cool off, tires under pressure leak air, and the cream in your coffee will stir itself if you are patient enough. Clausius defined the term entropy to measure this smoothing-out or depreciation of a macroscopic system, and with the second law of thermodynamics, proposed that entropy always increases whenever a change is made.

In The Demon of Software, we drilled down deeper still and explored Ludwig Boltzmann’s statistical mechanics, developed in 1872, in which he viewed entropy from the perspective of the microstates that a large number of molecules could exist in. For any given macrostate of a gas in a cylinder, Boltzmann defined the entropy of the system in terms of the number N of microstates that could produce the observed macrostate as:

S = k ln(N)

For example, air is about 78% nitrogen, 21% oxygen, and 1% other gases. The macrostate of finding all the oxygen molecules on one side of a container and all of the nitrogen molecules on the other side has a much lower number of microstates N than the macrostate of finding the nitrogen and oxygen thoroughly mixed together, so the entropy of a uniform mixture is much greater than the entropy of finding the oxygen and nitrogen separated. We used poker to clarify these concepts with the hope that you would come to the realization that the macrostate of going broke in Las Vegas had many more microstates than the macrostate of breaking the bank at one of the casinos.
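
To make this counting of microstates concrete, here is a minimal Python sketch of Boltzmann's formula applied to a toy gas of 100 molecules, each of which can sit in either the left or the right half of a box. The 100-molecule box is my own illustrative assumption, not something from the earlier posts, but the arithmetic is just S = k ln(N):

import math

k = 1.380649e-23   # Boltzmann's constant in joules per kelvin
n = 100            # a toy "gas" of 100 molecules, each in the left or right half of a box

# Macrostate A: every molecule crowded into the left half - only 1 microstate
microstates_separated = 1
# Macrostate B: exactly 50 molecules in each half - C(100, 50) microstates
microstates_mixed = math.comb(n, n // 2)

def boltzmann_entropy(num_microstates):
    # S = k ln(N): Boltzmann's entropy for a macrostate with N microstates
    return k * math.log(num_microstates)

print(boltzmann_entropy(microstates_separated))   # 0.0 - the perfectly ordered macrostate
print(boltzmann_entropy(microstates_mixed))       # about 9.2e-22 J/K - the mixed macrostate

The mixed macrostate wins by an astronomical margin, which is why the cream in your coffee never unstirs itself, and why breaking the bank at a casino is such an improbable macrostate.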

We also discussed the apparent paradox of Maxwell’s Demon and how Leon Brillouin solved the mystery with his formulation of useful information as the difference between the initial and final entropies of a system after a determination of the state of the system had been made.

∆I = Si - Sf
Si = initial entropy
Sf = final entropy

Since the second law of thermodynamics demands that the entropy of the Universe must constantly increase, it also implies that the total amount of useful information in the Universe must constantly decrease. That is a sobering thought for all IT professionals, since we are in the business of creating and processing useful information. Look at it this way: about 14 billion years ago our Universe began as a singularity, with all space, time, matter, and energy concentrated into a single point with a dimension of zero. So our Universe began with an entropy of zero and lots of useful information, because we knew exactly where everything was. But now there is stuff all over the place, and it is getting worse by the moment! Probable things, like buggy code, have lots of entropy and little useful information, while improbable things, like perfect code, have little entropy and lots of useful information. Please note that the idea of information in physics is still a little murky, and there are several formulations of exactly what information is. In softwarephysics, I exclusively use Brillouin’s approach because it seems to be the most applicable to IT.
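
Brillouin's formula can be turned into an equally small sketch. The scenario below, in which we determine which half of a box a single molecule occupies, is an assumption of mine purely for illustration; the point is simply that a determination which reduces the number of possible microstates yields useful information equal to the drop in entropy:

import math

k = 1.380649e-23   # Boltzmann's constant in joules per kelvin

def useful_information(initial_microstates, final_microstates):
    # Brillouin: dI = Si - Sf = k ln(Ni) - k ln(Nf)
    return k * (math.log(initial_microstates) - math.log(final_microstates))

# Finding out which half of the box the molecule occupies cuts the number of
# possible microstates from 2 down to 1, so we gain k ln(2) of useful information,
# the thermodynamic price tag of a single bit.
delta_i = useful_information(2, 1)
print(delta_i)                        # about 9.57e-24 J/K
print(delta_i / (k * math.log(2)))    # 1.0 - exactly one bit of useful information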

These concepts were incorporated into softwarephysics as the propensity for software to accumulate bugs during initial development and later on during normal maintenance activities. Since the number of “buggy” versions or microstates of a piece of software is so much larger than the number of “correct” versions or microstates of a piece of software, writing perfect code is indeed similar to breaking the bank at a casino.
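
To see just how lopsided those odds are, consider a single line of code. The numbers below, 95 printable characters and one 60-character line with exactly one correct version, are hypothetical figures of my own choosing, but they show why the "buggy" macrostate utterly swamps the "correct" macrostate:

printable_chars = 95      # roughly the printable ASCII characters a programmer can type
line_length = 60          # one hypothetical 60-character line of code

possible_versions = printable_chars ** line_length   # every possible "microstate" of that one line
correct_versions = 1                                  # assume only one version is exactly right

print(possible_versions)                      # about 4.6e118 possible lines
print(correct_versions / possible_versions)   # the odds of typing the line correctly at random

And that is just one line; a real program has thousands of them, each multiplying the odds against perfection.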

In Software Chaos we discussed the surprising eccentricities of nonlinear systems, first revealed through computer simulations and later validated through traditional experimental investigations, which led to the effective theories of chaos and complexity. Like softwarephysics, chaos theory and complexity theory essentially began as simulated sciences, first made possible by the advent of sophisticated computers. The key insight of chaos and complexity theory is that nonlinear systems are very sensitive to slight changes in initial conditions, which can lead to very dramatic and unpredictable results even when the systems are totally deterministic. This sensitivity to initial conditions gives rise to emergent behaviors that cannot be predicted by the reductionist approach so successful for linear systems. For nonlinear systems, the whole is greater than the sum of the parts, so analyzing the separate parts in detail cannot possibly completely define the system as a whole; one has to take a more holistic approach. In a sense, the second law of thermodynamics is just such an emergent behavior. The second law does not really apply to a single molecule bouncing around in a cylinder, because every macrostate then has only a single microstate and the concept of entropy has no meaning. Only as we slowly add more and more molecules to the cylinder does the second law of thermodynamics emerge. One could never derive the second law of thermodynamics by simply analyzing the collision of two molecules with each other.
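
This sensitivity to initial conditions is easy to demonstrate for yourself. The little Python sketch below uses the logistic map, a standard textbook nonlinear system that the earlier posts did not use, so take it purely as an illustration of how a completely deterministic rule can still be utterly unpredictable:

def logistic_trajectory(x0, r=4.0, steps=50):
    # Iterate the fully chaotic logistic map x -> r * x * (1 - x) from the starting value x0
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

# Two starting points that differ by only one part in a billion...
print(logistic_trajectory(0.200000000))
print(logistic_trajectory(0.200000001))
# ...land in completely different places after just 50 iterations, even though
# every step of the calculation is perfectly deterministic.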

As every IT professional knows, software too is very sensitive to small changes. That is why we have such elaborate and manpower-intensive testing and change management procedures. Imagine the impact on your development and maintenance budgets if software always worked the very first time! Wouldn’t it be wonderful if you could just write software off the top of your head, like a letter to an old friend, with no need for debugging or testing! I remember an incident from 1973 when two other graduate students and I tried to convert a 100-line FORTRAN program for an FFT (Fast Fourier Transform), listed in a textbook, into a BASIC program that we could run on our DEC PDP/8e minicomputer. Since FORTRAN and BASIC are so similar, we figured that converting the program would be a cinch, and we confidently started to punch the program into the teletype machine connected to the DEC PDP/8e using the line editor that came with the minicomputer. We were all novice programmers, each with just one class in FORTRAN programming under our belts, which made us all instant experts in the new science of computer science. So being rather naïve, we thought this would be a relatively easy task because all we had to do was translate each line of FORTRAN code into a corresponding line of BASIC, and we eagerly plunged in, programming off the top of our heads. Because we were using a BASIC interpreter, we did not have to do any compiles, but we were quite shocked when our first BASIC FFT program abended on the very first run! I mean, all we had to do was make some slight syntax changes, like asking an Englishman for the location of the nearest petrol station. After about 6 hours of intense and very frustrating labor, we finally had our BASIC FFT program spitting out the correct answer. It actually made me long for my old trusty slide rule! That was the first time I ever tried to get a computer to do something that I really wanted it to do, rather than just completing one of those pesky assignments in my FORTRAN class. I guess I mistakenly figured that the programs assigned in my FORTRAN class were especially tricky because the professor was just trying to winnow out the real losers so that they would not continue on in computer science! I was pretty young back then.

Because the first and second laws of software mayhem expose all IT professionals to a common and pervasive set of experiences, these laws have essentially been incorporated into our unconscious IT common sense, and we are largely oblivious to them. Nobody even questions the effects of the first two laws, just as people do not routinely question why things fall down. At an early age, we all learned the hard way not to jump over the railing of our cribs. But the third law of software mayhem is another thing entirely. IT professionals routinely treat the very nonlinear and highly interdependent components of a complex software architecture as a linear system that can be understood and controlled through a reductionist approach of analyzing each individual component. This is largely because the concepts of chaos and complexity theory are less than 30 years old and have not been incorporated into the Zeitgeist of our time. The idea that the whole is greater than the sum of the parts has still not penetrated our common sense worldview. The third law tells us that we need to let go of our obsession with the absolute control of the operation of software, just as aeronautical engineers had to give up their obsession with totally eliminating the effects of turbulence. Because software is nonlinear, it will always misbehave like a thunderstorm on a warm July afternoon. The best we can do is to learn how to live with software transients and minimize their effects, just as we have learned how to live with and minimize the effects of severe weather conditions.
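
One simple way to picture the third law's tipping point is the behavior of a single server under increasing load. The sketch below uses the textbook M/M/1 queueing formula T = 1/(μ - λ), which is not part of softwarephysics proper, and the service rate of 100 requests per second is a made-up figure of my own, but it shows how smoothly "linear-looking" response times suddenly explode as the load approaches capacity:

service_rate = 100.0    # mu: requests per second the server can process (a made-up figure)

def avg_response_time(arrival_rate):
    # Average time in the system for an M/M/1 queue: T = 1 / (mu - lambda)
    if arrival_rate >= service_rate:
        return float("inf")   # past the tipping point the queue grows without bound
    return 1.0 / (service_rate - arrival_rate)

for load in (10.0, 50.0, 90.0, 99.0, 99.9):
    print(load, avg_response_time(load))
# 10.0 -> ~0.011 seconds (looks comfortably linear)
# 90.0 -> 0.100 seconds
# 99.0 -> 1.000 seconds
# 99.9 -> 10.00 seconds (a less than 1% increase in load, a tenfold increase in response time)

Real production systems are far messier than an M/M/1 queue, of course, but the shape of the curve is the whole point: nearly flat, and then a cliff.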

So What Do We Do Now?
Now, after suffering through all this physics and softwarephysics, you might be thinking: “That’s it? All you came up with is three crummy laws? And these laws just tell me something I already knew, that IT is incredibly difficult!” Rest assured, I am not about to abandon you here with no hope at all. It’s just that before you can make things better, you first have to figure out what the fundamental problem is. And I don’t think we have ever really done that in IT. As far as I can tell, nobody has ever sat down and calmly stated what the fundamental problem of software is.

So let’s use the Equivalence Conjecture of Softwarephysics to help us out of this jam. If we look to the physical Universe, we now know that most systems are really nonlinear. Yes, sometimes we do catch these nonlinear systems behaving like linear systems, such as the planets revolving about the Sun, but that is the exception and not the rule. For example, during the first few hundred million years of our solar system, when it was forming out of a swirling cloud of dust and gas surrounding the solar protostar, chaos ruled. Jupiter was busy flinging planetesimals out of the nascent solar system to form the Pluto-like bodies of the Kuiper belt and the icy comets of the Oort cloud, and a planetesimal about the size of Mars collided with the primitive Earth to form the double-planet system we now know as the Earth and Moon. And we all know that the second law of thermodynamics has ruled the day from the very beginning. So the question is, are there any complex information processing systems in the physical Universe that deal well with both the second law of thermodynamics and nonlinearity? The answer is, of course, yes: living things do a marvelous job of contending with both the second law of thermodynamics and nonlinearity in this perilous Universe. Just as every programmer must assemble characters into lines of code, living things must assemble atoms into complex organic molecules in order to perform the functions of life, and because the physical Universe is largely nonlinear, small errors in these organic molecules can have disastrous effects. And all of this has to be done in a Universe bent on degenerating into a state of maximum entropy and minimum useful information content, thanks to our old friend the second law of thermodynamics. So it makes sense from an IT perspective to adopt these very successful biological techniques when developing, maintaining, and operating software, and that is what we will focus on next in softwarephysics.

Comments are welcome at scj333@sbcglobal.net

To see all posts on softwarephysics in reverse order go to:
https://softwarephysics.blogspot.com/

Regards,
Steve Johnston