Friday, October 26, 2007

Entropy - the Bane of Programmers

In my last post we traced the early history of the science of steam engine building as a progression of mysterious fluids. We began with Johann Joachim Becher’s mysterious phlogiston of 1667, which was replaced in 1783 by Lavoisier’s mysterious caloric, and we ended with a mention of Rudolph Clausius’ mysterious fluid called entropy in 1850. Today we will carry on with entropy. As an IT professional, you might find all these mysterious fluids rather amusing, but I would like to remind you that you spend all day working with another mysterious fluid we call information. Information flows through our computer systems on a 24x7 basis, and when it stops flowing, we all get into a lot of trouble rather quickly. When I left United Airlines about 5 years ago, we lost about $150/second when the system went down because customers could not book flights. And I bet that some of you IT folks on Wall Street can easily lose $100,000/second without much trouble. So let’s take our mysterious fluids seriously. The people working on steam engines in the 18th and 19th centuries certainly did. By the way, have you ever stopped to wonder what information is? I mean, you work with it all day long – right? Strange as it might sound, we will soon see how the struggles of steam engine designers in the 19th century led to the concept of information in physics, so please be patient and try to continue learning things from our counterparts in the Industrial Revolution.

In 1842, Julius Robert Mayer unknowingly published the first law of thermodynamics in the May issue of Annalen der Chemie und Pharmacie using experimental results done earlier in France. In this paper, Mayer was the first to propose that there was a mechanical equivalent of heat. Imagine two cylinders containing equal amounts of air. One cylinder has a heavy movable piston supported by the pressure of the confined air, while the other cylinder is completely closed off. Now heat both cylinders. The French found that the cylinder with the movable piston had to be heated more than the closed-off cylinder to raise the temperature of the air in both cylinders by some identical amount, like 10 °F. Mayer proposed that some of the heat in the cylinder with the movable piston was converted into mechanical work to lift the heavy piston, and that was why it took more heat to raise the temperature of the air in that cylinder by 10 °F. This is what happens in the cylinders of your car when burning gasoline vapors cause the air in the cylinders to expand and push the pistons down during the power stroke. Mayer proposed there was a new mysterious fluid at work that we now call energy, and that it was conserved. That means that energy can change forms, but that it can be neither created nor destroyed. So for the cylinder with the heavy movable piston, chemical energy in coal is released when it is burned and transformed into heat energy; the resulting heat energy then causes the air in the cylinder to expand, which then lifts the heavy piston. The end result is that some of the heat energy is converted into mechanical energy. Now you might think that, with Mayer’s findings, steam engine designers at long last had some idea of what was going on in steam engines! Not quite. Mayer’s idea of a mechanical equivalent of heat was not well received at the time because Mayer was a medical doctor and was considered an outsider of little significance by the scientific community of the day.
The sudden loss of two of his children in 1848 and the rejection of his ideas by some of the most prestigious physicists of the time led Mayer to an attempted suicide on May 18, 1850, after which Mayer was committed to an insane asylum.

About the same time another outsider, James Prescott Joule, was doing similar experiments. Joule was the manager of a brewery and an amateur scientist on the side. Joule was investigating the possibility of replacing the steam engines in his brewery with the newly invented electric motor. This investigation ended when Joule discovered that it took about five pounds of battery zinc to do the same work as a single pound of coal. But during these experiments, Joule came to the conclusion that there was an equivalence between the heat produced by an electrical current in a wire, like in a toaster, and the work done by an electrical current in a motor. In 1843, he presented a paper to the British Association for the Advancement of Science in which he announced that it took 838 ft-lbs of mechanical work to raise the temperature of a pound of water by 1 °F (1 Btu). In 1845, Joule presented another paper, On the Mechanical Equivalent of Heat, to the same association in which he described his most famous experiment. Joule used a falling weight connected to a paddle-wheel in an insulated bucket of water via a series of ropes and pulleys to stir the water in the bucket. He then measured the temperature rise of the water as the weight, suspended by a rope connected to the paddle-wheel, descended and stirred the water. The temperature rise of the water was then used to calculate the mechanical equivalent of heat that equated Btus of heat to ft-lbs of mechanical work. But as with Mayer, Joule’s work was entirely ignored by the physicists of the day. You see, Mayer and Joule were both outsiders challenging the accepted caloric theory of heat.

All this changed in 1847 when physicist Hermann Helmholtz published On the Conservation of Force in which he referred to the work of both Mayer and Joule and proposed that heat and mechanical work were both forms of the same conserved force we now call energy. Finally, in 1850, Rudolph Clausius formalized this idea as the first law of thermodynamics in On the Moving Force of Heat and the Laws of Heat which may be Deduced Therefrom.

The modern statement of the first law of thermodynamics reads as:

dU = dQ – dW

This equation simply states that a change in the internal energy (dU) of a closed system, like a steam engine, is equal to the amount of heat flowing into the system (dQ), minus the amount of mechanical work that the system performs (dW). Thus steam engines take in heat energy from burning coal and convert some of the heat energy into mechanical energy, exhausting the rest as waste heat.
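To make the bookkeeping concrete, here is a minimal Python sketch of the first law; the heat and work figures below are invented purely for illustration and are not measurements from any real engine:

```python
def internal_energy_change(heat_in, work_out):
    """First law of thermodynamics for a closed system:
    dU = dQ - dW, the change in internal energy equals the heat
    flowing in minus the mechanical work the system performs."""
    return heat_in - work_out

# Suppose a cylinder absorbs 1000 J of heat from burning coal while
# its piston performs 300 J of mechanical work lifting a weight.
dU = internal_energy_change(heat_in=1000.0, work_out=300.0)
print(dU)  # 700.0 J stays behind as increased internal energy of the gas
```

The closed-off cylinder in Mayer’s thought experiment corresponds to work_out = 0, which is exactly why it needed less heat for the same temperature rise.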

Remember how Carnot thought that the efficiency of a steam engine only depended upon the temperature difference between the steam from the boiler and the temperature of the room in which the steam engine was running? Carnot believed that when caloric fell through this temperature difference, it produced useful mechanical work. Clausius reformulated this idea with a new concept he called entropy, described in his second law of thermodynamics. Clausius reasoned that there had to be some difference in the quality of different energies. For example, there is a great deal of heat energy in the air in the room you are currently sitting in. According to the first law of thermodynamics, it would be possible to build an engine that converts the heat energy in the air into useful mechanical energy and exhausts cold air out the back. With such an engine, you could easily build a refrigerator that produced electricity for your home and cooled your food for free at the same time. All you would need to do would be to hook up an electrical generator to the engine, and then run the cold exhaust from the engine into an insulated compartment! Clearly this is impossible. As Clausius phrased it, "Heat cannot of itself pass from a colder to a hotter body".

The second law of thermodynamics can be expressed in many ways. One of the most useful goes back to Carnot’s original idea of the maximum efficiency of an engine and the temperature differences between the steam and room temperature. It can be framed quantitatively as:

Maximum efficiency of an engine = 1 – Tc/Th

where Tc is the temperature of the cold reservoir into which heat is dissipated and Th is the temperature of the hot reservoir from which heat is obtained. For a steam engine, Th corresponds to the temperature of the steam and Tc to the temperature of the surrounding room, with both temperatures measured on the absolute Kelvin scale. So Carnot’s proposal was correct. As the temperature difference between Tc and Th gets larger, Tc/Th gets smaller, and the efficiency of the engine increases toward the value “1”, or 100%. In this form of the second law, entropy represents the amount of useless energy: energy that cannot be turned into useful mechanical work and remains as useless waste heat.
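As a quick numerical sketch, the Carnot limit is a one-line function in Python; the temperatures below are illustrative choices, not figures from any particular engine:

```python
def carnot_efficiency(t_cold, t_hot):
    """Maximum possible efficiency of any heat engine operating between
    a hot reservoir at t_hot and a cold reservoir at t_cold, with both
    temperatures in kelvins: 1 - Tc/Th."""
    if t_hot <= t_cold:
        raise ValueError("t_hot must be greater than t_cold")
    return 1.0 - t_cold / t_hot

# Saturated steam at atmospheric pressure (373 K) exhausting into
# a 293 K room:
print(carnot_efficiency(293.0, 373.0))  # about 0.21, or 21%
```

Even a perfect, frictionless engine running between these two temperatures must throw away nearly 80% of its heat, which gives a feel for why the real engines of Watt’s day did so much worse.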

Later, in 1865, Clausius presented the most infamous version of the second law at the Philosophical Society of Zurich as:

The entropy of the universe tends to a maximum.

What he meant here was that spontaneous changes tend to smooth out differences in temperature, pressure, and density. Hot objects cool off, tires under pressure leak air, and the cream in your coffee will stir itself if you are patient enough. A car will spontaneously become a pile of rust, but a pile of rust will never spontaneously become a car. Spontaneous changes cause an increase in entropy, and entropy is just a measure of the degree of this smoothing-out process. So as the entropy of the universe constantly increases with each spontaneous change, the universe tends to run downhill with time. Later we will see that entropy is also a measure of the disorder of a system at the molecular level. In a sense, Murphy’s law is just a popular expression of the second law.
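This smoothing-out tendency is easy to watch in a toy simulation. The sketch below is a version of the Ehrenfest urn model, a standard teaching device from statistical mechanics rather than anything from Clausius himself; the molecule count and number of steps are arbitrary choices:

```python
import random

# Two connected chambers start in a highly ordered, low-entropy state:
# all 1000 molecules in the left chamber. At each step one randomly
# chosen molecule hops to the other chamber.
random.seed(42)
N = 1000
left = N
for _ in range(100_000):
    # the chosen molecule is in the left chamber with probability left/N
    if random.random() < left / N:
        left -= 1
    else:
        left += 1

# The imbalance smooths itself out toward the 500/500 maximum-entropy
# split, and once there the system never spontaneously returns to 1000/0.
print(left)
```

After equilibrium is reached, the count only jitters within a band of roughly the square root of N around 500; a spontaneous return to the ordered 1000/0 state is so improbable that it would never be seen in the lifetime of the universe.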

The first and second laws of thermodynamics laid the foundation of thermodynamics which describes the bulk properties of matter and energy. The science of thermodynamics finally allowed steam engine designers to understand what was going on in the cylinders of a steam engine by relating the pressures, temperatures, volumes, and energy flows of the steam in the cylinders while the engine was running. This ended the development of steam engines by trial and error, and the technological craft of steam engine building finally matured into a science.

In softwarephysics we apply these same basic ideas of thermodynamics to the macroscopic behavior of software at the program level. The macroscopic behavior of a program can be viewed as the functions the program performs, the speed with which those functions are performed, and the stability and reliability of its performance, just as the macroscopic behavior of steam in a steam engine can be defined by pressure, temperature, and volume changes. This is the viewpoint of software from the perspective of IT management and end-users. They don’t really care what is going on inside of software; they are only interested in how software behaves. From this perspective, it is well known that the entropy of software tends to spontaneously increase with time; software tends to run downhill. Whenever we change software, there is a very good chance that things will run amok; that is why IT has developed such elaborate change management procedures. But even when we don’t change software, we still seem to get into trouble. I began the very first post on softwarephysics with the following paragraph:

Have you ever wondered why your IT job is so difficult? Have you ever noticed that whenever we change software, performance can drastically decline? Have you observed that performance can drastically decline even when we don’t change software; that sometimes applications spontaneously get slow and then spontaneously return to normal response times without any intervention? Have you noticed that 50% of the time we never find a root cause for problems, and that we just start bouncing things at random until performance improves? Have you ever wondered why software behaves this way? Is there anything we can do about all this?

My contention is that these observed behaviors of software are due in part to a simulation of the second law of thermodynamics and the natural tendency for the entropy of software to increase with time at the program level. For deeper insights into this phenomenon of software behavior, we will need to drill down to a lower level and turn to another effective theory of physics called statistical mechanics. Statistical mechanics was developed during the last half of the 19th century and allows us to derive the thermodynamic laws of matter and energy, outlined above, by viewing matter as a large collection of molecules in constant random motion.

Comments are welcome at

To see all posts on softwarephysics in reverse order go to:

Steve Johnston

Friday, October 19, 2007

Computer Science as a Technological Craft

A few posts back I described how James Watt made improvements to the Newcomen steam engine that raised the energy efficiency of steam engines from 1% to 3% and in the process sparked the Industrial Revolution. The reason for exploring the history of steam engines from an IT perspective is two-fold. First of all, it will lead us to some very applicable concepts in the form of the second law of thermodynamics and the interplay of entropy (disorder) and information at the coding level; secondly, it is an interesting story of a technological craft developing into a science. A technological craft is a collection of skills and techniques gained through trial and error that is passed down through the generations. A prime example is early metallurgy. People began to smelt copper about 3800 B.C. in Iran. A thousand years later, in 2800 B.C., the Sumerians in Iraq learned how to combine copper and tin into a very hard and durable alloy called bronze - the distinguishing hallmark of civilization. Around 1500 B.C. the Hittites began to work with iron, which at first was softer than bronze, but which made possible the discovery in 1000 B.C. that when iron was reheated with charcoal, it formed a very hard alloy we now call steel. Over many thousands of years of trial and error, people learned many useful metallurgical techniques, like how to harden metal by quenching hot metal in water and then to temper the brittle quenched metal by reheating it and allowing the metal to slowly anneal. Over thousands of years, mankind developed many impressive metallurgical skills and practices, but during all this time, nobody really had any idea of what was going on in the metal during all these intricate manipulations. Acquiring technology by trial and error is a very slow process indeed. Today, metallurgy has evolved into a branch of the material sciences and studies metals at the atomic level within crystal lattices.
A modern metallurgist might use x-ray diffraction patterns, electron microscopy, neutron bombardment, or a mass spectrometer to figure out what is going on in a new alloy under study.

Although IT and computer science are populated by very intelligent and gifted people, I would like to suggest that computer science is still an emerging technological craft on the threshold of becoming a true science. In the current state of affairs, even software engineering is little more than the formal instructional framework of a guild. It outlines the procedures to follow to develop and maintain software without dealing with the ultimate nature of software itself. It’s like describing the formal procedures to heat treat the blade of a sword without going into the nature of the steel in the blade itself. In the modern world, civil engineers would find it very difficult indeed to design bridges without knowledge of the tensile strength of structural steel. That is where softwarephysics can be of help to software engineers by providing a theory of software behavior, just as physics comes to the aid of civil engineers by offering a model for the behavior of steel under load.

In 1979, when I transitioned from being an exploration geophysicist to become an IT professional, I left a diverse exploration team exploring for oil in the Gulf of Suez to become a programmer in Amoco’s IT department. An exploration team is a multidisciplinary group consisting of geologists, geophysicists, petrophysicists, geochemists, and paleontologists. Oil companies throw all the science they can muster at trying to figure out what is going on in a prospective basin before they start spending lots of money drilling holes, just as it is prudent to try to understand the internal nature of an alloy before trying to improve it. When I moved into Amoco’s IT department, I came into contact with many talented and intelligent people, but I was dismayed to discover that there was little sharing of ideas from the other sciences like I had found in my old exploration teams. It seemed that computer science was totally isolated from the outside scientific world. I realized that computer science was a very young science at the time, and that it was more of a technological craft than a science, but that was nearly 30 years ago! It’s time for computer science to learn from the other sciences! Fortunately, we are beginning to see this happen with the Biologically Inspired Computing (BIC) community in the computer science departments of many universities. The BIC community is, at long last, bringing ideas from biology, physics, and biochemistry into mainstream computer science and, ultimately, IT.

Now back to steam engines. Watt did not know that he had increased the energy efficiency of steam engines because he had never even heard of the term energy. The concept of energy did not arrive on the scene until 1850 when Rudolph Clausius published the first law of thermodynamics. But Watt did know that his engine used about 1/3 the coal of a standard Newcomen steam engine with the same horsepower. Unfortunately, the science of the day was not of great help to 18th century steam engine designers. They had to deal with some very poor effective theories at the time. There was a lot of confusion about the nature of heat in those days. In 1667, Johann Joachim Becher published the phlogiston theory of combustion. The phlogiston theory was an effective theory that proposed that flammable substances such as coal contained a mysterious substance called phlogiston. When coal burned, it released phlogiston to the air. The ash that remained behind was the residual "dephlogisticated" form of coal, while the fumes from the burning coal were "phlogisticated air". Air could only hold so much phlogiston before it became saturated, and that was why a candle could be snuffed out by an overturned glass. The phlogiston theory of combustion held sway until it was replaced by the caloric theory of heat by Antoine Lavoisier in 1783. Lavoisier proposed that when coal burned, it really combined with the newly discovered gas found in air called oxygen and released another mysterious substance called caloric. Caloric was the “substance of heat” and always flowed from hot bodies to cold bodies. Naturally, the hot flue gases from the burning coal expanded as they took up caloric, making hot air balloons possible. During all this time, Daniel Bernoulli had proposed an effective theory of heat that is still in force today, known as the kinetic theory of heat. In 1738, Bernoulli proposed that gases are really composed of a very large number of molecules bouncing around in all directions.
Gas pressure in a cylinder was simply the result of a huge number of molecular impacts from individual gas molecules against the walls of a cylinder, and heat was just a measure of the kinetic energy of the molecules bouncing around in the cylinder. But the kinetic theory of heat was not held in favor in Watt’s day, and unfortunately for early steam engine designers, none of these theories of heat related heat energy to mechanical energy, so Watt and the other 18th century steam engine designers were pretty much on their own, without much help from the available science of the day. In the absence of a useful scientific model for energy, steam engine building in the 18th century became a technological craft, just as computer science has become a technological craft lacking a practical model for software behavior.

This all changed in 1824 when Sadi Carnot published a small book entitled Reflections on the Motive Power of Fire, which was the first scientific treatment of steam engines. In this book, Carnot asked lots of questions. Carnot wondered if a lump of coal could do an infinite amount of work: “Is the potential work available from a heat source potentially unbounded?” He also wondered if the material that the steam engine was made of or the fluid used by the engine made any difference: “Can heat engines be in principle improved by replacing the steam by some other working fluid or gas?” And he came to some very powerful conclusions. Carnot proposed that the efficiency of a steam engine only depended upon the difference between the temperature of the steam used by the engine and the temperature of the room in which the steam engine was running. He envisioned a steam engine as a sort of caloric waterfall, with the difference between the temperature of the steam from the boiler and the temperature of the room the steam engine was in being the height of the waterfall. Useful work was accomplished by the steam engine as caloric fell from the high temperature of the steam to the lower temperature of the room, like water falling over a waterfall doing useful work on a paddlewheel. It did not matter what the steam engine was made of or what fluid was used.

"The motive power of heat is independent of the agents employed to realize it; its quantity is fixed solely by the temperatures of the bodies between which is effected, finally, the transfer of caloric."

“In the fall of caloric the motive power evidently increases with the difference of temperature between the warm and cold bodies, but we do not know whether it is proportional to this difference.”

“The production of motive power is then due in steam engines not to actual consumption of the caloric but to its transportation from a warm body to a cold body.”

Later we will see that Carnot’s ideas were really an early expression of the second law of thermodynamics.

Although we no longer have much confidence in the caloric theory of heat, and now favor the kinetic theory of heat in its place, Carnot was able to make a great contribution to the science of steam engines and thermodynamics by creating an abstract model of steam engines that provided a direction for thought. However, Reflections on the Motive Power of Fire attracted little attention in 1824 and quickly went out of print. Carnot’s ideas were later revived by work done by Lord Kelvin in 1848 and by Rudolph Clausius in 1850, when Clausius published the first and second laws of thermodynamics in On the Moving Force of Heat and the Laws of Heat which may be Deduced Therefrom. In a similar manner, softwarephysics attempts to provide an abstract model for the behavior of software that provides a direction for thought.

So what happened to Sadi Carnot? In 1832, Carnot died in a cholera epidemic at the age of 36, a victim of the miasma theory of disease described in an earlier post. Unfortunately, because of the fear of choleric miasma, many of his writings were also buried along with him, leaving only a few additional surviving scientific writings.

Next time we will continue on to define the first and second laws of thermodynamics and encounter another strange mysterious substance called entropy – the true bane of all programmers.


Steve Johnston

Thursday, October 04, 2007

So Why Are There No Softwarephysicists?

I started working on softwarephysics in 1979 and I have struggled with this question for many years. In my last post, I proposed the idea of thinking of both software and money as being virtual substances. This brought to mind an old high school experience of mine from 1969. During the last semester of my senior year, while I and all of my fellow classmates were well established in our well deserved senior slump, I came across one of those teachers who remain with you for the rest of your life. In this very last semester of high school, we all had to take a mandatory course in economics, in which, from the onset, I totally had no interest, having already mentally departed for the University of Illinois to study physics. However, to my surprise, this teacher totally won me over on the very first day of class with an introduction to the course that went something like this:

We have had money in circulation for several thousand years. We have had domestic and international trade for several thousand years. We have had governments collecting taxes, tariffs, and tribute for several thousand years. We have had banks and money lending for several thousand years. Writing and mathematics have been with us for several thousand years too, and they were invented primarily so that we could do accounting, which has also been with us for several thousand years. And all of these things had life or death consequences for the people of the time. Wars were fought over these issues. Kingdoms and civilizations rose and fell over these issues. All of human history was shaped by these issues. So the question is, dear student, how come there were no economists until the 18th century? Everything that modern economists study and work with had been around for several thousand years, and yet there were no economists! Why? Scottish economist Adam Smith is credited with inventing economics when he published The Wealth of Nations in 1776. Why did that take several thousand years? What changed? What changed was a way of thinking brought about by Galileo, Descartes, Spinoza, and Newton. As Thomas Paine put it, it was The Age of Reason. The Enlightenment of the 18th century brought on by the Scientific Revolution of the 17th century created a worldview capable of contemplating economic theories. In this worldview, the Universe was at last understandable and rational - it followed physical laws and so too could economic activities.

It has been 66 years since the Age of Software began in the spring of 1941 when Konrad Zuse completed his Z3 computer. But in all that time, I have never seen a theory for software behavior, outside of softwarephysics, that depicted software with invariant tangible physical properties and a set of underlying principles or laws that govern those properties. I have never seen an economic theory of software. Yet deep down I am convinced that nearly all IT professionals really do think of software as a virtual substance. At times we curse software, at other times we cajole software, but at all times we are obsessed with software. I started programming in the fall of 1972 when I took the obligatory course in FORTRAN programming that nearly all physics majors took at the time, and ever since it has been the same story no matter where I go or what I do. I have programmed on punched cards, punched tape, magnetic tape and disk drives using no editor, line editors, full-screen editors, and CASE editors. I have used 3rd generation languages, 4th generation languages, CASE languages, compiled languages and interpreted languages and it has always been predictably the same. When I start working on some code, it’s like nothing has really changed in 35 years. It’s all the same problems over and over, day in and day out. In physics, we would say that software is homogeneous, isotropic, and time invariant. That means that software is the same no matter where you go, where you look, or where you find yourself in time. It’s always the same thing over and over.

Such symmetries have profound implications in modern physics. Emmy Noether was a brilliant mathematical genius who, like Einstein before her, fled Nazi Germany in 1933. In 1918, she published Noether’s Theorem, which has become a fundamental tenet in theoretical physics. Her theorem states that there is a one-to-one correspondence between the conservation laws and the symmetries of nature. For example, let’s suppose you do a simple high school physics experiment like colliding two billiard balls on a pool table. Now move the whole contraption 100 miles east and do the same exact experiment. You get the same results. That means the results are symmetric under a spatial translation. Noether showed that this translational symmetry implied the law of the conservation of momentum – the bad stuff that happens when you run into a parked car. Now rotate the pool table 180° and repeat the experiment. Again you get the same result. Symmetry under rotation implies the conservation of angular momentum – why a skater speeds up when she pulls in her arms in a spin. Do the same experiment a month later, and again you will get the same results. The symmetry over time implies the conservation of energy. So the fact that software is homogeneous, isotropic, and time invariant just screams out for some underlying laws at work. Something has to be going on!
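As a hypothetical illustration of what “the same results” means, here is the billiard-ball experiment as a one-dimensional elastic collision in Python; the masses and speeds are made up, but the conservation checks at the end hold for any values, at any position, orientation, or time:

```python
def elastic_collision_1d(m1, v1, m2, v2):
    """Final velocities after a 1-D elastic collision
    (the standard textbook result)."""
    v1f = ((m1 - m2) * v1 + 2.0 * m2 * v2) / (m1 + m2)
    v2f = ((m2 - m1) * v2 + 2.0 * m1 * v1) / (m1 + m2)
    return v1f, v2f

m1, v1 = 0.17, 2.0   # a 170 g cue ball moving at 2 m/s
m2, v2 = 0.17, 0.0   # an identical ball at rest
v1f, v2f = elastic_collision_1d(m1, v1, m2, v2)

# Conservation of momentum and of kinetic energy, the two quantities
# Noether tied to translational symmetry and time symmetry:
p_before = m1 * v1 + m2 * v2
p_after = m1 * v1f + m2 * v2f
ke_before = 0.5 * (m1 * v1**2 + m2 * v2**2)
ke_after = 0.5 * (m1 * v1f**2 + m2 * v2f**2)
print(p_before, p_after)    # equal: momentum is conserved
print(ke_before, ke_after)  # equal: kinetic energy is conserved
```

With equal masses the cue ball stops dead and the target ball moves off at the original 2 m/s, the familiar pool-table result.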

Next time I really will pick up again with our steam engine designers and thermodynamics. I am a little hesitant to get deeper into physics, since there is the chance of losing some of you, given the very low level of popularity of physics with the general public. But I am going to forge ahead with as little math as possible and try to stick to the main ideas of physics from an IT perspective. My old physics department had a plaque on the wall that read “I understand the material; I just can’t do the problems”. I have frequently thought a more fitting plaque would have been “I can do the problems; I just don’t understand the material.” In my opinion, one of the major failings of teaching physics in this country has been an overemphasis on problem solving. After all, most physics students never end up being professional physicists, but the underlying concepts of physics can be understood by most people and can be used by all on a daily basis to improve their lives.

So where are all the softwarephysicists? Why, you’re one of them! You just don’t know it yet.


Steve Johnston