Wednesday, September 08, 2010

When Toasters Fly

I just finished The Richness of Life: The Essential Stephen Jay Gould (2006), edited by Steven Rose. This was a rather lengthy, but highly interesting, compendium of the writings of the evolutionary biologist Stephen Jay Gould, primarily his monthly essays published in Natural History magazine. I like to follow the writings of evolutionary biologists because the evolution of life on Earth provides some very good insights into the historical evolution of software over the past 70 years and into its future possibilities. Living things and software are both forms of self-replicating information, which evolve by means of the Darwinian processes of inheritance, innovation and natural selection, so studying the evolution of one helps to explain the evolution of the other.

Gould is famous for several contributions to classical Darwinian thought, and necessarily for some accompanying controversies as well. Gould’s main contention is that evolution is not progressive in nature and has no predetermined direction leading to, among other things, conscious beings like ourselves. Gould is also famous for the concept of punctuated equilibrium and for the idea that natural selection may be less important than many of the other factors that influence the course of evolution over time. All of these concepts have an impact on the evolution of software as well, since, as I pointed out in The Origin of Software the Origin of Life, software needs intelligent carbon-based life to arise first as a stepping stone to its eventual exploration of a galaxy. So if the evolution of intelligent carbon-based life has a very low probability of occurring, even on a Rare Earth such as ours, then software must be quite rare in our Universe too.

In this posting, I would like to consider, from an IT perspective, some of Gould’s thoughts as they might pertain to the evolution of software. I started programming in 1972, and I have been closely following the evolution of software ever since with great fascination. Unfortunately, I missed the first thirty years of software evolution, the formative years of IT known as the Unstructured Period (1941 – 1972), but I was taught to write unstructured batch FORTRAN code on punch cards back in 1972, and I did not actually see my first structured FORTRAN program until 1975, so I did indeed get a taste of the very earliest stages of IT. Before proceeding, it might be a good idea to review the section on SoftwarePaleontology in SoftwareBiology to reacquaint yourself with the evolutionary history of software over the past 70 years.

Punctuated Equilibrium
Gould proposed the concept of punctuated equilibrium, along with Niles Eldredge, back in 1972, to resolve one of the most troublesome problems in Darwinian thought, one that goes all the way back to On the Origin of Species (1859): the apparent lack of intermediate forms in the fossil record. The objection back in 1859, and even today for creationists, is that if living things really do slowly transform from one form to another over long periods of geological time via the Darwinian processes of inheritance, innovation and natural selection, why are there no fossils left behind in the fossil record of the large number of necessary intermediate steps between discrete species? The fossil record should reflect this slow change of one species into another, so that it should be just as likely to find the fossils of an ancient fish-becoming-amphibian as those of an ancient fish or amphibian, but that is not exactly what one finds in outcrops. Instead, one generally finds fossils of distinct species suddenly appearing out of nowhere, which may then persist seemingly unchanged for many millions of years, until they finally vanish just as quickly as they first appeared in the fossil record. Darwin attributed this lack of intermediate forms to the paucity of the fossil record itself, which might have held sway back in 1859 due to the corresponding paucity of geologists at the time, but as more and more of the Earth’s surface and subsurface geology was mapped and explored during the ensuing years, this argument grew considerably weaker. Now I must add, to the dismay of creationists and all others with little confidence in Darwinian evolution, that a large number of fossils of intermediate forms really have been discovered over the years to support Darwin’s theory. The problem is not that there are none; the problem is that there should be more.

To address this problem, punctuated equilibrium maintains that the paucity of intermediate forms in the fossil record is due not to a paucity of strata, but to variations in the rate of evolutionary change over geological time. For Gould, the evolution of a new species that branches off from an older, already existing species in an isolated region is a rapid event in geological terms, occurring over a few thousand years. Once a new species has developed in isolation, it can then rapidly migrate over an extended area. Since the odds are quite small that sediments friendly to the formation of fossils were being deposited exactly in the isolated region where a new species first appeared, and exactly during the brief few thousand years in which it first developed, one generally does not find the intermediate forms in the fossil record because their fossils were never deposited in the first place. Instead, one finds the abrupt appearance of the new species in distant strata that were deposited during the period that followed the initial migration of the new species from its point of origin. Between these brief periods of new species formation, there are very long periods of stasis, during which species hardly evolve at all, and it is during these very long periods of stasis that the bulk of fossils are deposited. So over the long haul, most living things simply exist in a business-as-usual equilibrium with their environment, predators and prey, and only on occasion do they leave behind evidence of their existence in the fossil record. Only when circumstances dramatically and abruptly change do we see new species appear on the scene in a more or less geological flash. Thus, in punctuated equilibrium, species climb Richard Dawkins’ Mount Improbable (1996) in a series of discrete steps along a staircase, rather than slowly strolling up a gently rising ramp.
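For the IT-minded, the pattern of long stasis punctuated by geologically brief bursts of change is easy to simulate. The toy model below is my own illustration, not Gould’s mathematics: stabilizing selection holds a trait near an environmental optimum, and when the optimum abruptly jumps (placed here at fixed generations so the pattern is easy to see), the trait crosses the gap within a few dozen generations, a geological instant compared to the thousands of generations of stasis on either side.

```python
import random

random.seed(1)

# Toy model of punctuated equilibrium (my own illustration, not Gould's math).
# Stabilizing selection holds a trait near an environmental optimum; the
# optimum sits still for long stretches and occasionally jumps abruptly.
optimum, trait = 0.0, 0.0
history = []
for generation in range(10000):
    if generation in (3000, 7000):   # rare, abrupt environmental shifts
        optimum += 5.0
    # Selection pulls the trait a small step toward the optimum each
    # generation, plus a little mutational noise
    trait += 0.05 * (optimum - trait) + random.gauss(0.0, 0.01)
    history.append(trait)

# Stasis: the trait barely moves for thousands of generations.
# Punctuation: after each jump it crosses the gap in a few dozen generations.
print(round(history[2999], 2), round(history[3100], 2), round(history[-1], 2))
```

Sampled in the fossil record at random moments, such a lineage would almost always be caught in one of its long plateaus, with the brief transitions rarely preserved.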

The concept of punctuated equilibrium has become rather mainstream but is still not accepted by all as the complete answer. Much of the resistance to the concept stems from the historical development of geology itself. Most paleontologists began either as geologists or as biologists who wandered into the field as geological late-comers, but they all have to deal with fossils, and necessarily, with the vagaries of the geological sciences. Early in the history of geology, during the late 18th century, the paradigm of catastrophism ruled the day. Georges Cuvier was an early proponent who tried to explain the extinction patterns found in the fossil record as the result of a series of catastrophic events such as Noah’s flood. In catastrophism, geological formations such as mountains, canyons, river basins, and the strata seen in road cuts are all the result of rapid catastrophic events, like volcanic eruptions, earthquakes and massive worldwide floods. In fact, the names for the modern Tertiary and Quaternary geological periods actually come from those days! In the 18th century, it was thought that the water from Noah’s flood receded in four stages - Primary, Secondary, Tertiary and Quaternary - and each stage laid down a different kind of rock as it withdrew.

Catastrophism was eventually replaced by the uniformitarianism of James Hutton and Charles Lyell in the early 19th century. In James Hutton’s Theory of the Earth (1785) and Charles Lyell’s Principles of Geology (1830), the principle of uniformitarianism was laid down. Uniformitarianism contends that the Earth has been shaped by slow-acting geological processes that can still be observed at work today - “the present is the key to the past”. If you want to figure out how a 100 million-year-old cross-bedded sandstone came to be, just dig into a point bar on a modern-day river and take a look. Now since most paleontologists are really geologists who have specialized in studying fossils, the idea of uniformitarianism unconsciously crept into paleontology as well. Because uniformitarianism proposed that the rock formations of the Earth slowly changed over immense periods of time, it seemed that the Earth’s biosphere must also have slowly changed over this same long period of time. Uniformitarianism may be very good for describing the slow evolution of hard-as-nails rocks, but maybe it is not so good for the evolution of squishy living things that are much more sensitive to environmental changes, and consequently must quickly adapt to new conditions when they arise in order to survive. Yes, uniformitarianism may have been the general rule for the biosphere throughout most of geological time, as the Darwinian mechanisms of innovation and natural selection slowly worked upon the creatures of the Earth, but when rapid and dramatic environmental changes took place in isolated regions, catastrophism might be a better model. Still, some paleontologists subconsciously object to punctuated equilibrium because it stirs up a deep-seated aversion to any idea resembling the old catastrophism.

Exaptations and Spandrels
Gould is also famous for his concept of exaptations, the idea that nature takes advantage of pre-existing functions that evolved for one purpose but are later put to work to solve a completely different problem. As I described in Self-Replicating Information, what happens is that organisms develop a primitive function for one purpose, through small incremental changes, and then discover, through serendipity, that this new function can also be used for something completely different. This new use will then further evolve via innovation and natural selection. For example, we have all, upon occasion, used a screwdriver as a wood chisel in a pinch. Sure, the screwdriver was meant to turn screws, but it does a much better job of chipping out wood than your fingernails, so in a pinch, it will do quite nicely. Now just imagine the Darwinian processes of inheritance, innovation and natural selection at work, selecting for screwdrivers with broader and sharper blades and a butt more suitable for the blows of a hammer, and soon you will find yourself with a good wood chisel. At some distant point in the future, screwdrivers might even disappear for want of screws, leaving all to wonder how the superbly adapted wood chisels came to be. Darwin called such things a preadaptation, but Gould did not like this terminology because it had a teleological sense to it, as if a species could consciously make preparations in advance for a future need. The term exaptation avoids such confusion.

Along these lines, Gould goes on to introduce the concept of spandrels in evolutionary biology. One of the papers in The Richness of Life: The Essential Stephen Jay Gould is a 1979 paper Gould wrote with Richard Lewontin entitled The Spandrels of San Marco and the Panglossian Paradigm. In a cathedral, the spandrels are the curved areas that lie between the arches supporting the dome.

Figure 1 - A spandrel is a byproduct of the arches that hold up a dome (click to enlarge)

In the very beginning of this paper, Gould describes the elaborate artwork to be found within each spandrel of the San Marco cathedral. Gould explains that:

The design is so elaborate, harmonious and purposeful that we are tempted to view it as the starting point of any analysis, as the cause in some sense of the surrounding architecture.

He then goes on to explain that the artwork within each spandrel is just an opportunistic afterthought on the part of some bygone artist, and not a necessary structural element supporting the dome. So spandrels are simply a necessary byproduct of supporting a dome with arches, a byproduct that can nonetheless be put to good use serving other purposes. In evolutionary biology, a spandrel is any biological feature that arises in a species as a necessary side effect of producing another feature, and which is not directly selected for by natural selection. Spandrels may be unnecessary baggage just along for the ride, but they can also become exaptations that evolve into something useful too. In the essay Not Necessarily a Wing, Gould goes on to show how biological spandrels can be put to good use as exaptations. He begins with a statement of the problem.

We can readily understand how complex and fully developed structures work and how their maintenance and preservation may rely upon natural selection – a wing, an eye, the resemblance of a bittern to a branch or of an insect to a stick or dead leaf. But how do you get from nothing to such an elaborate something if evolution must proceed through a long sequence of intermediate stages, each favored by natural selection? You can’t fly with 2 percent of a wing…. How, in other words, can natural selection explain the incipient stages of structures that can only be used in much more elaborated form?

Frequently, this argument is rephrased as “What good is 2 percent of an eye? You can’t see with 2 percent of an eye, so a complex eye could never evolve by means of natural selection since it could never even get started in the first place.”

But 2 percent of an eye is much better than no eye at all. With 2 percent of an eye, you could probably detect the shadow of a predator moving overhead and quickly dodge a lethal attack. For example, even some bacteria are capable of phototaxis, meaning that they can move towards or away from light with the use of a molecular “eye” within their tiny bodies. I am now 59 years old, and I recently experienced having a 2 percent eye when I had a posterior vitreous detachment (PVD) in my right eye. This is a usually benign condition that occurs in about 75% of people as they approach their golden years. The human eye is filled with a Jello-like vitreous humor that is attached to the retina. With age, the vitreous humor begins to shrink and pull away from the retina, like Jello pulling away from the edges of a bowl. As the vitreous humor slowly collapses, it can gently pull on the retina, inducing a perceived flash of light. In my case, when the PVD occurred, I saw flashes of light when I shifted my head, and my right eye fogged up like somebody was smoking inside of my eyeball. Thankfully, the smoke quickly cleared, and within three weeks my eye was totally back to normal. Now the funny thing is that about a week prior to my PVD, on two occasions I found myself suddenly flinching and ducking in an involuntary manner while on my evening walks around the neighborhood. In both cases, I had the distinct feeling that I was under attack by a bird or a bat from overhead, so I involuntarily ducked, and then I felt very silly because there was obviously no bird or bat to be seen. So although I did not “really” see anything at the time, I believe that the onset of my PVD was beginning to stimulate my retina as my vitreous humor was about to give way, causing me to flinch uncontrollably in the process.
By the way, if you experience the symptoms of a PVD, you should immediately see an ophthalmologist to have your retina checked for tears that could possibly lead to a detached retina and resulting blindness in the affected eye if left untreated.

So 2 percent of an eye would be a useful thing indeed and could easily lead to the development of a very complex eye through small incremental changes that always made improvements to the incipient eye. Visible photons have an energy of 1 – 3 eV, which is about the energy of most chemical reactions. Consequently, visible photons are great for stimulating chemical reactions, like the reactions in chlorophyll that turn the energy in visible photons into chemical energy stored in carbohydrates, or the reactions in other light-sensitive molecules that form the basis for sight. In many creatures, the eye simply begins as a flat eyespot of photosensitive cells, a patch somewhere along the body that looks something like this: |. In the next step, the eyespot forms a slight depression, like the beginnings of the letter C, which gives the creature some sense of image directionality because the light from a distant source will hit different sections of the photosensitive cells on the back part of the C. As the depression deepens and the hole in the C gets smaller, the incipient eye begins to behave like a pinhole camera that forms a clearer, but dimmer, image on the back part of the C. Next, a transparent covering forms over the hole of the pinhole camera to protect the sensitive cells at the back of the eye, and a transparent humor fills the eye to keep its shape: C). Eventually, the transparent covering thickens into a flexible lens under the protective covering that can be used to focus light and to allow for a wider entry hole that provides a brighter image, essentially decreasing the f-stop of the eye like in a camera: C0). Computer simulations have shown that a camera-like eye can evolve in as little as 500,000 generations, which equates to perhaps a million years or less (see Figure 2).

Figure 2 – Computer simulations of the evolution of a camera-like eye (click to enlarge)
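In IT terms, simulations like the one in Figure 2 are just hill-climbing algorithms: propose a small random mutation, and let natural selection keep it only if visual acuity improves. Below is a minimal sketch of that logic in Python; the two traits (eyecup depth and aperture size) and the acuity function are toy assumptions of my own, not the model used in the published simulations.

```python
import random

def acuity(depth, aperture):
    # Toy visual acuity: a deeper eyecup gives more directionality,
    # and a narrower aperture gives a sharper pinhole image
    return depth * (1.0 / (0.1 + aperture))

def evolve_eye(generations=100000, seed=42):
    random.seed(seed)
    depth, aperture = 0.02, 1.0   # start with 2 percent of an eyecup
    for _ in range(generations):
        # Propose small random mutations to both traits
        d = min(max(depth + random.gauss(0.0, 0.001), 0.0), 1.0)
        a = min(max(aperture + random.gauss(0.0, 0.001), 0.05), 1.0)
        # Natural selection: keep the mutation only if acuity improves
        if acuity(d, a) > acuity(depth, aperture):
            depth, aperture = d, a
    return depth, aperture

depth, aperture = evolve_eye()
# Over enough generations, the eyecup deepens toward its cap while the
# aperture shrinks toward its floor, with no foresight required
print(round(depth, 2), round(aperture, 2))
```

Starting from a 2 percent eyecup, the blind accept-if-better rule alone drives the design toward a deep cup with a narrow, pinhole-like aperture, mirroring the incremental path described above.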

Now the concept of the eye has independently evolved at least 40 different times over the past 600 million years, so there are many examples of “living fossils” showing the evolutionary path. In Figure 3 below, we see that all of the steps in the computer simulation of Figure 2 can be found today in various mollusks. Notice that the human-like eye on the far right is really that of an octopus, not a human, again demonstrating the power of natural selection to converge upon identical solutions in organisms with separate lines of descent.

Figure 3 – There are many living fossils that have left behind signposts along the trail to the modern camera-like eye. Notice that the human-like eye on the far right is really that of an octopus (click to enlarge).

So it is easy to see how a 2 percent eye could evolve into a modern complex eye through small incremental changes that always improve the visual acuity of the eye. But how could a 2 percent wing be of any survival advantage at all? You need something that provides some sort of survival advantage to get things started on the road to a fully functional wing. Gould uses the concepts of exaptations and spandrels to explain how it could happen. He describes how the research of others, using models of insects composed of wire and epoxy resin, has shown that a 2 percent proto-wing does not help an insect at all with gliding or landing on its feet after a fall. It turns out that a 2 percent insect proto-wing serves no aerodynamic purpose whatsoever, so a 2 percent proto-wing would not be a good starting point, from an aerodynamic perspective, to kick-start the evolution of a fully functional wing. However, Gould also notes that research has shown that a 2 percent wing could serve as a good radiator fin for the thermal regulation of an insect, allowing the insect to either cool off or heat up as needed. As the radiator-fin-proto-wing grows in size, it becomes an ever-better radiator fin, so the Darwinian forces of innovation and natural selection could easily lead to proto-wings of ever-increasing size. However, research on models also shows that there comes a point of diminishing returns for wing size from a thermoregulation point of view, so eventually there is no selective advantage in enlarging an insect’s wing beyond a certain size. But at the same time, research on models also shows that as wings get larger and larger, they finally become more aerodynamically proficient, creating a selective advantage for gliding to a safe landing on an insect’s feet after a fall. This is truly an example of a screwdriver evolving into a wood chisel!
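The logic of this exaptation can be sketched as a toy fitness model in a few lines of Python. The particular functions and numbers below are my own illustrative assumptions, not the measurements from the wire-and-epoxy models Gould describes; the point is only that the thermal gradient is steep where the aerodynamic gradient is flat, and vice versa, so selection never stalls on the way to a flight-worthy wing.

```python
import math

# Toy fitness model of the insect proto-wing; wing size runs from 0 to 1.
# The curves and constants are my own illustrative assumptions.

def thermal_benefit(size):
    # A radiator fin helps immediately, but with diminishing returns
    return 1.0 - math.exp(-5.0 * size)

def aero_benefit(size):
    # Aerodynamics contributes essentially nothing until the wing is large
    return 1.0 / (1.0 + math.exp(-20.0 * (size - 0.5)))

def gradient(f, x, h=1e-6):
    # Numerical derivative: the selective pressure on wing size
    return (f(x + h) - f(x - h)) / (2.0 * h)

# At a 2 percent proto-wing, thermoregulation drives wing growth while
# aerodynamics is silent; at a 60 percent wing, the roles have reversed.
print(gradient(thermal_benefit, 0.02), gradient(aero_benefit, 0.02))
print(gradient(thermal_benefit, 0.60), gradient(aero_benefit, 0.60))
```

With these assumed curves, the selective gradient on a 2 percent wing comes almost entirely from thermoregulation, while on a 60 percent wing it comes almost entirely from aerodynamics: the radiator fin carries wing size into the regime where flight takes over, just as Gould describes.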

Limitations Imposed by Historical Biological Constraints
Gould also believes that the evolution of life on Earth has been greatly affected by constraints imposed by historical precedent. For example, in the essay Hooking Leviathan by Its Past, he points out that nearly all fishes move through the water by waving their tail fins back and forth horizontally. But whales, as mammals returning to the sea, do just the opposite: whales move through the water by waving their tail fins up and down vertically. Gould points out that this is a holdover from a whale’s mammalian body design. Picture in your mind the undulations of the spinal column of a cheetah running down its prey, and you can easily see a whale undulating its tail fin up and down through the water.

Evolution Is Not Progressive And Is Not Predetermined
Gould is also famous for contending that if you “rewind the tape of life” back to the very beginning and let it run forward again, you will get a completely different biosphere every time, because of the chaos induced by mass extinctions caused by incoming comets and asteroids or an overabundance of greenhouse gases in the atmosphere, the random nature of exaptations and biological spandrels, and the limitations imposed by historical biological constraints. Gould explains that if you plot the amount of life in the biosphere against complexity, you will see something like Figure 4 below.

Figure 4 - The Wall of Minimal Complexity (click to enlarge)

The bulk of the biosphere prior to the Cambrian Explosion was composed of simple single-celled prokaryotic bacteria. Bacteria run with the minimum architecture necessary to get by as living things. This gives them the ability to live in very extreme and hostile environments, and to subsist on just about any form of available energy. Complex multicellular life just cannot do that. Complex life needs a narrow and stable temperature range in which to exist, and it has very finicky dietary requirements for what it can eat and drink. Complex life just cannot sit down to a hearty dinner of hydrogen sulfide gas dissolved in water, like a can of smelly soda pop, as some bacteria can do. But even after the Cambrian Explosion, we still find that the bulk of life on Earth is composed of simple bacteria. In a sense, complex life is just to be found in the round-off error of the biosphere. For Gould, this round-off error of complex life is like a gas slowly diffusing away from what he calls the “Wall of Minimal Complexity”. Life cannot diffuse to the left of the Wall of Minimal Complexity, because then it would die, but it can diffuse slightly to the right to some extent. This model is somewhat like the behavior of the air molecules of the Earth’s atmosphere. Most air molecules find themselves down near the surface of the Earth and cannot diffuse very far into the solid Earth, which acts like a Wall of Minimal Complexity, but they can rise above the Earth’s surface. As you ascend in altitude, the number of molecules steadily decreases, until you get several hundred miles up, where they are still present but quite rare. The very few air molecules that do attain an altitude of several hundred miles do so in a very erratic and perilous manner, subject to the random whims of the Universe.
True, their inherent kinetic energy did get them all the way up there, just as natural selection will guide the way for the evolution of complex life, but the path along the way will always be different and very unpredictable for each molecule. Similarly, there will always be a number of complex species diffusing away from the Wall of Minimal Complexity, but how that complex life will look is impossible to predict because of the unpredictable and erratic course of evolution. If true, this does not bode well for the emergence of software in our Universe. If the evolution of intelligent carbon-based life is a rare thing even on our Rare Earth, then there cannot be that much software out there either.
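The diffusion metaphor is easy to capture in a toy simulation. The model below is my own sketch of Gould’s argument, not anything from his essays: each lineage takes a directionless random walk in complexity, with a reflecting wall at zero standing in for the minimal complexity needed to stay alive. No drive toward complexity is built in, yet a right-skewed tail of complex lineages diffuses away from the wall all the same.

```python
import random

random.seed(2)

# Toy model of the Wall of Minimal Complexity (my own sketch of Gould's
# argument). Each lineage random-walks in complexity with no built-in
# drive toward complexity; the wall at zero is the minimum needed to live.
def final_complexity(steps=1000):
    c = 0.0
    for _ in range(steps):
        c += random.gauss(0.0, 1.0)   # innovation is directionless
        c = max(c, 0.0)               # the wall: no simpler than alive
    return c

lineages = sorted(final_complexity() for _ in range(2000))
median = lineages[len(lineages) // 2]
# The bulk of lineages crowd near the wall (the bacteria of the model),
# while a rare diffusing tail reaches far to the right (complex life)
print(round(median, 1), round(lineages[-1], 1))
```

The maximum complexity reached far exceeds the typical lineage, even though every step of every walk was unbiased, which is exactly Gould’s point: the right tail needs no progressive force, only a wall on the left.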

Gould’s thoughts are in stark contrast to the idea of evolutionary convergence that I discussed in A Proposal For All Practicing Paleontologists. Convergence maintains that since there are only a limited number of ways of doing things that work, complex life tends to reinvent itself over and over again, and keeps coming up with the same basic designs in independent lines of descent. That is why insects, birds, bats, and some dinosaurs all came up with the same basic architecture for a wing. Convergence would predict that the evolution of intelligent carbon-based life would be much more likely, since there is a definite survival advantage to having a large neural network that can better perceive predators and prey. Eventually, these neural networks get so large that intelligent consciousness emerges, and then things really take off, because intelligence is the ultimate spandrel of them all, one that can be used for a nearly infinite number of things that enhance survivability. The concept of convergence relies heavily upon the apparent overwhelming power of natural selection to overcome all obstacles - “mutation proposes, but natural selection disposes” - with natural selection winning in all cases.

In The Spandrels of San Marco and the Panglossian Paradigm, Gould describes this obsession with natural selection as an outgrowth of what he calls the Adaptationist Program. The Adaptationist Program maintains that natural selection is so powerful that it simply dwarfs all other factors, and consequently, leads to the convergence of body designs. Many of my favorite evolutionary biologists such as John Maynard Smith, Richard Dawkins, Simon Conway Morris, the geologist Mark McMenamin, and the philosopher Daniel Dennett, seem to fall into this camp. In the essay More Things in Heaven and Earth, Gould casts this lot as members of a cult-like group obsessed with natural selection that he characterizes as a type of “Darwinian fundamentalism” of near-religious fervor. He begins the essay with a quote from Darwin’s last edition of On the Origin of Species (1872) before launching into an attack upon the adaptationist program and offering up a pluralistic approach to evolutionary theory that relies upon an assortment of forces shaping the evolutionary history of life on Earth as I described above.

As my conclusions have lately been much misrepresented, and it has been stated that I attribute the modification of species exclusively to natural selection, I may be permitted to remark that in the first edition of this work and subsequently, I place in a most conspicuous position – namely at the close of the Introduction – the following words: “I am convinced that natural selection has been the main but not the exclusive means of modification.” This has been of no avail. Great is the power of steady misrepresentation.

Social Darwinism and Herbert Spencer
The final sections of The Richness of Life: The Essential Stephen Jay Gould feature some of the political writings of Gould in reference to the uses and abuses of Darwinian thought in politics. Along these lines, in support of his contention that evolution is not inherently progressive in nature, Gould likes to point out that the terms “evolution” and “survival of the fittest” did not actually originate with Darwin, but with the English philosopher Herbert Spencer (1820 – 1903). Herbert Spencer was the epitome of Victorian times, obsessed with the progress made by Victorian society as a result of the industrialization of Britain and the expansion of the British Empire. Originally, Darwin liked to call his theory “descent with modification”, but Spencer preferred the term “evolution” because it conveyed the idea of progress, from the Latin evolutio or “unfolding”, and Spencer’s terminology prevailed. Spencer contended that although the excesses of 19th-century capitalism, brought on by the rapid industrialization of Europe and the United States, might have seemed a bit cruel, governments should not interfere with the “survival of the fittest” by legislating against such things as child labor, incredibly unsafe factories, and tainted processed foods and drugs, or by providing for such things as the relief of poverty, public education, public sanitation systems for safe drinking water, public sewage systems, and anything else that might help alleviate the plight of the “undeserving poor”. Spencer’s ideas were soon adopted by those benefiting the most from the extreme concentration of wealth brought on by rapid industrialization in the late 19th century, in the form of the theory of Social Darwinism.
Social Darwinism held that governments should not interfere with the natural order of things for the long-term good of the human race, and provided a scientific theory that helped to soothe the conscience of the very rich, since providing for the very poor could now be deemed an abomination against the laws of Nature.

Before moving on and examining Gould’s ideas from an IT perspective, I would like to make a slight digression, because the resurgence of Social Darwinism, in the form of the recent appearance of the Tea Party movement in the United States, seems to be a very good example of the concepts of both punctuated equilibrium and biological convergence at work. As I have mentioned in the past, the real world of human affairs is all about the peculiarities of self-replicating information. There are now three forms of self-replicating information on this planet – genes, memes, and software – all battling it out for dominance, with software rapidly gaining the upper hand. Since all forms of self-replicating information share many common characteristics, much can be learned by studying one to learn about the others, so let us briefly examine the emergence of the Tea Party as a new meme-complex, since it might shed some light on the evolution of software as well.

We are living in very divisive times in the United States, and once again the election season is at hand. Sadly, the only thing that all Americans can now seem to agree upon is that something is dreadfully wrong with America, and we all have our own deeply held beliefs on how to address the problem. I find the recent Tea Party movement to be an interesting resurgence of the Social Darwinism meme-complex of the late 19th century. The Tea Party movement seems to want to roll back all of the reforms made to capitalism in the early 20th century by the Progressive Era (1890 – 1921), under the presidencies of Teddy Roosevelt, Taft, and Wilson, which gave us such things as the Interstate Commerce Act of 1887, the Sherman Antitrust Act of 1890, the Meat Inspection Act of 1906, the Pure Food and Drug Act of 1906, the Federal Reserve System (1913), and the Federal Income Tax (1913), plus later social legislation like Social Security (1935), Medicare (1965), Medicaid (1965), the Civil Rights Act of 1964, and the Americans with Disabilities Act of 1990. Personally, as an 18th-century liberal and 20th-century conservative, I have no desire to return to the 19th century. We did that once already, and it was not very nice, but the Tea Party disagrees.

The rapid appearance of the Tea Party movement on the political scene, seemingly arising out of nothing, is a good example of punctuated equilibrium at work between the Republican and Democratic Party meme-complexes. Both parties have been battling it out for over 150 years, and during that time have usually reached a stable state of equilibrium between predator and prey, with neither party causing the extinction of the other. Just as genes come together to create DNA survival machines that enhance the survival of DNA, meme-complexes are composed of memes that come together for their own joint survival too. Again, the key impetus for self-replicating information is survival itself, and in order to survive, meme-complexes must sometimes adapt to new conditions on the ground by adopting new memes and discarding old detrimental memes of the past. That is why both parties are found to slowly evolve over time, and sometimes even switch sides on issues! For example, the Republican Party started out as a liberal anti-slavery party in the 19th century, with the Democrats holding the conservative pro-slavery position. This continued until the 1960s, when the two parties switched sides, leaving the Democrats with a pro-civil rights position and the Republicans tending to resist the civil rights movement. Similarly, during the late 19th century and on into the Progressive Era (1890 – 1921), the “Bourbon” Democrats were the pro-business party, while the Republicans under Teddy Roosevelt and Taft were the anti-business “trust-busters” of the day, pushing through regulatory restraints on capitalism. If you look closely, President Obama’s policies are strangely reminiscent of Teddy Roosevelt’s “Square Deal” with a 21st-century twist. Thus, in the 19th century, Democrats were conservatives and Republicans were liberals, and in the 20th century they both slowly switched positions, so that for most of the 20th century Democrats were liberals and Republicans were conservatives!
Seemingly, the only meme within each party that went the distance is that Democrats oppose Republicans and Republicans oppose Democrats on all issues, whatever they might be at the time. This dynamic created a very long period of stasis for both parties in the 20th century, during which their political positions did not change a great deal, in keeping with Gould’s concept of punctuated equilibrium, which holds that species are usually very stable and in equilibrium with their environment and only rarely change when required.

This all began to change in the early 21st century when the punctuated equilibrium arrival of the Tea Party disrupted the long-standing stasis between the Republican and Democratic parties. The rapid emergence of the Tea Party upon the political scene is reminiscent of the rapid appearance of a new species in the fossil record with no apparent intermediate forms to be found. Upon closer inspection, the Tea Party meme-complex is actually composed of memes that had been floating around in the Republican Party meme-complex for some time and which finally branched off into a new party of their own under the duress of the current severe economic downturn. So the Tea Party really did indeed evolve from the Republican Party through small incremental changes. It is just hard to pinpoint exactly when and where it first appeared.

The Tea Party is also an example of convergence, the reinvention of similar solutions to similar problems in the biosphere. The economic turmoil caused by rapid industrialization in the 19th century gave birth to Social Darwinism as a means to justify the excesses of 19th-century capitalism, just as the economic turmoil brought on by globalization and the bursting of the real estate bubble in 2008 gave birth to the Tea Party movement. Both Social Darwinism and the Tea Party movement promote similar policies of removing government interference and regulation from all economic activities within the United States and beyond.

Again, to my mind, all this controversy stems from a fundamental lack of understanding of the nature of self-replicating information. For the Tea Party movement, this is further complicated by the fact that many Tea Party members do not have much confidence in Darwinian thought in the first place and wish to have it banned from public schools! That is a shame because Darwinian thought is core to understanding the nature of self-replicating information. You would think that it would be very difficult for a meme-complex to maintain a passionate Darwinian “survival of the fittest” approach to capitalism, while at the same time advocating the banning of Darwinian thought in schools, but such is the case.

It is important to remember that self-replicating information is just mindless information bent on replicating at all costs and that it is not necessarily working in our best interests. As Gould has pointed out, there is nothing sacred about natural selection or “survival of the fittest”. The chief advantage of a Darwinian system of economics, like capitalism, is that you do not need a designer. The failure of socialism and communism in the 20th century attests to the difficulty of trying to design a complex modern economy – it just cannot be done by the human mind. It is much better to just let economies design themselves through the Darwinian mechanisms of innovation and natural selection found in capitalism, as Adam Smith pointed out in The Wealth of Nations (1776). Darwin actually based his theory on Adam Smith’s “Invisible Hand” after reading The Wealth of Nations and doing some fieldwork on board the HMS Beagle. So I am a strong advocate of capitalism because it is a Darwinian system of economics that is proven to work since it is based upon the same operational processes that make the biosphere work. However, there are some downsides to this approach as well.

First of all, Darwinian systems like capitalism do not necessarily yield the most productive of all systems. As Gould pointed out, natural selection has no teleological intent to drive a system to an ultimate state of perfection. The beauty of capitalism is that it does not need a designer to produce an incredibly complex economy, but natural selection is subject to the expediency of the moment and is the ultimate short-term thinker, only choosing the survivor of the moment with no thought whatsoever of the future. Consequently, many aberrations can arise in capitalism that an outside designer can easily identify. For example, in The Greatest Show on Earth (2009) Richard Dawkins asks why trees are 100 feet tall instead of 10 feet tall. It takes a lot of mass and energy to build a 100-foot trunk to hold the leaves that gather sunlight. A 10-foot trunk with widely spreading branches would do the job just as well, and a 100-foot trunk is made mostly of cellulose, a tough substance that not even termites can digest – the bacteria in their guts do that for them. The reason that trees are 100 feet tall is that they grow in forests and compete for sunlight with other trees. A 10-foot-tall tree species in a forest would be quickly shaded into extinction. So trees grow as tall as possible until other factors enter into the calculation of survival and limit their growth. A 1,000-foot tree would have no competitors at all until it blew over in a gentle breeze. So a forest does indeed work, Darwinian “survival of the fittest” guarantees at least that much, but it clearly is not the most efficient way of collecting sunlight. About 10,000 years ago we learned that by cutting down the trees and planting the exposed ground with domesticated seeds, we could dramatically boost the economic productivity of the biosphere by artificially changing the rules under which “survival of the fittest” operated.
Similarly, over the past 100 years, we have done the same thing with the rules under which capitalism operates, allowing capitalism to work its miracles in a manner useful to mankind.

Secondly, nobody really wants to live under a purely Darwinian form of capitalism, with a totally unfettered “survival of the fittest” guiding principle. At any given time, there are always regions of the world where governments have totally withered away, leaving a number of warlords running things. Yes, pure laissez-faire capitalism does continue to produce some economic activity under such conditions, but at a very low level of output, because all the theft, bribery, and extortion severely limit the incentive to work hard for one’s own economic benefit when the fruits of that work can so easily be stolen. It is much better to have a government in place that somewhat limits total freedom by enforcing property rights, the provisions of contracts, and zoning laws, and by providing police protection. Therefore, in order to maximize economic output, it is necessary to “domesticate” capitalism in some sense through law and regulation, just as we domesticated certain wild elements of the biosphere to produce crops and livestock. I think all Americans can agree upon that. It is just the matter of degree that we all squabble about. The history of the United States has shown that at times we have had too much regulation and at other times too little. Remember, a vice is simply a virtue carried to an extreme.

So I would recommend that all Americans try to calmly sit down and think things through in a rational manner. We need to recognize that all segments of American society are needed to make it all work. We need the rich, the poor, and the middle class. If you look to the biosphere or to the human experience of everyday economic life, you will always find parasites and freeloaders, scammers and chiselers, and some of them will be rich and some of them will be poor, but most of us are decent folks just trying to make a go of it. The rich will complain that essentially they pay all of the income tax, which is true, while the poor will complain that the rich now get nearly all of the money, which is also true. Tax rates should not be so high as to stifle the incentive to study, work hard, or take an entrepreneurial risk. But at the same time, we need to recognize that capitalism tends to concentrate wealth into the hands of a very small percentage of the population. That is why the Federal Income Tax was established in 1913 as a means to redistribute wealth. Prior to the Federal Income Tax, the federal government was financed primarily by tariffs and fees that were more of a burden for the poor than for the rich. As an 18th-century liberal and 20th-century conservative, I would like to see us return to the days of President Eisenhower with a growing middle class, rather than the experience of the past 30 years where we have seen the rich get richer and the poor get poorer. It is not in anybody’s long-term interest to see the middle class of America disappear.

We must remember that there is nothing sacred about the way natural selection distributes wealth in a capitalistic economy since it is the product of mindless self-replicating information without a designer. To some extent, it does so based upon rewarding performance, but not entirely. Capitalism works because it rewards ambition, initiative, and hard work, but to my mind, most of the wealth in the modern world has not actually come from the ambition, initiative, and hard work of its current inhabitants. People have been working hard for over 200,000 years, and for most of that time, they lived in utter poverty. Most of today’s wealth has actually come from the ambition, initiative and hard work of scientists and engineers living in the 18th, 19th, and 20th centuries who made very little at the time. Although investment bankers and hedge fund managers may make princely sums, it seems to me that their compensation may not be commensurate with their actual contributions to the national economy and is an aberration caused by some of the imperfections of capitalism. Imagine the wealth we would have today if we had plowed similar sums into research and development for the past 200 years! Although I cannot exactly explain why, deep down I have a disturbing feeling that, despite all the ranting and raving between the Republicans and Democrats, most of our economic problems stem from America having given up on science and rational thought about 30 years ago.

Gould From an IT Perspective
So do we see such things as punctuated equilibrium, exaptations and spandrels, and the limitations imposed by historical constraints in the evolutionary history of software, and more importantly, has the evolution of software been progressive in nature over the past 70 years or not? I would say for the most part that we do see such things, but on the other hand, I would also contend that natural selection does seem to have been the dominant factor in shaping the evolution of software, causing software to essentially follow the same architectural path as life on Earth did through a process of convergence. So I still have high hopes for intelligent carbon-based life in our Universe and the emergence of complex software too.

It does seem as if the evolution of software has been somewhat progressive in nature in an almost Spencerian sense in contradiction to the views held by Stephen Jay Gould. Yes, the bulk of software probably does consist of simple unstructured .bat files, Unix shell scripts and edit macros desperately clinging to the Wall of Minimal Complexity like a bacterial scum. But the software that sticks out in one’s mind has certainly increased in complexity and functionality over the years. As Gould pointed out, perhaps we are just focusing on the complex software, rich in function, that is in the round-off error of all software combined, and that is why it appears as though software has progressed in complexity. One might also argue that software will certainly progress in complexity and functionality because we humans are constantly trying to make software better, so the analogy to the lack of a progressive nature for evolution in the biosphere naturally fails because living things do not “try” to evolve to increased levels of complexity, while we humans are always “trying” to make software better.

However, softwarephysics would counter that the analogy does hold because genes and software are both forms of self-replicating information bent on survival. As Richard Dawkins pointed out, our bodies do not use genes to build and maintain themselves. On the contrary, our genes use our bodies as disposable DNA survival machines to protect and replicate genes down through the generations. Similarly, software is a form of self-replicating information that has formed very strong parasitic/symbiotic relationships with nearly every meme-complex on the planet, and in doing so, has domesticated our minds into churning out ever more software of ever more complexity. Just as genes are in a constant battle with other genes for survival, and memes battle other memes for space in human minds, software is also in a constant battle with other software for disk space and memory addresses. Natural selection favors complex software with increased functionality, throughput, and reliability, so software naturally progresses to greater levels of complexity over time. With that said, let us next examine some of Gould’s ideas that do seem to apply to the evolution of software.

Punctuated Equilibrium in IT
When dealing with the daily mayhem of life in IT, it is hard for IT professionals to take in the grand scheme of it all because we are basically just trying to survive through the day. But when I look back over the past 30 years of my career as an IT professional, I do see punctuated equilibrium at work. There are long periods of many years of stasis in software evolution when nothing much seems to change at all, and then all of a sudden there are dramatic changes, seemingly coming out of nowhere. So software does seem to evolve in steps rather than slowly strolling up a gently rising ramp. I think all IT professionals will certainly have their own stories to tell in support of this observation. So rather than trying to generalize it, let me relate my own.

Having learned how to write unstructured batch FORTRAN programs in 1972 on punched cards, the first dramatic evolutionary step I saw unfold was when I ran across my first structured FORTRAN program in 1975. As I pointed out in SoftwareBiology, the advent of structured programming in the evolution of software was equivalent to the rise of eukaryotic single-celled life on Earth, and like the eukaryotic architecture with its subdivision of functions into organelles, all complex software that has followed has continued on with the elements of structured programming. When I saw that first structured program, I did not know what to make of all the comment cards, indented code and subroutines, so I rewrote the whole thing with each line of code starting in column 7 of my punch cards as it should! Programming on cards was one of the things that slowed the emergence of structured programming in the 1970s. Since it cost a lot of money to compile a program and obtain a line printer listing of the code, we normally just programmed by flipping through the card deck, replacing cards that needed changing by punching them up on an IBM 029 keypunch machine and then reinserting the new cards back into the deck. Reading indented code one line at a time on punch cards really is of no value because you cannot appreciate the indented logic, and it is very hard to keypunch indented code because you have to carefully count all the leading blank spaces or the code gets very ragged on the left margin. One mistaken keystroke and you have to eject the card and start all over again. So it was much easier to simply punch all the cards so that they all started in the same column on the card. Similarly, branching off to a subroutine in a punch card deck is a pain because it is very hard to find the subroutine in the deck. It was much easier to simply bracket the code with some labeled cards and then use GOTO statements to branch into and out of the code via the labels. 
This all rapidly changed in the early 1980s when we started programming on IBM 3278 terminals using full-screen text editors like ISPF. Now you could see a full screen of code all at once, and indenting code helped to highlight the logic. Plus, it was now easy to find your subroutines with the Find function of the editor.

For me, the next evolutionary step was moving from batch programming on mainframes to interactive programming on the IBM VM/CMS operating system, which was somewhat similar to the Unix operating system. At first, we wrote simple menu-driven applications which simply printed a list of numbered options on the screen for the user to choose from in order to navigate through the application. After a few years, menu-driven applications were replaced by screen-oriented applications using DMS or ISPF Dialog Manager. These applications were somewhat like the online CICS applications that had been running on the mainframes since 1968, but they were interactive in nature rather than just online applications pulling up and modifying somebody’s account information. These interactive applications actually performed computations in real time for the user based upon screen input. I spent most of the 1980s writing such screen-oriented interactive applications. For me, another period of long stasis and stability in the evolution of software.

The next step was the arrival of computer science graduates from the recently formed computer science departments at major universities. In the early 1980s, most programmers were still fugitives from the sciences or former mathematicians, with a few escapee accountants thrown in for good measure. At first, the computer science graduates were all mainframe COBOL/CICS programmers, but in the late 1980s, they suddenly switched to being Unix and C programmers because running Unix on servers was a cheaper way for the universities to teach computer science. When I saw my first C program, I thought that it was the ugliest computer language that I had ever seen, with its mixed case code and squirrely brackets “{ }” all over the place. I just knew that C would never really catch on with such ugly syntax, but I taught myself C and Unix just the same. Getting used to mixed case code was very difficult for me because, up until then, I had only been coding uppercase FORTRAN, COBOL, PL/1 and REXX. This was an historical constraint held over from the punched card days. WE LEARNED BACK THEN THAT ALL CODE SHOULD ONLY BE UPPERCASE BECAUSE THEN YOU COULD STILL READ THE CODE ON THE PUNCHED CARDS WHEN THE PRINTER RIBBON ON YOUR IBM 029 KEYPUNCH MACHINE GOT WORN OUT.

In the early 1990s the distributed computing revolution hit with full force, ending my 1980s stasis of programming screen-oriented applications on VM/CMS, and then it really paid off to have learned Unix and C in advance – sort of a spandrel turning into an exaptation for me. Suddenly, mainframe COBOL/CICS programming was out too and writing your applications on cheap Unix servers in C was the new new thing. Object-Oriented programming also began to go mainstream at this time in the form of C++ programming, so I taught myself C++ which was not too difficult since C++ had evolved from C and had carried forward its dreadfully ugly syntax. As I pointed out in SoftwareBiology, object-oriented programming is equivalent to multicellular organization, where applications consist of instances of objects (cells) that are created, used, and then destroyed.
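To make the multicellular analogy a little more concrete, here is a minimal sketch in Java (a close cousin of the C++ of that era, and a language we will meet again shortly) of the object lifecycle just described. The Cell class and its methods are purely hypothetical illustrations of my own invention, not code from any real system:

```java
// A minimal, hypothetical illustration of the object-oriented "multicellular"
// analogy: an application creates instances of objects (cells), uses them,
// and discards them when their work is done.
public class CellDemo {
    // A hypothetical "cell": an object whose internal state is hidden behind
    // methods, much as a biological cell hides its organelles behind a membrane.
    static class Cell {
        private int energy;                   // internal state, invisible to callers
        Cell(int energy) { this.energy = energy; }
        int metabolize() { return --energy; } // do some work, consume some energy
    }

    public static void main(String[] args) {
        Cell cell = new Cell(3);           // the object (cell) is created
        while (cell.metabolize() > 0) ;    // it is used for a while
        cell = null;                       // then discarded; the runtime
                                           // eventually reclaims it (apoptosis)
        System.out.println("cell discarded");
    }
}
```

The point of the sketch is the lifecycle itself: applications built this way are populations of short-lived cooperating objects, not one monolithic block of code.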

In 1995 the Internet hit and this time we all learned HTML and started working with webservers and browsers and eventually struggled with ways to serve up dynamic HTML using the Common Gateway Interface (CGI) and Pearl scripts or C programs. While the web-based application revolution proceeded on, I got shanghaied into Amoco’s Y2K project from 1997 – 1999, so it was back to mainframe COBOL, FORTRAN, and PL/1 for me.

After leaving Amoco in 1999, I ended up doing Tuxedo middleware at United Airlines in support of http://www.united.com/. In 2003 I moved to the Middleware Operations group of my present employer supporting their websites on Apache, Websphere, JBoss, Tomcat, and ColdFusion, with nearly all code now written in Java. Java inherited the squirrely C/C++ syntax, so I guess I was wrong about the staying power of C after all. We are currently getting very heavily involved with the SOA – Service Oriented Architecture revolution using J2EE Appservers like Websphere and JBoss to create dynamic HTML. In SoftwareBiology, I pointed out that SOA is equivalent to the Cambrian Explosion in the history of life on Earth. In SOA we have consuming objects (cells) running in consumer Appserver JVMs making service calls to service objects (cells) running in service Appserver JVMs. Although we keep adding more and more applications and JVMs to the mix, not much has changed from an architectural standpoint since SOA hit, so once again I seem to be entering another period of stasis.
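The SOA pattern described above can be sketched in miniature. In a real deployment the consuming object and the service object would run in separate Appserver JVMs, with the call traveling over the network; in this hypothetical sketch the remote hop is elided to keep it self-contained, and the QuoteService names are my own invention rather than any real API:

```java
// A hypothetical, in-process sketch of the SOA pattern: a consuming object
// (cell) calls a service object (cell) through an interface contract.
public class SoaSketch {
    // The service contract: all a consumer ever knows about the service.
    interface QuoteService {
        double quote(String symbol);
    }

    // A service object, which in reality would run in a service JVM.
    static class StubQuoteService implements QuoteService {
        public double quote(String symbol) {
            return 42.0;   // hypothetical canned response
        }
    }

    // A consuming object that depends only on the contract, not the implementation.
    static class PortfolioConsumer {
        private final QuoteService service;
        PortfolioConsumer(QuoteService service) { this.service = service; }
        double value(String symbol, int shares) {
            return shares * service.quote(symbol);  // the service call
        }
    }

    public static void main(String[] args) {
        PortfolioConsumer consumer = new PortfolioConsumer(new StubQuoteService());
        System.out.println(consumer.value("XYZ", 10));  // prints 420.0
    }
}
```

Because the consumer only holds the interface, the service implementation can change, or move to another JVM entirely, without the consuming code noticing, which is precisely what makes the architecture so cell-like.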

Exaptations and Spandrels in IT
There are many examples of these in the evolution of software, but probably the most significant is that of the evolution of the World Wide Web. The current Internet evolved from the ARPANET of the 1960s and 1970s. The ARPANET was a packet-switched network of computer nodes conceived of by the Defense Department’s Advanced Research Projects Agency, or ARPA, in 1968. It was designed to survive a nuclear strike by not having a single point of failure that could disrupt communications between computer nodes on the network. Packet-switching was used to simply route packets around any node on the network that happened to fail. The first ARPANET node was installed at UCLA in 1969. Subsequent nodes were established at Stanford, the University of Utah, and the University of California at Santa Barbara in 1969 as well. By 1984 there were 1,000 nodes on the ARPANET, and by 1989 the number had grown to 100,000 nodes, primarily at universities and research centers. Today, the Internet has grown to billions of nodes.
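The packet-switching idea is easy to sketch. In this toy Java example the four-node topology is a simplified stand-in loosely inspired by the first four ARPANET sites (not the real 1969 link map), and a breadth-first search stands in for a real routing protocol; the point is simply that when a node fails, traffic routes around it:

```java
import java.util.*;

// A toy illustration of packet-switched routing with no single point of
// failure: find any path from src to dst that avoids the failed nodes.
public class PacketSketch {
    static List<String> route(Map<String, List<String>> net, String src,
                              String dst, Set<String> failed) {
        Map<String, String> parent = new HashMap<>();   // visited set + path tree
        Deque<String> queue = new ArrayDeque<>(List.of(src));
        parent.put(src, src);
        while (!queue.isEmpty()) {
            String node = queue.poll();
            if (node.equals(dst)) {                     // rebuild the path back to src
                LinkedList<String> path = new LinkedList<>();
                for (String n = dst; !n.equals(src); n = parent.get(n))
                    path.addFirst(n);
                path.addFirst(src);
                return path;
            }
            for (String next : net.getOrDefault(node, List.of()))
                if (!failed.contains(next) && !parent.containsKey(next)) {
                    parent.put(next, node);
                    queue.add(next);
                }
        }
        return null;                                    // no surviving route
    }

    public static void main(String[] args) {
        Map<String, List<String>> net = Map.of(
            "UCLA",     List.of("Stanford", "UCSB"),
            "Stanford", List.of("UCLA", "Utah"),
            "UCSB",     List.of("UCLA", "Utah"),
            "Utah",     List.of("Stanford", "UCSB"));
        // With the Stanford node down, UCLA-to-Utah traffic reroutes via UCSB.
        System.out.println(route(net, "UCLA", "Utah", Set.of("Stanford")));
    }
}
```

Knock out any single node and the packets still get through; only when every path is severed does communication fail, which was the whole design goal.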

In 1980 Tim Berners-Lee began working at CERN, the European particle accelerator complex near Geneva, Switzerland, as a consultant. Frustrated with trying to locate information on the large number of computers at CERN, Berners-Lee submitted a proposal in 1989 to CERN IT management to create a World-Wide Web: an information infrastructure for high-energy physics that would use browsers and webservers to connect the particle researchers of the world over the existing networks. Naturally, the proposal was at first rejected by IT management, but Berners-Lee persisted and eventually it was approved. Berners-Lee’s team came up with the ideas of the Hypertext Transfer Protocol (HTTP), the HyperText Markup Language (HTML), and the Universal Resource Locator (URL).

In the late 1970s Bill Joy wanted to rewrite the Unix operating system in a language other than C. In the 1980s he tried C++ at first, but then decided a better programming language was needed that did not have all the messy pointer arithmetic that invariably led to the bugs and memory leaks of C and C++. In 1991 Bill Joy joined several others at Sun on their Stealth Project to develop software for smart electronic consumer products, like toasters, that could do processing on a large, distributed, heterogeneous network of consumer electronic devices all talking to each other. James Gosling was a fellow member of the Stealth Project who was assigned the task of finding the appropriate programming language for the project. Gosling began with C++ too but quickly realized its shortcomings as had Bill Joy. The new programming language would have to run on a very diverse set of hardware, and it would be nearly impossible to do that with a compiled language that had to deal with the varied instruction sets of all that varied hardware, so it was decided to use an interpretive language that could be semi-compiled into a “byte-code” that could be run in a “virtual machine” on the toasters and other such products. His first attempt was a language called Oak. Gosling realized that if Oak was to take on the consumer electronics market by storm, it would need to be easy to learn, so he took the strange syntax of C and C++ with its squirrely brackets { } as a starting point, since most programmers were already familiar with C and C++. Again, this is an example of software evolution being limited by historical constraints. Also, remember that C++ evolved from C by adding classes to it. Gosling realized that nobody wanted to reboot their toaster every day just to make a quick slice of toast, so he discarded the pesky error-prone pointers of C and C++ too. 
He also got rid of multiple inheritance and operator overloading to minimize the creation of bugs caused by programmers prone to writing tricky code. Automatic garbage collection was also introduced in the hope of plugging the memory leaks of C++, where you had to destroy your own objects. Unfortunately, a quick trademark search revealed that there already was a programming language called “Oak”! Luckily, after a visit to a local coffee shop by the development team, Oak was rechristened as Java!
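Some of those language-design decisions can be seen in a few lines of Java. This is a hedged sketch with invented names in the spirit of the smart-toaster project, not any real Sun API; it shows a class with at most one implementation parent but many interfaces (sidestepping the ambiguities of C++ multiple inheritance), and no explicit delete anywhere:

```java
// A hypothetical sketch of Java's design decisions: multiple interfaces
// instead of multiple implementation inheritance, and garbage collection
// instead of manual memory management.
public class OakSketch {
    interface Toastable { int toast(); }       // a capability contract
    interface Networked { String address(); }  // another capability contract

    // One implementation parent at most, but any number of interfaces,
    // so a class can still play several roles without diamond-problem ambiguity.
    static class SmartToaster implements Toastable, Networked {
        public int toast() { return 2; }             // toasting time in minutes
        public String address() { return "10.0.0.1"; }
    }

    public static void main(String[] args) {
        SmartToaster t = new SmartToaster();
        // Note what is absent: no free(), no delete, no pointer arithmetic.
        // When t becomes unreachable, the garbage collector reclaims it,
        // which is what plugs the C/C++-style memory leaks.
        System.out.println(t.toast() + " " + t.address());
    }
}
```

Nothing in the sketch could dangle or leak the way a raw C pointer can, which is exactly the property you would want in a toaster that must never need rebooting.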

Unfortunately, toasters were just not ready for Java in 1992. Sun tried to experiment with Java on interactive TVs, but that did not work out either. To quote the Virginia Tech Computer Science department at:

http://ei.cs.vt.edu/book/chap1/java_hist.html

In June of 1994, Bill Joy started the "Liveoak" project with the stated objective of building a "big small operating" system. In July of 1994, the project "clicked" into place. Naughton gets the idea of putting "Liveoak" to work on the Internet while he was playing with writing a web browser over a long weekend. Just the kind of thing you'd want to do with your weekend! This was the turning point for Java.

The world wide web, by nature, had requirements such as reliability, security, and architecture independence which were fully compatible with Java's design parameters. A perfect match had been found. By September of 1994, Naughton and Jonathan Payne (a Sun engineer) start writing "WebRunner," a Java-based web browser which was later renamed "HotJava." By October 1994, HotJava is stable and demonstrated to Sun executives. This time, Java's potential, in the context of the world wide web, is recognized and the project is supported. Although designed with a different objective in mind, Java found a perfect match on the World Wide Web. Many of Java's original design criteria such as platform independence, security, and reliability were directly applicable to the World Wide Web as well. Introduction of Java marked a new era in the history of the web.


Now that is certainly an example of a spandrel becoming an exaptation that eventually evolved into something completely different – truly a screwdriver becoming a wood chisel to be used by all! So the World Wide Web evolved from a Cold War computer network meant to survive a nuclear strike, running on software meant to help particle physicists complete the Standard Model, and written in a programming language meant to run on toasters! While reading about punctuated equilibrium, spandrels and exaptations and how they might have contributed to the evolution of wings and everything else in The Richness of Life- the Essential Stephen Jay Gould, I simply could not help thinking back to those Flying Toasters made famous by After Dark on the Mac in 1989 and still available today on YouTube:

http://www.youtube.com/watch?v=Gwn59R8Mdps

Comments are welcome at scj333@sbcglobal.net

To see all posts on softwarephysics in reverse order go to:
https://softwarephysics.blogspot.com/

Regards,
Steve Johnston