Monday, February 20, 2012

The Limitations of Darwinian Systems

In many of the postings in this blog on softwarephysics, we have seen the wonderful things that the Darwinian processes of inheritance, innovation and natural selection can do. The beauty of Darwinian systems is that they can navigate Daniel Dennett’s complex Design Space to seemingly impossible optimal designs, without the aid of any directing intelligence at all. This is what sets Darwinian systems apart from all other systems in the Universe. Non-Darwinian systems tend to succumb to the second law of thermodynamics, and therefore degrade to a state of maximum entropy or disorder, while Darwinian systems seemingly defy the second law altogether by producing complex low-entropy designs all on their own. This is further bolstered by the numerous examples of convergence found within the biosphere, where species from different lines of descent magically converge upon the same seemingly optimal engineering designs, like the camera-like eye found in humans, birds, sharks and octopuses.

However, Darwinian systems do have their limitations. As Richard Dawkins pointed out in Climbing Mount Improbable (1996), as a species ascends Mount Improbable, natural selection will always favor characteristics that enhance survivability over characteristics that diminish it. Consequently, a species cannot follow a path through Daniel Dennett’s Design Space that first descends into a region of lower survivability in order to eventually cross over to a path leading even higher. In Richard Dawkins’ metaphor, a species that has climbed to a localized peak in the mountainous terrain of survivability is not allowed to descend from that peak and find its way up to the summit of Mount Improbable by a different route. Instead, the species will find itself stuck upon the localized peak, because any mutation that leads away from it necessarily leads to a lower level of survivability and will be strongly selected against by natural selection. Furthermore, all species eventually find themselves stuck upon some localized peak in the survivability terrain, and when the environment changes, they cannot escape by means of the Darwinian processes of inheritance, innovation and natural selection. This is why all species eventually go extinct.

Figure 1 – Darwinian systems can find themselves trapped upon a localized peak in the survivability terrain because, once there, they cannot ascend any higher through small incremental changes. All paths lead to a lower level of survivability, and thus will be strongly selected against by natural selection. Above we see a localized peak in the foreground with the summit of Mount Everest in the background (click to enlarge).
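To make the local-peak problem concrete, here is a minimal Python sketch of my own, not something from the biology literature, of naive hill climbing on a hypothetical one-dimensional survivability landscape with two peaks. The update rule only ever accepts small mutations that increase survivability, so a climber that starts near the lower peak gets stuck there and never crosses the valley to the higher summit, which is exactly the limitation described above.

import math
import random

def survivability(x):
    # Hypothetical fitness landscape: a low local peak near x = 2
    # and a higher "Mount Improbable" summit near x = 8.
    return 3.0 * math.exp(-(x - 2.0) ** 2) + 5.0 * math.exp(-(x - 8.0) ** 2)

def hill_climb(x, steps=10000, step_size=0.05):
    # Naive natural selection: only small mutations that increase
    # survivability are accepted; anything downhill is selected against.
    for _ in range(steps):
        candidate = x + random.uniform(-step_size, step_size)
        if survivability(candidate) > survivability(x):
            x = candidate
    return x

random.seed(42)
start = 1.0                  # the species begins near the lower peak
end = hill_climb(start)
print("started at x = %.2f, ended at x = %.2f, survivability = %.2f"
      % (start, end, survivability(end)))
# The climb ends near x = 2, the localized peak, and never crosses the
# valley of lower survivability to reach the higher summit near x = 8.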

Recently, I have seen this same effect at work in an IT setting. Over the past 5 years, I have watched the number of Websphere servers running our external websites slowly grow from 4 very slow Unix servers in one Websphere Cell to 34 much faster Unix servers in 10 Websphere Cells. The number of Websphere JVM Clusters has also increased from 8 to 110 during this same period, and consequently, the number of applications running within these JVMs has grown by a factor of at least 20. Like many IT organizations, we do our maintenance on the hardware, software, and application code running on our IT infrastructure at night, when usage is low and the potential for end-user impact is slim. We also flip traffic out of the Websphere Cells undergoing maintenance into twin Cells, which carry the small amount of night traffic that we do have, to minimize impact. The end result is that we only have a 5-hour maintenance window within which to work each night. This should all be quite familiar to readers in IT Operations or Application Development who participate in maintaining hardware, software, and applications, since it is a very common way of performing maintenance.

The problem is that, although our infrastructure has grown dramatically over the years, it did so very slowly through small incremental changes, the same way that living things evolve over time. In Is Self-Replicating Information Inherently Self-Destructive?, we saw that from Peter Ward’s point of view, as expressed in his book The Medea Hypothesis (2009), all living things resulting from Darwinian processes must necessarily be selected for the ability to self-replicate at all costs, with little regard for their fellow beings sharing the same resources of the planet, or even for their own long-term survival. The urge to self-replicate at all costs necessarily leads to living things that eventually outstrip their resource base through positive feedback loops, until no resources are left. This is what we are now seeing in our change advisory board meetings run by Change Management. Because all of this vast IT infrastructure of hardware, software, and application code needs to be maintained, and there are only seven 5-hour change windows in a week, there is now a great deal of competition amongst projects for the dwindling resource of time. And since each project manager tends to consider their own project the most important, this leads to a great deal of conflict. Although our IT infrastructure has grown exponentially over the past 5 years, we still only have a 5-hour change window each night within which to work, and we cannot buy any more time at any price. It is a fixed resource, with a fixed carrying capacity, like the planet Earth itself. But as the infrastructure continuously grows in size and complexity, we keep trying to squeeze more and more into this same 5-hour change window, increasing the stress on the staff performing the maintenance and the potential for disastrous mistakes made by individuals working under pressure and in haste. This intense competition for change window time also breeds inefficiency, because projects that simply cannot be squeezed into their originally planned window must be rescheduled and reworked. IT maintenance work can be very complicated, and an implementation plan can require the coordinated talents of many people on many different teams, so when a complex project has to be rescheduled, it takes a great deal of rework on the part of the project managers to re-coordinate all of the necessary resources.
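As a back-of-the-envelope illustration, here is a small Python calculation of my own, using made-up demand figures rather than real data from our shop, contrasting the fixed weekly carrying capacity of seven 5-hour change windows with maintenance demand that grows by a constant percentage each year. Once demand crosses the fixed 35-hour ceiling, every additional change can only be accommodated by displacing another one.

WINDOWS_PER_WEEK = 7
HOURS_PER_WINDOW = 5
CAPACITY = WINDOWS_PER_WEEK * HOURS_PER_WINDOW   # 35 hours per week, fixed

# Hypothetical figures: 10 hours of weekly maintenance demand at the start,
# growing about 80% per year, roughly matching a twenty-fold growth in
# applications over 5 years (1.8 ** 5 is about 19).
demand = 10.0
GROWTH_RATE = 0.80

for year in range(6):
    status = "OK" if demand <= CAPACITY else "OVERSUBSCRIBED"
    print("year %d: demand %5.1f hours vs capacity %d hours -> %s"
          % (year, demand, CAPACITY, status))
    demand *= 1.0 + GROWTH_RATE
# By year 3 the weekly demand exceeds the fixed 35-hour carrying capacity,
# and every additional change can only be scheduled by displacing another.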

Clearly, this is an unsustainable path through Daniel Dennett’s Design Space, as we have now found ourselves rapidly approaching a localized peak on Richard Dawkins’ Mount Improbable, with no way to climb any higher. What is needed is an entirely new approach to doing maintenance on the infrastructure, and we cannot simply evolve our way to that new approach through small incremental changes to the current processes. The Darwinian mechanisms of inheritance and innovation, honed by natural selection, simply cannot do that, so unless some external action is taken, our current change processes will eventually lead to a collapse. Instead, we need to step back and take a bird’s-eye view of the survivability terrain, so that we can find another path that does lead to a higher level and yields an entirely new way of performing maintenance on our IT infrastructure. However, this requires the active participation of IT management. Simply continuing to do what we have always done in the past will lead to an evolutionary dead end and eventual extinction.

All forms of self-replicating information suffer a similar problem because all of them evolve by means of the Darwinian mechanisms of innovation and natural selection, and eventually paint themselves into an ecological corner on a localized peak of Richard Dawkins’ Mount Improbable. This is also an important consideration in today’s divisive political landscape within the United States. We now have many citizens in one political faction who, although they have very little confidence in Darwinian thought, are nevertheless strident supporters of capitalism, a Darwinian system of economics. As an 18th-century liberal and 20th-century conservative, I am also a strong supporter of capitalism, but being a follower of Darwin as well, I am keenly aware of its limitations.

For example, today’s complex system of medical care, composed of a large network of doctors, hospitals, insurance companies, employers and healthcare consumers, was not designed by anyone. Instead, it evolved on its own through small incremental changes over a period of more than 100 years. And for most of that period, it did a fine job of providing healthcare through the free market mechanisms of Darwinian inheritance and innovation honed by natural selection. However, in recent decades it has evolved itself to a localized peak on its climb to the summit of Mount Improbable. As the population aged and the demand for healthcare services rose, the ecological niche to which this Darwinian medical system had superbly adapted itself changed, and the medical system found itself trapped upon a localized peak with nowhere to go. This very complex medical system of doctors, hospitals, insurance companies and healthcare consumers had adapted itself to the concept of employer-provided healthcare insurance, but with increasing costs and competition from overseas corporations that did not have to provide healthcare insurance to employees because their countries had nationalized healthcare systems, U.S. employers began to drop coverage of employees and consequently produced more and more citizens with no insurance at all. In response, doctors and hospitals began to shift the costs of the uninsured population to the insured population, causing the price of medical coverage for the insured to rise dramatically. This caused even more employers to drop healthcare insurance, expanding the population of the uninsured, and therefore creating a positive feedback loop eating away at the population of insured people. Also, people who had lost coverage at work, and who had pre-existing conditions, found that they could not buy coverage at any price, leading to even more high-cost patients showing up at emergency wards, waiting for very expensive free medical treatment. Treating people in emergency rooms is probably the most expensive and least efficient way to provide medical care, so this, in turn, strengthened the positive feedback loop that was eating away at the number of insured people by raising costs for all.
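The positive feedback loop described above can be made concrete with a toy Python model of my own, using invented parameters rather than actual healthcare data: each year, the costs of the uninsured are shifted onto the premiums of the insured, and the higher the premium grows, the larger the share of coverage that gets dropped, shrinking the insured pool further.

# Toy model of the cost-shifting feedback loop (invented parameters,
# purely illustrative, not actual healthcare data).
insured_fraction = 0.85     # share of the population with insurance
premium = 1.0               # premium index, 1.0 = starting level
BASE_DROP_RATE = 0.01       # baseline rate at which coverage is dropped
COST_SHIFT = 0.25           # share of uninsured costs shifted onto premiums

for year in range(1, 11):
    uninsured_fraction = 1.0 - insured_fraction
    # Costs of the uninsured are shifted onto the shrinking insured pool.
    premium *= 1.0 + COST_SHIFT * uninsured_fraction / insured_fraction
    # The higher the premium, the more coverage gets dropped.
    insured_fraction *= 1.0 - BASE_DROP_RATE * premium
    print("year %2d: premium index %5.2f, insured %5.1f%%"
          % (year, premium, 100.0 * insured_fraction))
# Each pass raises premiums, which shrinks the insured pool, which raises
# premiums further on the next pass -- a positive feedback loop.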

In response, the federal government finally stepped in and essentially instituted Richard Nixon’s CHIP (Comprehensive Health Insurance Plan), first proposed by Nixon on February 6, 1974. For details on the CHIP in Richard Nixon’s own words see:

http://www.kaiserhealthnews.org/stories/2009/september/03/nixon-proposal.aspx

The one difference between the Patient Protection and Affordable Care Act of 2010 and Richard Nixon’s 1974 CHIP was that the CHIP did not have a mandate for citizens to purchase health insurance policies. True, employers were required to provide health insurance for full-time employees, and the CHIP also instituted pools of competing insurance companies for people who were not covered at work; these insurance pools would be forced to take on all comers, even those with pre-existing conditions, but people were not required to buy insurance policies from them. You see, a mandate was not really needed back in 1974 because the Emergency Medical Treatment and Active Labor Act of 1986, which required hospitals to provide free emergency medical care to anyone who happened to show up at an emergency room, had not yet been enacted, so back in 1974 there was still a strong incentive for people to take on the personal responsibility of purchasing healthcare insurance from the pools if they were not already covered at work.

By the way, as an 18th-century liberal and 20th-century conservative, I contend that Richard Nixon has been unjustly shortchanged by history. After all, Richard Nixon gave us détente with the Soviet Union, opened the door to China, ended the war in Vietnam, created the Environmental Protection Agency (EPA), supported and signed the Clean Air Act of 1970, formed the Occupational Safety and Health Administration (OSHA), endorsed the Equal Rights Amendment, supported and signed the National Environmental Policy Act requiring environmental impact statements for Federal projects, extended Social Security benefits to the blind and disabled, and negotiated SALT I, the first comprehensive arms limitation pact signed by the United States and the Soviet Union, along with the Anti-Ballistic Missile Treaty, which sharply limited the deployment of systems designed to intercept incoming missiles. By today’s crazed standards, Richard Nixon was a flaming socialist. In reality, Richard Nixon was a strong advocate for the miracles that the Darwinian system of capitalism can produce, but at the same time, he was keenly aware of its limitations and, under his concept of a “New Federalism”, of the role that the federal government could play in filling in the gaps.

Comments are welcome at scj333@sbcglobal.net

To see all posts on softwarephysics in reverse order go to:
https://softwarephysics.blogspot.com/

Regards,
Steve Johnston
