Agile Tips

#34-The Principle of the Useful Illusion Part 1

Scott L. Bain

This is kind of a fun one.  I start by pointing out that, regardless of appearances, computers are not really doing any of the things we think they are doing.  Examining this fact, and it is a fact, leads to some interesting and, I think, useful insights.  Next week, I'll drive this idea into practicality.

Computers cannot add.

This is not a flippant or cleverly misleading statement. It is a plain fact.  Computers, regardless of all outward appearances, cannot add two numbers together and determine their sum.  Computers do not know what numbers are, or what adding is, or anything else for that matter.  They certainly cannot add.

Clearly, they appear to do so – routinely and constantly and without fail.  That computers appear to add numbers together, and yet they do not, is an example of what I call "the principle of the useful illusion."

To understand the importance of this, we must first come to agree that computers cannot, in fact, add.
 
Computers cannot add in the same sense that hammers cannot build houses.  Hammers can be used to build houses, but we can easily understand that it is the carpenter wielding the hammer who is doing the building, not the hammer itself.

Hammers and computers are the same in this sense.  In fact, it is interesting to note that a computer and a hammer operate, ultimately, in the same way.  They obey the laws of physics, as they must, and when they are placed in a given situation, they will affect their surroundings in a predictable way. 

Raise the hammer up high and propel it downward, providing it energy through the use of your muscles and with the assistance of gravity.  Given the Newtonian principle of inertia, when the hammer encounters an object along its path of motion, the effect will be to impart kinetic energy to that object, and thus affect it in some way.  The design of the hammer is such that it reliably causes nails (which were also designed with an understanding of physics in mind) to penetrate wood and other porous materials.

Computers are designed to operate in consideration of the laws of magnetism, electricity, thermodynamics, and so forth.  Press the key on your keyboard that is inscribed with the "A" symbol, and you will set into motion a series of events within the device, each a practical application of a principle of physics, which will ultimately cause a particular pattern of phosphors or pixels on your monitor or LCD panel to change color.  To you, the human operator, the letter "A" will have "been entered" into the computer, but this is only your interpretation.

In fact, a series of electromagnetic interactions will have taken place, and nothing more.  The humans who designed the computer and the software it is running arranged for this illusion.  Therefore, the illusion tends to be consistent with our expectations, since we are like-minded with those designers.

This is an off-putting notion at first, considering the enormously complex ways in which we make use of our computers.  But it is easier to accept when we consider other, equally complex interactions that we engage in with technology.

Television is a good example.  When I watched "The West Wing" on my television, I understood that Martin Sheen was not actually "in there."  I knew that I was actually watching a pattern of colored light that appeared to "be him."  And I did not ascribe the entertainment I experienced to the television itself: I would never exclaim "my television sure was dramatic this week!"  I understand that it is the people behind the recorded presentation who are actually "creating the story."

We tend to forget this concept when we interact with computers, however.  We think of the computer "doing" things all the time – calculating payroll, showing us monsters and letting us shoot them, recording our emails and letters and printing them, and so on.  But this is no more "real" than the seemingly six-inch-high actor inside your television is real.  The interaction with the computer is more complex than the interaction with the television, and far more complex than the interaction with the hammer, yet it is essentially the same: we are affected in a given way by the activity of the device, and we find this effect entertaining, constructive, enlightening, or otherwise useful.

This leads me, however, to reconsider what we are doing when we create systems.  At the very least, we have to acknowledge that software, like the computer, does not do anything.  People do things and can use automation as a tool to help them.  Perhaps it is easier to keep this clearly in mind if we say it this way: machines cannot accomplish anything.  Accomplishments belong solely to people.

For some, I think this perspective will make it more natural to become validation-centric rather than function-centric in the way they design and develop features.  In other words, it moves us away from "does it work?" as the primary concern, and toward "does it let the users accomplish what they need?"
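
To make the contrast concrete, here is a minimal, hypothetical sketch in Python; the payroll feature, the names, and the tests are illustrations of mine, not examples from the article.  Both tests exercise the same calculation, but the first is framed in terms of the mechanism, while the second is named and asserted in terms of what a payroll clerk needs to accomplish.

    import unittest


    def compute_gross_pay(hours_worked: float, hourly_rate: float) -> float:
        # The calculation the computer appears to "do" for us.
        return round(hours_worked * hourly_rate, 2)


    class FunctionCentricTest(unittest.TestCase):
        # "Does it work?" -- verifies the mechanism in isolation.
        def test_multiplies_hours_by_rate(self):
            self.assertEqual(compute_gross_pay(40, 25.00), 1000.00)


    class ValidationCentricTest(unittest.TestCase):
        # "Can people accomplish what they need?" -- the same check,
        # expressed in the payroll clerk's terms rather than the machine's.
        def test_clerk_can_pay_an_employee_correctly_for_a_full_week(self):
            gross = compute_gross_pay(hours_worked=40, hourly_rate=25.00)
            self.assertEqual(gross, 1000.00,
                             "a 40-hour week at $25/hr should produce a $1000 paycheck")


    if __name__ == "__main__":
        unittest.main()

The mechanism is identical in both cases; the difference is what we consider the primary thing being verified – the behavior of the code, or the accomplishment of the person using it.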

Why does this matter to an agile practitioner?  I'll get into that next week.