Excerpt from: WHY DOES SOFTWARE COST SO MUCH? (And Other Puzzles of the Information Age) By Tom DeMarco

The lesson of history, though often stated simplistically, is never simple, and we do end up repeating it again and again. Our own field of software development is a perfect example. We seem to be stuck in a giant loop, repeating the same dumb mistakes in one project after another. After all these years, we still can't estimate, we still can't believe that we can't estimate (so we trust the latest set of numbers, even though the last few hundred sets were proved unrealistic), we still can't specify, we still can't reuse much of anything we've built before, and we still can't deliver software for what consensus dictates is the "right" price or in the "right" elapsed time.

On one project after another, we excuse doing things in a way that everyone knows is wrong because "there isn't time to do it right." We code before design, we design before specification, we specify before understanding the requirements. Then, at the end, we have a Lessons Learned meeting, where we point out that we really shouldn't do any of those things. Lessons Learned sessions are all the same (I know, I attend a lot of them). The people and the organizations and the applications may be different, but the actual lessons learned are the same. People raise their hands and grumble, "We really shouldn't set schedules based entirely on what Marketing would like to see and without regard to how much work there is to do." Everyone nods sadly. Another lesson learned. Again. Am I the only one who ever wonders why we keep relearning lessons we thought we had learned years ago?

THE TRANCE STATE CONJECTURE

When you take a wrong turn, then realize it's a mistake and backtrack, an automatic learning process is invoked that helps you avoid the mistake in the future. The process is not infallible, though. If you don't travel the same route for a few years, you may well make the same mistake again, but probably only one more time. The reinforced learning process is powerful. The second backtracking is almost certainly accompanied by a good deal of your muttering, "What a dummy!"

If you travel the same route every day and every single day make the same mistake, taking exactly the same wrong turn, there is something else going on. It's not that you haven't learned, but rather that your learned response is being overwhelmed by some unconscious but dominating need. I suggest the need is unconscious, because if it were conscious, you would understand its role in your route choice and not think of the turn as a "mistake." You might see it as unwise or imprudent or hopeless, but it is not a simple error.

When your actions are driven by forces that you don't completely understand, you are by definition in a kind of trance. The forces that drive you are hidden rules.

The propensity of software organizations to make the same mistakes again and again leads me to believe that these organizations are in a trance. On a conscious level, they believe their decisions are governed by clearly articulated and widely known rules like

  Keep quality high

  Leave time for unanticipated problems

  Respond to user needs

  Work hard

  Keep promises

but there are also hidden rules at work. These hidden rules work on us without our noticing. They are universal, or nearly so; though they are never stated, we all understand them. When a hidden rule is in conflict with one of our publicly stated rules, the hidden rule almost always wins out.

The task I've set myself in this essay is an impossible one: to reveal all the hidden rules so we can acknowledge their influence and deal with them sensibly. Of course, I can never figure out what all the hidden rules are (after all, they're hidden), and I can't expect to know what special hidden rules may be working on you. But I do see a few that seem to apply widely.

In the following list, I predict, you'll find a number of them that are at work within your organization.

HIDDEN RULE #1: FALSE PRECISION OF ESTIMATES

Suppose your boss asks for an estimate of how long it will take the team you currently have assigned to the Platypus Project to get the design wrapped up. And suppose you respond, "From 13 to 68 days." What reaction would you expect?

Few upper managers in my experience would sit still for such imprecision. They expect much tighter tolerances on an estimate, even though past estimating performance has shown such tight tolerances to be completely unjustified.

Common sense requires that the tolerance applied to your next estimate be consistent with the accuracy of your past estimates. If you and your organization have a history of estimates that are from 10 percent pessimistic to 150 percent optimistic, then your subsequent estimates should be packaged with similar error bands around them. Common sense requires this, but here common sense is at odds with a hidden rule. The hidden rule is that managers are supposed to be able to estimate with great precision. They are not, however, obliged to estimate with great accuracy. In such an organization, you would do better to estimate 15 days and be off by a factor of 4 than to break the hidden rule with an estimate like "from 13 to 68 days."

How much precision does the hidden rule require? Work it out in your own case by considering how wide a tolerance your boss would let you apply. My gut feel is that tolerances of ±10 percent are pretty generally accepted, but nothing greater. Since estimates are usually required when 1 percent or less of the work is complete, ±10 percent tolerances range from unreasonable to ludicrous. Nobody's estimating history could justify such a narrow confidence band.

The first time you try to use past experience as a guide for setting tolerances on new estimates, you're bound to hear this objection: "But we can't make sensible business decisions with such wide unknowns." Your response ought to be, "Is fibbing going to help?"
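
To make the arithmetic concrete, here is a minimal sketch (mine, not the book's) of what using past experience as a guide might look like. All the project figures and the error_band helper are hypothetical; the idea is simply to scale a new estimate by the best- and worst-case actual-to-estimate ratios the organization has actually produced.

    # A minimal sketch of calibrating a new estimate against past
    # estimating performance. All figures here are hypothetical.

    # (estimated_days, actual_days) pairs from past projects
    history = [
        (30, 27),    # finished early: 10 percent pessimistic
        (20, 38),
        (45, 100),
        (60, 150),   # 150 percent optimistic
    ]

    def error_band(new_estimate_days, history):
        """Scale a new estimate by the smallest and largest
        actual-to-estimate ratios observed so far."""
        ratios = [actual / estimated for estimated, actual in history]
        return (new_estimate_days * min(ratios),
                new_estimate_days * max(ratios))

    low, high = error_band(15, history)
    print(f"Quote it as {low:.0f} to {high:.0f} days, not 15 days flat.")
    # With this history, that prints: Quote it as 14 to 38 days, not 15 days flat.

An honest band like this is exactly what the hidden rule forbids, which is the point: the arithmetic is trivial; the resistance to it is not.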

HIDDEN RULE #2: POWER SHIFTING

Workers in other fields must complain, as we do, that there is always too much politics in their organizations. But software workers are indeed subject to an extra portion of politics, due to an effect that is seldom discussed. It has to do with subtle changes in the power structure that accompany installation of any new system. The rule is this:

  Any time a new system is installed or an old one changed
  substantially, somebody gains and somebody loses power.

Those who build the system and put it into place are acting as agents for this changed power structure.

The parties who stand to lose the most are often the very ones that the system builders have to interact with in order to understand system functionality. These power-losers know that they are essential to the success of the new system. As you might imagine, they are not reluctant to use their temporary strength to force change on the new system, change that will conserve some of their eroding power. Or if that is not possible, they may use their present position destructively to hurt the system-building process and to make it painful for the builders.

A particularly ugly kind of project is one that effects a corporate consolidation, gathering the reins of power from the outlying regions and pulling them into a head office. The injured parties are many and the consolidators are few (but highly placed). The stakes are high and emotions higher. Nobody escapes from such projects without some damage.

If you fail to recognize power shifting, the politics at work on your project will be incomprehensible to you - incomprehensible and potentially deadly. The trance conjecture tells us that no matter how obvious the power shifts are, you may not be able to see them easily. You need to take mechanical steps to force your eyes to see: Ask yourself, Who stands to gain power? Who stands to lose? And how much? What behaviors are encouraged by these potential gains and losses?

HIDDEN RULE #3: ANGER

At a cocktail party one evening, I got into a discussion with a perfect stranger about our two different lines of work. She was in advertising. When I told her I was in computing, she smiled pleasantly. "Oh, that must be wonderful," she said. "Imagine a kind of work where nobody's feelings ever get hurt!" She assumed that since there is a well-defined, deterministic machine at the center of what we do, there would be no possibility of ugly interaction.

Interaction with the computer itself probably can't injure our feelings. We may be frustrated by a race condition or a subtle bug, but it never makes us feel unloved. There is, however, another side to our work. We spend a lot more time interacting with people than with machines, and, as you know, these interactions can be as ugly as those in any other profession.

The worst interactions involve a lot of anger. When the anger comes down on you from above, it can be particularly upsetting. What's surprising is that some managers routinely use anger to serve business purposes: a manager deploys fury, for example, to spur the project toward on-time completion or reduced cost. The cause of the anger seems to lie outside the manager's normal emotional domain. If you slept with that manager's spouse, you could understand the resultant emotional outburst, but for a missed schedule? Or a late milestone?

The hidden rule at work here is a complex one having to do with the acceptability of emotions. This rule dictates that some emotions are distinctly not okay. Included in this group are fear and sadness. Other emotions are okay, particularly anger. The rule causes us to substitute an acceptable emotion for an unacceptable one. In most cases, this means showing anger when we feel fear.

Anger in the workplace almost always replaces fear. When a boss yells at a subordinate over a slip or a defect, the boss is scared. When he rages, it means he is terrified.

The rule that anger is okay but fear is not is built deep into the human firmware. There is probably no changing it. But that doesn't mean there is nothing you can do to curb abusive anger by managers. If everyone understands the fundamental workplace truth that

        Anger = Fear

then abusive anger will become a thing of the past. Managers who scream in rage will only be proving to everyone that they are scared to death. Since fear is not okay, they will have to stifle it. That doesn't solve the underlying problem, but it might at least lessen the impact on powerless subordinates.

HIDDEN RULE #4: FAT

A hidden rule about management is that managing people is not by itself enough to justify managers' salaries. In order to be secure from the next round of cost cutting, managers must be doing something other than just being boss. Managers who spend all their time giving direction to their projects, motivating individuals, and helping teams to form are nothing but (shudder) overhead, pure fat to be trimmed the next time there is trimming to be done.

This attitude affects us in two different but equally deadly ways: First, it contributes to the erosion of management in our organizations, the flattening of hierarchies so that even junior managers may have as many as twenty people directly reporting to them. No one can manage twenty people; setting up an organization with such wide management scope is a denigration of the management function. It assures that no real management gets done.

The second way this hidden rule hurts us is that it entices managers to find other roles to occupy them, in order to justify themselves. They get involved with the technology, serving as the project's chief architect or its ultimate quality gate. They go out as support on sales calls or pitch in with new ideas about extending the client base or marketing new services. Again, the result is that people are left unmanaged.

As Barry Boehm asserts, "Poor management can increase software costs more rapidly than any other factor." The core functions of hiring, directing, and motivating people are what give management this impact. If management is so important, shouldn't we let managers do it? We should, but we don't because the hidden rule says not to.

HIDDEN RULE #5: DENIAL

The final hidden rule has to do with how we handle bad news. It suggests that admitting even the smallest defeat is defeatism. The manager who says, "Look, we're not going to make the April cutover date; let's start planning now for how we can deliver in July," is viewed as a defeatist. He or she would be better off politically by saying, "April at all cost. We can do it!" even though time went on to prove that April was impossible. Better off, even though late recognition of the facts made July impossible as well.

The attitude that is being thrust upon us is called "can-do management," an attitude that is responsible for more debacles than all other causes together. The can-do mentality has the effect of stopping bad news from moving up a hierarchy. Upper management will never even know that April is a problem because the can-do manager's "We can do it!" is the only message that makes it to the upper levels.

Can-do thinking makes risk management impossible. Since acknowledging real risk is defeatism, the risk management function in a can-do organization is restricted to dealing with those smallish risks that can be mitigated by quick action. That means you confront all the risks except the ones that really matter.

Defeatism is the gloomy tendency to think only of defeat even though victory may well be possible. It is not defeatism to acknowledge a setback. The best organizations deal with setbacks, even major lost battles, all the time.

Can-do management seems to work well only as long as nothing goes wrong. But things do go wrong at some time in most endeavors. The can-do organization then stays the course, ignores the truth that is known at all the lower levels, and thus escalates what might have been a minor setback into a true disaster.

When hidden rules stay hidden, they can do immeasurable harm. They gull us into making the same mistakes over and over and over again. We may never be entirely free of their influence, but we can do better. By seeking hidden rules, naming them, and discussing them in the open, we can hope to deprive them of some of the power they have over us.

Excerpt from:
WHY DOES SOFTWARE COST SO MUCH? (And Other Puzzles of the Information Age)

Author:
Tom DeMarco

ISBN:
0-932633-34-X (December 1995)

Publisher:
Dorset House Publishing
353 West 12th Street, New York, New York 10014 USA
1-800-DH-BOOKS (1-800-342-6657) 212-620-4053