What Are Variables? What Are “Covariations” with a “Time Lag”? How Do They Reveal System Structure—and Information for Decision-Making?

This webpage provides:

§         What Are Variables and How Can They Interrelate?

§         Why Are Systems Difficult to Understand?

§         What 7 Methods Can Help Us Succeed in Working with Systems?

§         A Tip: Choose to Be Part of the 30% (to 50%) Who Check to See If What They Did Worked and NOT the 70% (to 50%) Who Never Check

 

What Are Variables and How Can They Interrelate?

The word variable is a key term in Dörner’s definition of a system as “a network of many variables in causal relationships to one another.” (p. 73) These variables are not passive; they interrelate and thereby have many “side effects and long-term repercussions”:

Complexity is the label we will give to the existence of many interdependent variables in a given system. The more variables and the greater their interdependence, the greater that system’s complexity. Greater complexity places high demands on a planner’s capacities to gather information, integrate findings, and design effective actions. The links between the variables oblige us to attend to a great many features simultaneously, and that, concomitantly, makes it impossible for us to undertake only one action in a complex system. [To act on an interrelated variable is to take many actions at once.]

 

A system of variables is “interrelated” if an action that affects or is meant to affect one part of the system will also always affect other parts of it. Interrelatedness guarantees that an action aimed at one variable will have side effects and long-term repercussions. A large number of variables will make it easy to overlook them. [bold added] (p. 37)

If you would like more detail on variables, including “critical variables” and “indicator variables” and on “well-buffered systems” and feedback (negative and positive) in a system, click here.

Why Are Systems Difficult to Understand?

The challenge of systems is that they are intransparent: “one cannot see everything one would like to see.” Beyond intransparency, systems are also detailed, interrelated, and complex, as Dörner’s visual image of a system highlights:

[Systems are detailed.]

[W]e could liken a decision maker in a complex situation to a chess player whose set has many more than the normal number of pieces, several dozen, say.

 

[Systems are interrelated.]

Furthermore, these chessmen are all linked to each other by rubber bands, so that the player cannot move just one figure alone.

 

[Systems are complex—they have “many interdependent variables.”]

Also, his men and his opponent’s men can move on their own and in accordance with the rules the player does not fully understand or about which he has mistaken assumptions.

 

[Systems are intransparent.]

And, to top things off, some of his own and his opponent’s men are surrounded by a fog that obscures their identity. (p. 42)

If you want more on why it is difficult to understand systems, click here.

What 7 Methods Can Help Us Succeed in Working with Systems?

Dörner provides many methods to help us understand systems and act usefully within them. The methods below are specific to using “covariations” (interrelated variables) over time to understand the structure of a system. Also see his methods with analogy and other basics and his methods for working with change over time.

 

1.       Observe the behavior of interrelated variables, such as the ones in the paragraph below, to learn the “structure of a system.” In that paragraph, Dörner discusses a “predator-prey system,” but not all interrelated variables have to be of that type. Possible application of the method: identify possible variables and see if any of them seem to be increasing or decreasing over the same period, or if most of them started in the last decades. Tip: This website proposes a set of variables related to students at this link.

How do we acquire knowledge about the structure of a system? …   Another method…is to observe the changes that variables undergo over time. If we observe in a given ecosystem that an increase in animal population A is followed by an increase in animal population B and if we then observe a decline in population A followed by a decline in population B, we can assume that animals of type B feed on animals of type A and that the two populations form a predator-prey system. (p. 79)
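This observation method can be sketched in code. The sketch below is not from Dörner; the population counts and the helper names (`pearson`, `best_lag`) are invented for illustration. It checks whether one variable tracks another after a delay, as in the predator-prey example:

```python
def pearson(xs, ys):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def best_lag(a, b, max_lag=5):
    """Correlate b against earlier values of a at each candidate lag;
    return the lag at which the two series covary most strongly."""
    scores = {lag: pearson(a[:len(a) - lag], b[lag:])
              for lag in range(max_lag + 1)}
    return max(scores, key=scores.get)

# Invented yearly counts: prey (A) rises and falls; predators (B)
# follow roughly the same curve about two years later.
prey     = [10, 14, 20, 26, 24, 18, 12, 9, 8, 10]
predator = [4, 4, 5, 7, 10, 13, 12, 9, 6, 5]

print(best_lag(prey, predator))  # strongest covariation is at a 2-year lag
```

A strong correlation at some lag is only a clue to structure, not proof of causation; Dörner’s point is that collecting and integrating such data over time is how structural hypotheses are formed.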

2.       Plot the data, including whether your variables are changing over time. Possible application of the method: Find a way to record your data that matches your data, your technical skills, and where you are in your process. Tip: Dörner primarily uses scientific plots. Analysts such as Edward Tufte, however, have investigated tabular forms and other forms to reveal data; he provides examples in his books. This website provides a variety of examples of ways to plot data.

The observation of covariations, between which there may be a time lag, is one way of acquiring structural knowledge, and all it requires is the collection and integrating of data over time. (p. 79) [Plotting this data so that change over time is visible is a key method.]

3.       Continue to gather data.

Even after we know enough about a system to understand its structure, we must continue to gather information. We need to know about the system’s present status so as to predict future developments and assess the effects of past actions. These requirements make information essential for planning. [bold added]

4.       Be attentive to “time lags” because they may obscure the root cause of the problem: that by “solving problem X, we create problem Y.”

Contradictory goals are the rule, not the exception, in complex situations. In economic systems costs and benefits are almost always at odds…. [bold added] More dangerous are the situations in which the contradictory relation of partial goals is not evident.

[Gives example of goals of liberty and equality in “days of French revolution.”]

 

Unrecognized contradictory relations between partial goals lead to actions that inevitably replace one problem with another. A vicious circle is commonly the result. By solving problem X, we create problem Y. And if the interval between the solutions is long enough that we can forget that the solution of the first problem created the second one…someone is sure to come up with an old solution for whatever the currently pressing problem is and will not realize that the old solution will create problem X again and send the circle into another cycle.

 

The same thing happens when current problems are so urgent that we will do anything to be rid of them. This, too, can produce a vicious circle in which we flip-flop between two problematic solutions [as in headache/stomach cures]. (pp. 65-67)

5.       Check to see that what you expected to happen did happen.
Caution: In only 30 percent of decisions did participants ask about the “results … of those decisions.” In other words, in 70 percent of the decisions, participants assumed things worked as they had expected.

[P]eople look for and find ways to avoid confronting the negative consequences of their actions. One of these ways is “ballistic behavior.” A cannonball [unlike a rocket] behaves ballistically. Once we have fired it, we have no further influence over it….

 

[W]e would expect that rational people faced with a system they cannot fully understand would seize every chance to learn more about it and therefore behave “nonballistically.” For the most part, however, the experiment participants did not do that. They shot off their decisions like cannonballs and gave hardly a second thought to where those cannonballs might land…. [What does “for the most part” mean? “In the first five years of the teams’ activity, the average control figure was 30 percent; that is, out of a hundred decisions the participants made, they later inquired about the results of thirty of those decisions…. In the second five-year period, control rose to more than 50 percent.”]

 

If we never look at the consequences of our behavior, we can always maintain the illusion of our competence. If we make a decision to correct a deficiency and then never check on the consequences of that decision, we can believe the deficiency has been corrected. We can turn to new problems. Ballistic behavior has the great advantage of relieving us of all accountability. (pp. 177-179)

A Tip: Choose to Be Part of the 30% (to 50%) Who Check to See If What They Did Worked and NOT the 70% (to 50%) Who Never Check

Dörner identifies good and bad habits of problem-solving. For a list of specific traits and information about each one, click here. To repeat part of the quotation above:

They shot off their decisions like cannonballs and gave hardly a second thought to where those cannonballs might land…. [What does “for the most part” mean? “In the first five years of the teams’ activity, the average control figure was 30 percent; that is, out of a hundred decisions the participants made, they later inquired about the results of thirty of those decisions…. In the second five-year period, control rose to more than 50 percent.”]

 

If You Want More Quotations from The Logic of Failure

For who Dörner is, and for the basics about systems and systems thinking, click here. For all of the quotations from The Logic of Failure and these topics, click here:


For information or problems with this link, please email using the email address below.

WCJC Department:

History – Dr. Bibus

Contact Information:

281.239.1577 or cjb_classes@yahoo.com

Last Updated:

2012 06/04

WCJC Home:

http://www.wcjc.edu/