IR Folks from Times Past

Sunday, May 22, 2011

The Risks of Quantification

From the Harvard Business Review blog, an essay by William Byers
Quantification — describing reality with numbers — is a trend that seems only to be accelerating. From digital technology to business and financial models, we interact with the world by means of quantification.
While we all interact with the world through more-or-less inflexible models, mathematics contributes to this lack of flexibility because it is seemingly precise and objective. Even though mathematical models can be very complex, you can use them without understanding them very well. A trader need not really understand the financial engineering models that he may use on a daily basis. This uncritical acceptance amounts to the assumption that reality is identical to our rational reconstruction of reality — for example, that the economy or the stock market is captured by our latest model.
Hitting the bottom line is certainly easier when you know with precision what that bottom line is. Numbers give you this precision, making you feel more secure about where you are going — higher revenues, lower costs. Defining the risk of an unlikely event may make us feel like we've dealt with the threat. This is part of the contribution of the "quants" on Wall Street, people with doctorates in math and physics. They bring with them a culture of quantification that they carry over from the mathematical and physical sciences and then apply to economic and financial situations. The complex mathematical models that they use may be brilliant, but the better they are, the greater their potential to misrepresent the actual, human situation they are looking at. Because they give people like CEOs and politicians the idea that everything is understood and under control, these models often have the perverse effect of generating new, unanticipated problems.
What we misunderstand is that introducing mathematics and quantification into any situation subtly changes that situation and this needs to be taken into account. We are attached to our analytic and computational tools and have become blind to their limitations.
Consider the following four issues:
  1. Statistical models are all based on the notion of randomness, but no one can really understand randomness. Many people use the word random without realizing that random means what it says — randomness cannot be predicted or controlled. A model of randomness is no longer true randomness.
  2. Because they are logically consistent, mathematical models screen out ambiguity. Ambiguity is real, but business and financial models have little to no room for it. Ambiguity arises whenever there are two (or more) courses of action that are equally important yet conflict with one another. For example, we need nuclear reactors to meet some of our energy needs and to mitigate global warming. At the same time, nuclear power, as we see from Japan, comes with huge downsides. This same ambiguity is at the heart of most policy and business decisions, but mathematical models have taken out the ambiguity.
  3. Putting a situation into numbers reinforces the belief that things are linear and that events are necessarily comparable, since for any two distinct numbers one is larger and one is smaller. Suppose that you modeled teaching on a scale that goes from 1 to 10. You would conclude that a teacher with a score of 8.3 was better than one with a score of 6.8, but this would inevitably ignore all kinds of qualitative factors. Models simplify things by ignoring certain aspects of the situation, but we can never be sure which factors should be included and which should be omitted.
  4. Any system that involves human behavior, like economics or finance, is inherently self-referential. We are all participants in, not just observers of, these systems. Self-referential systems are notoriously difficult to control and predict because they are often chaotic: deterministic in principle, yet so sensitive to initial conditions that long-range prediction is impossible. Most mathematical models do not take this element of self-reference seriously, and in any case chaotic systems, by their nature, cannot be predicted over the long run (see the sketch after this list).
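To make points 1 and 4 concrete, here is a minimal Python sketch; the seeded-generator and logistic-map examples are illustrative additions, not Byers' own. A pseudorandom generator seeded with a fixed value reproduces exactly the same "random" sequence on every run, so a model of randomness is deterministic rather than truly random. And the logistic map, a textbook chaotic system, shows how two starting values that differ by one part in ten billion end up in completely different places within a few dozen steps, which is why even a fully specified chaotic system resists long-range prediction.

    import random

    # Point 1: a "model" of randomness is deterministic.
    # A pseudorandom generator seeded with the same value reproduces
    # exactly the same sequence every time the program runs.
    rng_a = random.Random(42)
    rng_b = random.Random(42)
    print([rng_a.random() for _ in range(3)])
    print([rng_b.random() for _ in range(3)])  # identical to the line above

    # Point 4: deterministic chaos defeats long-range prediction.
    # The logistic map x -> r*x*(1 - x) with r = 4 is fully specified,
    # yet two starting values differing by one part in ten billion
    # diverge completely within a few dozen iterations.
    def logistic(x, steps, r=4.0):
        for _ in range(steps):
            x = r * x * (1.0 - x)
        return x

    x0 = 0.4
    x0_nudged = 0.4 + 1e-10
    for steps in (10, 30, 50):
        print(steps, logistic(x0, steps), logistic(x0_nudged, steps))

By step 50 the two trajectories bear no resemblance to one another, even though nothing random has happened anywhere in the calculation.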
How do we make use of the realization that uncertainty is inevitable and our best models have blind spots? Simple acknowledgment that uncertainty is unavoidable already changes things in a fundamental way. However, this acknowledgment must be authentic and must overcome our psychological aversion to uncertainty. Uncertainty tends to make situations more complex and, therefore, projects more costly — another reason why it tends to be ignored or downplayed. Admitting uncertainty means facing reality — and our own needs for security.
But admitting uncertainty is not enough. We must learn to actively embrace uncertainty and work with ambiguity. There are two reasons for this. First, as I discuss in my books, ambiguity is not only a part of the world we are modeling, but is even present in the mathematics itself. Second, the creative breakthroughs we are looking for will only come by actively engaging ambiguity and the uncertainty it brings in its wake.
William Byers is a professor of mathematics at Concordia University in Montreal and the author of The Blind Spot: Science and the Crisis of Uncertainty and How Mathematicians Think: Using Ambiguity, Contradiction, and Paradox to Create Mathematics.