When I read an article written by Eric Garland, a strategic analyst, I was disheartened. Garland recently quit his job of 15 years as an analyst, and he expressed his frustration with decision makers in an article published in The Atlantic titled “How So-Called Strategic Intelligence Actually Makes Us Dumber.” What is your reaction to this excerpt from his article?
“I am not quitting this industry for lack of passion, as I still believe – more than ever – in using good information and sophisticated analytical techniques to decode the future and make decisions. The problem is, the market for intelligence is now largely about providing information that makes decision makers feel better, rather than bringing true insights about risk and opportunity. Our future is now being planned by people who seem to put their emotional comfort ahead of making decisions based on real – and often uncomfortable – information. Perhaps one day, the discipline of real intelligence will return triumphantly to the world’s executive suites. Until then, high-priced providers of ‘strategic intelligence’ are only making it harder for their clients – for all of us – to adapt by shielding them from painful truths.”
My reaction was that he has a point. Relying on intuition, self-interest, and office politics runs the risk of confirmation bias – believing in an answer before viewing the facts.
Preconception bias
Weak leaders are prone to preconception bias. They can be blind to evidence, somehow believing that intuition, instinct, and gut feel are acceptable substitutes for fact-based information.
Psychologists refer to this as confirmation bias. What often trips managers up is that they do not frame a problem before collecting the information that will lead to their conclusions. They often subconsciously start with a preconception; that is, they seek data that will validate their bias. The adverse effect is that they prepare themselves for X when Y is actually happening. By framing a problem and considering alternative points of view, one widens the options for formulating hypotheses. This is where the emerging discipline of analytics fits in. With fact-based information, organizations gain insights and views that they might otherwise have missed.
Mental shortcuts, gut feel, intuition, and the like typically work until problems get complex. When problems or opportunities become complex, a new set of issues arises, and systematic thinking and the application of analytics are required.
In the book Analytics at Work: Smarter Decisions, Better Actions, co-author Jeanne G. Harris of Accenture notes that forty percent of important decisions are based not on facts but on intuition, experience, and anecdotal evidence. An immediate impression is that this is sad. One school of thought, however, holds that intuition and experience are reliable for decisions – provided the decision maker has exceptional intuition and experience. But that is a prerequisite. What if it doesn’t sufficiently exist? Just look at the 2008 global economic meltdown. There were many smart minds managing the global economy, and look at what happened.
Analysts’ imagination sparks creativity
In contrast, a curious person – and curiosity is a trait of analysts – always asks questions. Analysts query data to answer questions, then use analytics to ask further and more probing ones. Better yet, their analytics can answer those questions.
Analysts typically love what they do. If they are good with analytics, they infect others with their enthusiasm. Their curiosity leads to imagination, which considers alternative possibilities and solutions. Imagination, in turn, sparks creativity.
When analytics are applied, they take the “thumb off the scale” that managers use to tilt results in their own favor. Eric Garland was a victim of too many heavy thumbs. As analytics is increasingly embraced, fingers on the scale can be removed.