Welcome to issue #5 of Coffee Talk.
Feature
Decision-Obsessed > Data-Obsessed
If you work in corporate America, you’re likely familiar with modern management’s obsession with data. You’ve probably also heard corporate America’s two favorite quotes about data:
If you can’t measure it, you can’t manage it.
And...
In God we trust, all others bring data.
Both these quotes are used to reinforce the importance of data, but ironically both ignore important context (i.e., important data) that changes the meaning of the words themselves. The first, often attributed to Dr. William Edwards Deming, has had half of it cut away. Deming’s actual view was this:
It is wrong to suppose that if you can’t measure it, you can’t manage it - a costly myth.
While Dr. Deming was a lifelong proponent of leveraging data for effective management, he also believed that there are many things that cannot be measured but still need to be managed. He called running a company on visible figures alone one of the seven deadly diseases of management.
In the 1940s the US government created the Statistical Research Group, a collection of future Nobel Prize winners and Ivy League graduates that represented the country’s finest statistical minds. In true Deming fashion, the group’s mission was to leverage data to better manage the war effort abroad. Jordan Ellenberg describes one of the more difficult problems the group faced in his book “How Not to Be Wrong: The Power of Mathematical Thinking”. The problem was this:
You don’t want your planes to get shot down by enemy fighters, so you armor them. But armor makes the plane heavier and heavier planes are less maneuverable and use more fuel. Armoring the planes too much is a problem; armoring the planes too little is a problem. Somewhere in between there’s an optimum. The reason you have a team of mathematicians socked away in an apartment in New York City is to figure out where that optimum is.
The team was provided data to help make the decision: counts of where returning planes had taken bullet holes.
The military officers who collected the data thought there was an opportunity to increase protection on the areas of the plane that were being hit the most (i.e., the fuselage). Based on the data collected, that was a sound idea: there were more bullet holes in the fuselage than in any other key part of the plane. They came to the Statistical Research Group asking how much more armor they should put on the fuselage. But they got a different answer, courtesy of statistician Abraham Wald:
The armor doesn’t go where the bullet holes are. It goes where the bullet holes aren’t: on the engines.
Why? Because the army was measuring the bullet holes on the planes that came back, not the planes that were shot down.
The large number of planes returning to base with a thoroughly Swiss-cheesed fuselage is pretty strong evidence that hits to the fuselage can (and therefore should) be tolerated. If you go to the recovery room at the hospital, you’ll see a lot more people with bullet holes in their legs than people with bullet holes in their chests. But that’s not because people don’t get shot in the chest; it’s because the people who get shot in the chest don’t recover.
So it was the data that couldn’t be measured, the bullet holes on the downed planes, that really mattered. Follow the visible figures too closely, one of Deming’s seven deadly diseases of management, and it’s easy to make costly mistakes.
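To see the bias in action, here’s a minimal simulation sketch in Python. The hit counts and lethality numbers below are hypothetical, invented for illustration (they’re not from Ellenberg’s book): hits land uniformly across the plane, but engine hits are far more likely to bring a plane down, so the engine ends up underrepresented in the holes you can count on the planes that make it home.

```python
import random

random.seed(42)

SECTIONS = ["engine", "fuselage", "fuel system", "rest of plane"]

# Hypothetical chance that a single hit to each section downs the plane.
LETHALITY = {"engine": 0.6, "fuselage": 0.05, "fuel system": 0.2, "rest of plane": 0.05}

hits_taken = {s: 0 for s in SECTIONS}     # every hit, survivor or not
hits_observed = {s: 0 for s in SECTIONS}  # hits counted on returning planes

for _ in range(10_000):  # simulated sorties
    hits = [random.choice(SECTIONS) for _ in range(random.randint(0, 6))]
    survived = all(random.random() > LETHALITY[h] for h in hits)
    for h in hits:
        hits_taken[h] += 1
        if survived:
            hits_observed[h] += 1

for s in SECTIONS:
    print(f"{s:>14}: {hits_taken[s]:5} hits taken, {hits_observed[s]:5} seen on returners")
```

Run it and the engine shows by far the widest gap between hits taken and hits observed on returning planes, exactly the gap the Statistical Research Group had to reason its way around.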
The second quote, “In God we trust, all others bring data.”, has a large number of variations, contexts, and attributions. One particularly notable use was at NASA’s Johnson Space Center, where the words hung on the wall of the mission ops control room. The implication for modern managers is, “if it worked for NASA, it’ll work for us”, but there’s a cautionary tale behind those words.
Before the Challenger launch in 1986, an engineer named Bob Ebeling warned his managers of a problem with the rocket boosters’ O-rings. By his estimation they weren’t safe to operate at cooler temperatures, and NASA needed to wait for temperatures to rise before launching. But the management team had known about the O-ring problem as far back as 1977. O-ring erosion was also noted on the Columbia mission in 1981 and was never addressed. After multiple flights that showed erosion but did not end in catastrophe, management assumed the probability of failure was low and treated the O-ring issue as “an acceptable flight risk”. For Ebeling, an engineer who knew the function of the O-rings intimately, it was anything but acceptable.
The Challenger is going to blow up. Everyone is going to die.
Those were Ebeling’s words just before Challenger launched. But NASA managers trusted only in God and data, not in Bob Ebeling. Or rather, NASA managers trusted a small set of hard historical data over Ebeling’s long history of qualitative data, collected while working on the design of the O-rings themselves. When NASA’s Chief of Safety and Mission Assurance, Bryan O’Connor, retired in 2011, he commented on the negative cultural consequences of NASA’s disposition toward data:
When I first showed up at the Johnson Space Center, they had a plaque over the wall in the mission ops control room that said something to the effect of, ‘In God We Trust - All Others Bring Data.’ That was quite intimidating to a new person, because between the lines it suggested that, ‘We’re not interested in your opinion on things. If you have data, we’ll listen, but your opinion is not requested here’. A lot of us came to NASA after years of flight testing and R&D work and so on. After the Challenger accident, I really beat myself up for being too silent in the first few years that I was there, and I said to myself, ‘This agency isn’t as smart as it thinks it is.’
One of the challenges with being “data-driven” is deciding which data to follow. Hard data that you can plot on a chart? Or qualitative data from people’s experiences, like Bob Ebeling’s?
Tricia Wang, a former ethnographic researcher at Nokia, ran into a problem similar to Ebeling’s. After spending years in the late 2000s living in China and studying the evolution of Chinese consumer culture, Wang concluded in 2009 that China would become one of the world’s largest smartphone markets. This was at a time when the value proposition of an expensive, battery-draining, notebook-sized smartphone was not especially evident. But after years of watching Chinese consumers make financial sacrifices to bring smartphones into their lives, she was adamant about the opportunity. As adamant as Ebeling was that Challenger’s O-rings would fail.
Unfortunately, Nokia’s managers had seen no survey data to support her claims. All their survey data suggested that consumers did not want expensive, battery-draining, oversized phones. Wang’s research was shot down, and the consequence was that Nokia’s dominance in China rapidly evaporated.
The solution Wang proposes? Use a combination of “big data” and “thick data” to make decisions. Instead of concluding that their consumers didn’t want smartphones, Nokia could have concluded that consumers didn’t want smartphones as they existed at the time. The answer would have been to invest in more battery-efficient, slimmer, and less expensive versions of the smartphone, not to abandon the idea altogether.
While NASA had a cultural bias toward ignoring thick data in favor of big data, Nokia spent the money to invest in both thick data (by paying ethnographers to collect qualitative data over long periods of time) and big data (through massive consumer surveys). Their failure came in interpreting that data in a useful way. Nokia proves that the mere presence of data, and the ambition to make data-driven decisions, is not a panacea. The prerequisite to data of any kind is people who are well equipped to leverage it.
Whether it’s big data, thick data, or hardly any data at all, the key to success is making smart decisions. That’s something even the world’s most successful organizations struggle to do, even with all the necessary information in front of them. The cost can be lives lost and billion-dollar markets handed to the competition. Before obsessing over data, we should all do ourselves a favor and obsess over clear, thoughtful decision-making. Otherwise, data is futile.
One Word
Bricolage
The term "psychological bricolage" is used to explain the mental processes through which an individual develops novel solutions to problems by making use of previously unrelated knowledge or ideas they already possess. - Wikipedia
Quotables & Quizzicals
“We fetishize data, we think that data is the answer. It’s far from the truth. In fact, it’s ridiculous, because the data is only a simulacrum of reality in the same way that a map is not a territory. And so while we need to use information and data to make decisions as we need to do, the data is always unfaithful, always unreliable, it always misleads, and you have to torture it until it confesses.” – Kenneth Cukier, Data Editor, The Economist
How do I know if I’ve made the right choice?
Worth Reading
“How social networks got competitive again” - Platformer
Facebook has competition again! Welcome TikTok and Clubhouse.
“Citibank just got a $500 million lesson in the importance of UI design” - Ars Technica
How one honest mistake in an outdated piece of software caused Citibank to send $900M instead of $7M.
Two by Two - Scott Galloway
Galloway provides an interesting breakdown of where modern media platforms sit in the competitive landscape.
Credits
How Not to Be Wrong: The Power of Mathematical Thinking - Jordan Ellenberg
Bricolage - Wikipedia
The human insights missing from big data - Tricia Wang, TED Talk
Interview with Bryan O'Connor - ASK Magazine
Challenger engineer who warned of shuttle disaster dies - NPR
Myth: If you can’t measure it, you can’t manage it - The Deming Institute
The Economist’s data editor on data fetishism - What’s The Big Data?
In God we trust, all others bring data - Quote Investigator