Americans have a love-hate relationship with data. In our businesses and in our personal lives, data flows through everything we do: It informs decisions big and small, it mitigates our sense of risk, it lends us competitive advantage, and it helps us to know ourselves and our world better. But a tension is emerging, stemming from a distrust of technology, of expertise, of science itself. A recent survey showed that more people trust their hairdresser to tell the truth than an economist. Clearly, something is afoot.
My expert panel and I discussed our cultural relationship with data and identified seven themes that highlight this tension.
1) Data lacks a shared definition
VML’s Mark Donatelli was quick to point out: “Data and analytics are like saying ‘food and drink’…data is everything from a customer satisfaction survey, to the weather last Tuesday, to a stock price.” Data come in many flavors, and, like any good recipe, you have to know which ingredients to mix to yield the right outcome.
2) Over-reliance on quantitative data
Yet for many, “data” is equivalent to numerical information—statistics and mathematical models. Quantitative data have an enticing psychological allure. That’s because, as the Association of National Advertisers’ Kiran Goojha explains, “No one looks at [quant data] with a grain of salt, they look at it as absolute. [When] it’s presented numerically, it feels like fact and people don’t realize there’s interpretation there.” In part, this may be due to the intimidation many Americans feel when working with numbers. According to the 2015 Program for International Student Assessment (PISA), American students rank just two places above Kazakhstan in math performance.
The result is that, when faced with numbers, we outsource our critical analysis to others. As Kiran explains, “The layman is not going to pull his own data, so he relies on organizations to do it for them and then they also attribute a whole lot more weight to the data than they should.” C-suite executives not well versed in data analytics can fall into the same trap, as Siegel + Gale’s Brian Rafferty suggests: “It’s hard to know when you’re seeing something that’s right versus seeing something that sounds right. How do you really evaluate that?”
3) New and timeless sources of insight are underutilized
Beyond the lack of critical analysis, Rosetta’s Ed Falconer explains, there can be an over-reliance on traditional methodologies. He cites the recent U.S. election as a perfect illustration of this challenge: “There were a number of commentators who used big data—social data and otherwise—and they were predicting a totally different outcome than the more traditional research approaches.” Whereas traditional polls ask people to self-report in a way many may feel uncomfortable with, big data, Ed explains, “recognizes signals and signs of intent.”
But there is yet another source of data that is often under-leveraged: actual humans. As Brian explains, you often can get more from “talking to people, watching people than you can get from looking at a spreadsheet.” The more you study humans and how the real world works, he suggests, the better able you’ll be at assessing the validity of quantitative data, because you can ask yourself “does this reflect how things work in the real world?”
4) “You get served up your future…instead of you being the one who’s designing it.”
Some of the excitement with data analytics—especially in the widespread adoption of artificial intelligence—is based on its capacity to predict future behavior. “We’ve gotten good at predicting behavior and we’ll get better as more data becomes available,” Ed explains. “But,” he cautions, “predicting behavior depends on the audience acting the way they have in the past.” That rear-view focus may cause an analyst to miss critical outliers that could signal larger cultural shifts, as we saw with the 2016 election.
At a cultural level, experiences designed to match past behavior and preferences might stifle exploration, discovery, and diverse points of view. Brian notes we may already be seeing that emerge with personalized news feeds and content: People “seem to be entrenching into singular, non-evolving points of view…there’s [the risk of] de-individualization of people, because you’re always being treated as a predictable entity.”
Ed sees a new opportunity: Instead of merely leveraging data to predict behavior, use data to react to individual behavior—in context and in the moment. Improvements in what Ed calls “reactive data” will make customized interactions possible based on the present, not on the past.
5) “Analytics is often misleadingly persuasive.”
As Chuck Klosterman notes in his book But What if We’re Wrong?: “Numbers have no agenda, no sense of history, and no sense of humor.” But humans do.
When humans compute, code, or build an algorithm, they are building it with human assumptions. Whether talking about artificial intelligence or an analytics report, Mark explains, “In the end, [the output] takes on the POV of its creator.” Ed echoes the sentiment: “Analytics is often misleadingly persuasive. It helps to frame things as fact, but it’s actually very subjective. And the use of this framing can be purposeful in order to manipulate…it’s a terrible use of data.”
Mark suggests reframing analytical reports as editorial: “I often use the word 'opinion'—a report is someone’s opinion of the data. Someone looked at the data and made a statement.” By remembering the editorial nature of data analysis, the consumer will be more apt to apply a critical lens.
6) “If you torture data sufficiently then it will confess to almost anything.”
Editorializing, of course, often comes with an agenda. Whether in the realm of marketing, politics, or even science, Wire Stone's Jon Baker says, “[Humans] take data and squash it to support their theory….If the data says it’s a failure, are they going to report that? No, they’re going to squash the data and reinterpret it to support what they want to say. The relationship between data and humans is so inextricable.” This tendency, conscious or not, to skew what appears to be objective data to fit our narrative is one of the central reasons trust in data analysis will continue to be debated into the future.
At a cultural level, the ability to manipulate data to reinforce our beliefs can be disastrous. Chuck Klosterman notes, “We’re starting to behave as if we’ve reached the end of human knowledge…the sensation of certitude it generates is paralyzing.” Rather than be over-confident in our own certainty, he says, “we must start from the premise that—in all likelihood—we are already wrong. And not ‘wrong’ in the sense that we are examining questions and coming to incorrect conclusions, because most of our conclusions are reasoned and coherent. The problem is with the questions themselves.”
As data becomes easier to access, analyze, and apply, it will be critical that data scientists and enthusiasts continue to ask themselves fundamental questions that challenge their own presuppositions.
7) Scrutinizing the source
Asking questions of reported data is a skill not everyone applies in equal measure. Mark recalls, “Many decades ago an expert could spout a statistic and no one would question it. Now, people demand to know the source, how it was calculated, what were the contributing sources.” Yet the proliferation of, and accusations of, “fake news” highlight that this kind of scrutiny is ever more urgent. Mark suggests asking two key questions of any presented statistic: “‘What is the source?’ and ‘Compared to what?’ That will help you to understand what data is making up this opinion.”
While these tensions between the promise and pitfalls of data continue to unfold in our larger culture, business leaders face their own set of challenges. Read on to Part II to discover what actions business leaders can take in 2017 to create an effective, data-driven enterprise.
Click here for Part III on the challenges and opportunities for job seekers in data analytics.