From a recent email I received (I hope it wasn’t you who sent this to me…):
We just launched a survey to understand the impact of Sales & Marketing data quality. Specifically, what is the relationship between “dirty” data and conversion rates? Preliminary results are indicating the impact of cleaning up dirty data is significant, potentially boosting revenue goal achievement by 10% or more.
The email then asked me to fill in a survey whose results they had already told me! How is that going to yield unbiased results? Of course it isn't.
And it’s just another example of some of the terrible data that’s being gathered, promoted, and sold.
(To be fair, they didn’t use the Beatles Venn diagram…)
No question, we are in an era where data is expected – and even fetishized. The problem is that this seems to be creating a data bubble, where quality is plummeting. “Surveys” are trumpeted where there’s no serious attempt to make sure the population in question is accurately sampled. Graphs are mislabeled. And then they’re turned into infographics so we can all squint at them without learning anything.
And if people are making decisions based on that “data,” we’re all in big trouble.
So, my three obnoxiously-worded suggestions to survive the data mania with your reputation – and your organization – intact:
Don’t conduct bad research
Example: a company I would otherwise have had a lot more respect for has been publishing research that’s just plain awful. They’re conducting it online, with no regard for obtaining a statistically valid sample. It’s literally just a URL on a website, with no screening and no validation that the people filling it out are who they say they are (pro tip: lots of people in the survey business fill out surveys just to see how they’re constructed – I do it myself, though I always abandon the survey before completing it; does everyone, though?). And they’re publishing the results as if they mean something.
I don’t know these folks except by reputation, but still I’ve brought this to their attention on two occasions. I do hope they hear the same thing from people they actually know, and that they listen. I see what they’re trying to do in terms of marketing, and maybe it’s working for them in the short term, but I think it’s ultimately going to harm their brand, which relies on trust. All anyone has to do is look beyond their headline to see the data doesn’t actually say what they say it says. That can’t be good.
Don’t commission bad research
Survey tools are free to use, but they can be really expensive if you find you’ve gone out to the people who are important to you, asked them to take the time to tell you what they think… and then realized you asked the wrong questions. It’s even worse if you’ve paid someone for the survey. When you’re thinking about research, first think about the questions you’re asking. What do you really want to know? Involve the end users of the research in the process – but make sure the people actually conducting the research apply best practices (surveys by committee won’t get you what you need).
Often, getting the right expertise (and robust results) means bringing in a third party – but that individual or company should be able to explain, at every step, how the research they conduct aligns with your research objectives. You may need to invest more than you’d expected to really find out what you need to know, and you may need to be open to multiple techniques. But that’s better than burning your list to get unusable data.
Which brings me to…
Don’t use bad research
If you have research but you are concerned about its quality, throw it out. We’re all very impressionable, especially if we see visual elements (such as a graph). This kind of data becomes accepted wisdom in an organization shockingly quickly.
It’s a bitter pill to swallow, because you have to admit you’ve wasted your time and often money, but, if you have research that’s skewed, biased, or incomplete, just put it away. Don’t quote it, don’t show it to people, don’t let it guide your decisions. If you can glean one or two key points, maybe, but otherwise, it belongs deep in the dungeons of your filing system.
It’s important for decisions to be reality-based, but it’s even more important not to pretend we know what the reality is when we don’t.
We’re clearly in an era where the drive for evidence-based decision-making is intense – but we need to make sure quality standards are very high. There’s a difference between having inadequate data (a constant) and bad data (avoidable).
If you’d like to discuss how our research services can help you make better decisions, please get in touch.
Image by Duncan Hull via Flickr, under a Creative Commons license.
Great article, Meredith. I think there is definitely an education issue for management as well as other users. A lot of management executives stumble on some of these infographics, white papers, or other report formats and want to act on decisions based on a metric they saw there – something that may be a complete wash, given they don’t understand what the research was based on, who was sampled and how, what biases exist, etc. Ultimately, a lot of research is published for the marketing purposes of the research owners, and they will use the data to show what they want in a good light. And we’ve talked about this before at length – management needs to work with researchers who understand how to interpret their objectives and questions into research that will enable objective, unbiased answers to feed the right decision-making… i.e., give them the answers they need to hear rather than those they want to hear.
Thanks for the comment! Some of it comes down to translation – of management questions into research objectives and methodologies, and then from the research back to the management questions, both the ones originally asked and new ones that emerge. It’s not a common skill, and not something most managers get a chance to practice very much before they are called upon to make significant decisions.