When you’re in the early stages of site selection or roughing out an education program, reliable data is key. If you don’t consider the source, methodology, and timeliness of the information you’re using, you could make a decision that causes you headaches down the road.
This problem is by no means unique to the planning community. Convene recently talked to John Wihbey of Journalist’s Resource, a media and research project run by Harvard’s Shorenstein Center on Media, Politics, and Public Policy, about how the tactics journalists use to cut through the noise of academic papers and report the truth can be applied within the meetings industry.
“It seems every month or so there is a scandal or controversy over the validity of a study,” said Wihbey, who recently co-authored, with Justin Feldman, a doctoral student at the Harvard School of Public Health, a white paper titled “Eight questions to ask when interpreting academic studies: A primer for media.” “There is a lot of hand-wringing and lament that journalists generated big headlines over something that turned out to be dubious, and were complicit in the whole thing.”
The white paper offers common-sense guidelines for anyone who must judge how reliable data is. Wihbey and Feldman recommend paying close attention to a study’s design, because certain models are better than others at determining causation. Systematic reviews or meta-analyses of ongoing research in a field, for example, are more valuable than one-off analyses of several variables within a small population, or opt-in survey results.
“Self-reported surveys are not as scientific as randomized control trials,” Wihbey said. “Even if you randomly select the people you approach, there is always a pretty decent chance that people who choose to respond fully are more motivated or interested in some way than the people who choose not to.”
It’s unlikely that any single new study will be a game-changer, so it’s important to contextualize an academic paper’s hypothesis with conclusions from one or two similar studies. “We don’t mean to dispute the model of scientific and academic research, even if we think it can be usefully challenged on a case-by-case basis,” Wihbey said. “We just want journalists to be aware of what’s sometimes called ‘single-study syndrome’ and to report with subtlety and context. Despite all of the questions about certain academic studies, they are still — taken as a whole — built on a more reliable model than almost any other model.”
Bottom line: Don’t settle for glosses, takeaways, or listicles. Whether you’re perusing a paper on how different attendee-learning models stack up, or reading a study on green practices in the hospitality industry, you need to look critically at what type of data is being used and how it was collected before you can decide how relevant it is to your business practices — or book one of the authors as a speaker.