
Start Making Sense: Sense About Science Coming to US

September 11, 2014


“The Puzzled Voter” by Thomas Hovenden

Sense About Science, the 10-year-old UK-based charitable trust focused on helping “people to make sense of the evidence behind issues that matter in today’s society,” is coming to America.

That’s not exactly news (an announcement was made at the last American Association for the Advancement of Science annual meeting, for example), but Sense About Science USA’s Trevor Butterworth put meat on those bones during an hour-long talk and webinar Tuesday with the Association of American Publishers. Butterworth, an Irish journalist with long experience both reporting from America and exposing innumeracy in media stories about science, walked through some of the projects Sense About Science USA will present when it’s up and running “in coming months.”

The premise of Sense About Science – which is a partner of Social Science Space – is that many among the public, the media, and the policy crowd don’t really understand the numbers they come across. And yet their decisions based on research data, whether made at the breakfast table, in the boardroom, or in the legislative chamber, are inestimably important and should rest on the best evidence, and the best understanding of that evidence, available.

Given his background, Butterworth focused his remarks Tuesday mostly on the sad history of media missteps and how Sense (and allied organizations) can hope to correct future ones.

“The trouble with numbers in journalism is actually quite old,” he explained, citing research from 1974, “when the popular press actually had dedicated science journalists.” In that study by James Tankard Jr. and Michael Ryan, the scientists surveyed showed great apprehension about the accuracy of newspaper science reporting: the results suggested 91 percent of science stories contained some sort of error. Journalists might not have fully agreed with that assessment, since the catalog of problems included failing to provide detail on methodology, something journalists likely wouldn’t see as an error. Nonetheless, the innumeracy of journalists is legendary, and not entirely unearned; irrefutable mistakes and misreadings do occur. In addition, what might seem like minor, excludable details to a writer, say about sample size or dosage, may be critical to understanding the true impact of new research.

Then there’s the difference between literal and contextual accuracy: the individual parts of a story often are correct, yet the sum of them, when strung together, may be incorrect or misleading.

In that sense, much “wrong” journalism isn’t wrong based on a literal reading of the words in a paper or the quotations from a researcher: those words were indeed penned or uttered. But by not placing a study or scientist in the context of what has gone before, and by omitting the caveats that likely surrounded the “accurate” quote, the resulting story fails to convey the impact, and by extension the best policy options, accurately.

Butterworth has been wrestling with these issues in his other role as editor-at-large for the Statistical Assessment Service (STATS), a 20-year-old nonprofit project currently affiliated with Virginia’s George Mason University. STATS “advocate[s for] scientific and statistical methods as the best way of analyzing and solving society’s problems,” and offers resources, including vetting stories, to journalists who feel out of their league when reporting on numbers. In general, Butterworth summarized, STATS finds a media that often can’t separate causation from correlation, places too much emphasis on a single study or researcher, and “is incapable of handling complex data.” Wikipedia, he argued, does a better job of explaining science accurately than does the mainstream press. (At times STATS takes its own lumps, sometimes for its analysis and sometimes for its funding.)

The media isn’t the only issue, he continued; “the problem is not just that journalists were getting it wrong, but that scientists were getting it wrong.” He cited the work of Stanford’s prolific John P. A. Ioannidis, starting with the seminal 2005 paper “Why Most Published Research Findings Are False,” on the reliability of what should be trustworthy original sources for an informed public. At least those findings flowed from peer-reviewed work; Butterworth also decried the phenomenon of ‘pay-to-play’ journals that do not offer peer-reviewed science and yet might seem legitimate to both young scholars and the public.

For these reasons, Sense has made explaining the essential nature of peer review one of its key messages, and will continue to address that theme for both the academy and the public. And Sense offers other tools for improving understanding of research and reporting on it.

Butterworth noted two projects that share DNA with Sense About Science: the newly begun Meta-Research Innovation Center (METRICS) at Stanford, which hopes “to improve the reproducibility, efficiency and quality of scientific investigations”; and the Ask for Evidence (A4E) project at Emerson College.

The Emerson campaign asks students to evaluate evidence or scientific claims made by others, such as product marketers or politicians, and then report their own findings. The young scholars are urged to ask A4E or Sense for help in evaluating the science or the numbers. This campaign, and a broader project that will bring in Sense About Science USA, is modeled on the British Sense About Science A4E effort.

A second Sense About Science USA project will be the AllTrials campaign, also an import from the UK, which asks pharmaceutical companies to register their clinical trials and then reveal the underlying data those trials generate. This, said Butterworth, will commit the drug companies to transparency and build trust with the public.

A third effort will see the Sense team look at and evaluate science stories in the media in as close to real time as possible. Since even that is still reactive, Butterworth said, Sense is working to set up seminars that help journalists learn how to tackle numbers- and science-based reporting, and to give those writers a venue to vet or analyze unpublished stories on the fly. “Not everyone can be Nate Silver,” he said, referencing the famed number-savvy pundit, but everyone can use an expert’s help now and then.


