
Start Making Sense: Sense About Science Coming to US

September 11, 2014


“The Puzzled Voter” by Thomas Hovenden

Sense About Science, the 10-year-old UK-based charitable trust focused on helping “people to make sense of the evidence behind issues that matter in today’s society,” is coming to America.

While that’s not exactly news (an announcement was made at the last American Association for the Advancement of Science annual meeting, for example), Sense About Science USA’s Trevor Butterworth put meat on those bones during an hour-long talk and webinar Tuesday with the Association of American Publishers. Butterworth, an Irish journalist with long experience both reporting from America and exposing innumeracy in media stories about science, walked through some of the projects Sense About Science USA will present when it’s up and running “in coming months.”

The basis of Sense About Science, which is a partner of Social Science Space, is that many among the public, the media and the policy crowd don’t really understand the numbers they come across. And yet the decisions they base on research data, whether made at the breakfast table, in the boardroom or in the legislative chamber, are inestimably important and should rest on the best evidence, and the best understanding of that evidence, available.

Given his background, Butterworth focused his remarks Tuesday mostly on the sad history of media missteps and how Sense (and allied organizations) can hope to correct future ones.

“The trouble with numbers in journalism is actually quite old,” he explained, citing research from 1974, “when the popular press actually had dedicated science journalists.” In that research by James Tankard Jr. and Michael Ryan, the scientists surveyed showed great apprehension about the accuracy of newspaper science reporting: results suggested 91 percent of science stories contained some sort of error. Journalists might not have fully agreed with that assessment, since the problems included not providing detail on methodology, something journalists likely wouldn’t see as an error. Nonetheless, the innumeracy of journalists is legendary, and not entirely unearned; irrefutable mistakes or misreadings do occur. In addition, what might seem like minor, excludable details to a writer, say about sample size or dosage, may be critical to understanding the true impact of new research.

Then there’s the difference between literal and contextual accuracy: the individual parts of a story often are correct, yet the sum of them when strung together may be incorrect or misleading.

In that sense, much “wrong” journalism isn’t wrong based on a literal reading of words in a paper or quotations from a researcher: those words were indeed penned or uttered. But by not placing a study or a scientist in the context of what has gone before, and by omitting the caveats that likely surrounded the “accurate” quote, the resulting story fails to convey the impact, and by extension the best policy options, accurately.

Butterworth has been wrestling with these issues in his other role as editor-at-large for the Statistical Assessment Service (STATS), a 20-year-old non-profit project currently affiliated with Virginia’s George Mason University. STATS “advocate[s for] scientific and statistical methods as the best way of analyzing and solving society’s problems,” and offers resources, including vetting stories, to journalists who feel out of their league when reporting on numbers. In general, Butterworth summarized, STATS finds a news media that often can’t separate causation from correlation, places too much emphasis on a single study or researcher, and “is incapable of handling complex data.” Wikipedia, he argued, does a better job of explaining science accurately than does the mainstream press. (At times STATS takes its own lumps, sometimes for its analysis and sometimes for its funding.)

The media isn’t the only issue, he continued; “the problem is not just that journalists were getting it wrong, but that scientists were getting it wrong.” He cited the work of Stanford’s prolific John P. A. Ioannidis, starting with the seminal 2005 paper “Why Most Published Research Findings Are False,” which questioned the reliability of what should be trustworthy original sources for an informed public. At least those findings flowed from peer-reviewed work. Butterworth decried the phenomenon of ‘pay-to-play’ journals, which do not offer peer-reviewed science and yet might seem legitimate to both young scholars and the public.

For these reasons, Sense has made explaining the essential nature of peer review one of its key messages, and will continue to address that theme for both the academy and the public. And Sense offers other tools for improving understanding of research and reporting on it.

Butterworth noted two projects that share DNA with Sense About Science: the newly begun Meta-Research Innovation Center (METRICS) at Stanford, which hopes “to improve the reproducibility, efficiency and quality of scientific investigations,” and the Ask for Evidence (A4E) project at Emerson College.

The Emerson campaign asks students to evaluate evidence or scientific claims made by others, such as product marketers or politicians, and then report their own findings. The young scholars are urged to ask A4E or Sense for help in evaluating the science or the numbers. This campaign, and a broader project that will bring in Sense About Science USA, is modeled on the British Sense About Science A4E effort.

A second Sense About Science USA project will be the AllTrials campaign, also an import from the UK, which asks pharmaceutical companies to register their clinical trials and then reveal the underlying data those trials generate. This, said Butterworth, will commit the drug companies to transparency and build trust with the public.

A third effort will see the Sense team look at and evaluate science stories in the media in as close to real time as possible. Since even that is still reactive, Butterworth said, Sense is working to set up seminars that help journalists learn how to tackle numbers- and science-based reporting, and to offer a venue where writers can have unpublished stories vetted or analyzed on the fly. “Not everyone can be Nate Silver,” he said, referencing the famed number-savvy pundit, but everyone can use an expert’s help now and then.

