Reporting on Retractions

A hand erasing a mathematical formula off a blackboard.
Tero Vesalainen/iStock


When Catherine Offord began reporting on hydroxychloroquine, an antimalarial drug that was touted as a potential cure for COVID-19 during the early months of the pandemic, she says, “the idea was just to summarize the news.” In May 2020, after U.S. president Donald Trump’s controversial endorsement of hydroxychloroquine, a study in The Lancet reported a higher risk of death among hospitalized COVID-19 patients taking the drug. Offord, a senior editor at The Scientist who reports on scientific integrity, among other topics, contacted the authors of the paper, which included the founder of Surgisphere Corporation, a data-analytics firm in Illinois.

Soon after the Lancet paper’s publication, concerns emerged on social media and on PubPeer, a public forum where scientists comment on published research, about the statistical analysis and the integrity of a dataset that Surgisphere had provided, Offord says. When Offord emailed the Surgisphere founder with these questions, she found his response too vague. “I asked him quite specific questions and he replied with this very odd PR-style spiel, not addressing the questions I had asked at all,” she says.

This experience prompted her to take a closer look at Surgisphere and its operations, resulting in a series of stories at The Scientist. Following Offord’s work and reporting by other journalists, the controversial paper was soon retracted, along with other high-profile studies. The Lancet study, Offord later wrote, was a “medical and political bombshell,” and its publication and subsequent retraction had an outsized impact beyond the realm of scientific publishing.

Retractions like the one Offord covered aren’t a new or rare phenomenon, and they have become more frequent, says Ivan Oransky, co-founder of Retraction Watch, which reports on retractions of scientific papers. “In the year 2000, there were about 40 retractions,” he says. Last year, according to Retraction Watch’s records, there were more than 3,500.

When editors at scientific journals discover a serious error in published research, they retract the study. But the retraction process can take a long time, and even when papers are retracted, the reasons for the withdrawal aren’t always spelled out, says Oransky. Other times, mistakes or shoddy work slip through peer review and persist unnoticed. That is where journalists have stepped in.

“When the scientific community is engaging in work that is fraudulent or improper and is just not self-correcting, it’s up to us as journalists to take on some of the responsibility to police the field,” says Charles Piller, an investigative journalist at Science, who covers violations in scientific and clinical research.

However, reporting on such misconduct, and retractions in general, can be challenging: Stories can be difficult to spot, sources can be reluctant to speak, and assembling the evidence is time-consuming. But there are ways around these hurdles. And for some reporters, bringing such transparency to science and publishing is worth the extra work. “I would hope that that leads to greater trust in science journalism overall,” says Offord.


Finding Patterns and Spotting Stories

Press releases announcing retractions or misconduct rarely arrive as leads in a reporter’s inbox. So, finding these stories can take some work.

One valuable strategy is to get the word out that you’re on the hunt for retraction stories. A short news story about a recent retraction can signal to sources that a reporter and publication are keen to follow that case. “These stories are really signposts that say, ‘I noticed this, I’m a journalist, I’m on it,’” says Richard Van Noorden, a features editor at Nature who has covered trends in scholarly publishing.

For example, when Offord published her article about the Lancet retraction quoting Surgisphere’s founder, several people who knew the company executive read the piece and contacted her, serving as sources for her next piece. “As soon as we published that story, including his name, people from his past got in touch with us,” she says.

Keeping your ear to the ground on social media is just as valuable. Dalmeet Singh Chawla, a freelance science journalist based in the U.K. who covers scholarly publishing, among other topics, follows a vocal community of scientists who are active on Twitter and have built a reputation for spotting errors—image duplications, statistical problems, problems with ethical approval in studies—and sharing them. Tracking those conversations can be a good jumping-off point for finding stories, Chawla says.

These scientist-sleuths, some of whom operate under pseudonyms, often know about recent retractions and have ideas for interesting cases that a journalist can dig into, says Van Noorden. So building a rapport and having an open-ended conversation with them could be useful.

Sayantan Datta, a Hyderabad-based science journalist who tracked a high-profile retraction involving an elite scientific institute in India last year, says they send papers they want to verify to experts such as Elisabeth Bik, a science integrity consultant based in California, as well as to pseudonymous sleuths such as Cheshire on Twitter or Actinopolyspora Biskrensis, who is active on PubPeer.

Conferences and meetings such as the World Conferences on Research Integrity and the European Conference on Academic Integrity and Plagiarism can also be ideal places for developing sources, says Van Noorden. Chatter on public forums such as PubPeer can also be revealing. It can be especially helpful to keep track of which authors or papers keep coming under scrutiny, says Offord.

Similarly, websites that collate published scientific research (such as PubMed, Dimensions, Scopus, Web of Science, Google Scholar, OpenAlex, and Lens) can be good places to find broader context beyond one or two interesting retractions, allowing a reporter to track the work of an individual scientist, their collaborators, or their citations across multiple papers. Most of these databases are free to use and those that are behind subscription paywalls, such as Scopus and Web of Science, allow free access for journalists, Van Noorden notes. Notably, Retraction Watch hosts a searchable database of retracted scientific papers that contains several times as many retractions as other general research databases, Oransky says, and can help reporters track retractions from particular authors, subjects, and universities. By browsing such databases, Van Noorden says, “You can start to see those kinds of patterns, which can lead you to something bigger than the one or two things that you’ve already seen.”


Wrangling Sources: Trust but Verify

Scientists are often enthusiastic to talk about their own published studies or comment on their colleagues’ work. But they can be more hesitant to be named or quoted in more contentious stories.

To reach scientists who know the backstory behind a particular scientist or a retraction, Offord sometimes tells them about her objectives as a reporter, including why their information is important to the public. “A lot of it is building trust with people,” she says. Occasionally, she speaks to such sources on background, with the understanding that the information they provide won’t be directly attributed to them.

When information comes in from tipsters, it’s worth keeping in mind that those sources may have their own motivations for talking to a reporter, says Piller. So it’s necessary to vet such information independently. As Van Noorden says, “It’s trust, but verify.”

Under rare circumstances, it might be important to withhold the identity of a source who is providing essential but sensitive information. If speaking with a reporter puts their personal safety or employment in harm’s way or puts them in legal jeopardy, these “can be good reasons to anonymize identity,” says Van Noorden in an email. And while quoting or paraphrasing such sources, the reporter should mention why they are being anonymized without giving away details about their affiliation or other sensitive details, he says.

Sources whose work is being investigated should be given multiple opportunities to provide their side of the story. It’s a key principle of fair reporting, says Piller, and it gives such sources a chance to correct any facts that the reporting has brought up. “What’s important is to make it clear to them that your story is going to appear, and their choosing not to talk to you is not going to stop your story from appearing,” Van Noorden says.


Facts and Fairness

Because retraction stories and investigations generally involve multiple modes of verifying information, it helps to be meticulous about recording and organizing information. “I would definitely advise that [anyone reporting on retractions] get very, very good with the record-keeping,” says Datta.

In February 2022, Piller published an investigation involving a Canadian botanist whose work was coming under new scrutiny. In 2013, the botanist had published a high-profile study suggesting that some herbal products sold in the North American market did not contain all the ingredients listed on their labels and were laced with contaminants. Eight years later, in June 2021, a group of researchers sent a 43-page letter to the University of Guelph, the botanist’s employer, stating that the data in that key work and two others were “missing, fraudulent, or plagiarized.”

The botanist denied all charges to the university, Piller reported, and the journal announced it would review the 2013 study.

Piller obtained a copy of the complaint letter and got to work. He reviewed thousands of pages of the botanist’s papers, public speeches, and slide decks, and documented problems that went beyond those papers.

For example, Piller reported that the Canadian botanist had plagiarized his lecture notes. To establish this, he obtained the botanist’s lecture notes through a commercial online service where students share their notes, then showed those notes to other faculty at the university, who confirmed that they belonged to their colleague. Piller then ran the notes through a plagiarism-detection service and found that they were derived from Wikipedia and other sources.

Verifying the documents through multiple sources was essential, says Piller, “both to the fairness of the exploration of that claim, and also to validating its authenticity.”

As part of doing their due diligence, reporters should contact the source and provide them an opportunity to comment in as fair and as direct a way as possible before a piece runs, says Van Noorden. Standards at most publications require that a reporter make every effort to allow the target of an investigation a fair chance to present their side of the story.

This can take the form of a “no-surprises” email that a reporter sends close to their publication deadline, stating all the reported facts that are relevant to the source as well as a list of questions that the source has not answered, and giving them a final opportunity to respond.

Despite the challenges and time involved in reporting on retractions or scientific misconduct, they hold potential for huge impact. For example, one of the stories in the Surgisphere series that Offord reported was cited in a joint World Health Organization/Pan American Health Organization guidance on ivermectin use, another unproven drug against COVID-19. In making retractions public knowledge, Offord hopes that readers come away with a more complete view of science, an enterprise that is led by scientists, “who are people, some of whom may have flaws.”



Pratik Pawar is an independent science journalist who writes about global health, ecology, and science policy. A TON early-career fellow supported by the Burroughs Wellcome Fund, he lives in Bangalore, India. Pratik’s work has appeared in Discover, Science News, The Wire, and Undark, among other publications. Follow him on Twitter @pratikmpawar.
