Treasure Hunt: How to Find and Vet Journal Articles




A few weeks into his first science-writing internship, Joshua Sokol hit the science-journalism jackpot: a newly published physics paper with a fun and surprising claim. Sensing the paper’s potential, Sokol set to work on a short news story about it. To get outside comment, he called a source in the field who was skeptical of the paper’s claims. After requesting to go off the record, the source gave Sokol some background on the journal in which the paper was published: It wasn’t high quality, she said, and she listed other, more highly regarded journals in her subfield. She said she had frequently wondered how the journal’s peer review process worked, because less rigorous papers often ended up published there. “Her opinions of the research itself, and the context about the journal, convinced me to go back to my editor and say it’s no good,” Sokol says.

As Sokol’s source suggested, not all academic journals are created equal. Some titles, such as Science, Nature, and JAMA, are well known to scientists and journalists alike. But with tens of thousands of journals out there, journalists are bound to come across titles they’re unfamiliar with. In many cases, those journals contain hidden treasure: rigorous and interesting papers unlikely to receive the same coverage as those published in higher-profile journals. Veteran reporters and editors have developed effective strategies for finding those journals, keeping up with new papers, and accessing articles’ full text. Sometimes, as Sokol found, it can be tricky to determine the credibility of an academic journal. But certain metrics and good old-fashioned reporting can help point a journalist toward reputable research—and great stories.


Digging for Gold: Getting Started

With so many journals out there, it can be hard to know where to begin. For starters, journalists can sign up to receive emails from journals they know and trust. Some journals, especially larger ones like the Proceedings of the National Academy of Sciences and Nature Communications, send weekly or daily emails with details on new papers. You can also sign up for AAAS’s EurekAlert! service, which sends emails announcing embargoed papers, as well as a curated daily list of newly published papers.

But wading through those messages can be overwhelming, especially if you subscribe to lists that send notifications daily. (I currently have close to 4,000 such emails sitting unread in my inbox.) Plus, many journalists are seeing the same content, which means you might have competition in pitching stories. Finding papers published in smaller journals—and not sent to thousands of other journalists—can yield gems.

To find publications off the beaten path, start with keywords or topics that interest you. If you’ve reported related stories, go back to any papers you used in your reporting; the journals they’re published in could be a great starting point. You can also search Google Scholar for keywords based on your beat or coverage area, and then sign up to receive alerts. Luisa Massarani, a science-communication researcher who coordinates SciDev.Net’s Latin America edition, says her keywords include “disease,” “biodiversity,” and the names of Latin American countries she covers.


Sources can also help you get oriented. Ivan Oransky, co-founder of Retraction Watch and editor-in-chief of Spectrum, recommends asking researchers where they publish their work and which journals they keep tabs on. If you’re new to a beat and don’t have established sources, try lurking on Twitter. Shreya Dasgupta, an independent science journalist based in Bangalore, India, recommends checking the timelines of researchers you follow to see what publications they’re talking about. “You’ll see some chatter, and that’s how you come across some papers,” Dasgupta says. Those papers could lead you toward journals those researchers read and submit their work to.

Once you’ve found some enticing-looking journals, think about how you will keep tabs on them. Journals vary widely in how easy they are to follow; while many allow anyone to sign up to receive tables of contents or individual articles by email, others do not. In those cases, check to see whether the journal has an RSS feed. If so, you can use a service like IFTTT (short for “If This Then That”) to set up automatic email alerts for new articles. If not, you may consider using services like Visual Ping to alert you when the website has been updated. Or, if you’re more the analog type, you can always set up a recurring calendar reminder to check the journal’s website regularly.
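If you want to roll your own feed-watcher rather than rely on IFTTT, the core task is small: parse the journal's RSS feed and flag entries you haven't seen before. Here's a minimal sketch in Python using only the standard library; the feed contents and URLs below are made up for illustration (a real feed URL would come from the journal's website).

```python
import xml.etree.ElementTree as ET

def new_articles(rss_xml, seen_links):
    """Parse an RSS 2.0 feed and return (title, link) pairs not yet seen."""
    root = ET.fromstring(rss_xml)
    fresh = []
    for item in root.iter("item"):  # each <item> is one article
        title = item.findtext("title", default="")
        link = item.findtext("link", default="")
        if link and link not in seen_links:
            fresh.append((title, link))
    return fresh

# Hypothetical feed with one already-seen article and one new one.
feed = """<rss version="2.0"><channel>
  <item><title>Old paper</title><link>https://example.org/old</link></item>
  <item><title>New paper</title><link>https://example.org/new</link></item>
</channel></rss>"""

print(new_articles(feed, {"https://example.org/old"}))
# → [('New paper', 'https://example.org/new')]
```

In practice you'd fetch the feed on a schedule (say, a daily cron job), persist the set of seen links between runs, and email yourself anything `new_articles` returns.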


Appraising Your Finds

Once you’ve found journals you’d like to follow, a few quick-and-dirty checks can help you confirm that a journal is legitimate. Dalmeet Singh Chawla, a freelance journalist based in London, says the first thing he checks is who publishes the journal in question. Well-known scholarly publishers such as Elsevier, Taylor and Francis, Sage, Springer Nature, Wiley, and PLOS maintain huge numbers of journals, all of which are required to follow a code of ethics and undergo peer review, in which other scientists read and vet the work before publication.

That may sound like a low bar to clear, but within the academic community, there are now thousands of predatory journals—outfits that charge researchers fees to publish their work with scant peer review, or none at all. While these journals used to be obvious scams, with poorly designed websites and nonsensical papers, “predatory journals and publishers are upping their game and starting to look more legitimate,” says Chawla.


If he doesn’t recognize the publisher, Chawla says, he consults established lists of predatory journals. Perhaps the best known is Beall’s List; creator Jeffrey Beall shut down his blog listing suspected predatory journals in 2017, but archived versions still exist online. Beall’s List—and others like it—are controversial, because researchers don’t all agree on the policies that define a predatory journal, and many journals’ policies fall in a gray zone, making them hard to characterize. Still, journalists might consider these lists a starting point for their investigations. Cabells also maintains its own databases of deceptive and reputable journals, Predatory Reports and Journalytics; these are accessible only through a paid subscription, at a price point typically feasible only for university libraries.

You may also check whether the journal in question is included in major indexing services like PubMed, MEDLINE, Web of Science, or Scopus. Journals must meet some baseline standards to be included in these databases, like having a peer review process and copyright policies in place. “If [the journal] is there, it doesn’t prove it’s a great journal—but if it’s not there, I would wonder why it isn’t,” says Oransky.

Once you’re reasonably sure the journal isn’t predatory, you might be interested in its relative quality: Is it well-regarded in the scientific community, or an obscure title? Freelancers, in particular, are often encouraged to look for stories by reading smaller journals, since staffers are likely to cover the latest splashy paper published in Science or Nature, but finding journals that are less popular but still high quality can be tricky.

One hotly debated metric, the impact factor, can give you a rough estimate of how popular or well known a journal is. You might see this figure posted on a journal’s homepage; it is reported by a company called Clarivate, which calculates it by dividing the number of times a journal’s papers from the previous two years were cited in a given year by the number of papers the journal published over those two years. Popular general-science journals tend to have high impact factors; for instance, the New England Journal of Medicine’s 2020 impact factor was 74.69, meaning its recent articles were each cited an average of around 75 times that year. There’s a huge range in these figures: Of the 15,000 journals Clarivate scored in 2020, more than 60 percent were assigned an impact factor of 2 or less, while just 2 percent received an impact factor above 10. That means if you’ve found a journal with an impact factor of 3 or higher, it’s likely to be at least moderately influential within its particular subfield.
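The arithmetic behind the impact factor is simple division; this sketch makes it concrete with made-up numbers (the counts below are hypothetical, not drawn from Clarivate's data):

```python
def impact_factor(citations_to_prior_two_years, papers_in_prior_two_years):
    """Impact factor for a given year: citations received that year to the
    journal's papers from the previous two years, divided by the number of
    papers it published over those two years."""
    return citations_to_prior_two_years / papers_in_prior_two_years

# Hypothetical journal: 300 papers over two years, cited 900 times this year.
print(round(impact_factor(900, 300), 2))  # → 3.0
```

Note what the number hides: a handful of heavily cited papers can lift the average for an entire journal, which is one reason the metric is so hotly debated.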

But a low impact factor does not mean the journal is disreputable. Dasgupta says she used to be skeptical of lesser-known journals with lower impact factors. But then, she learned of studies showing that researchers in low- and middle-income countries often find it difficult to be published in high-impact journals. “If you don’t have a Western scientist attached to the paper, it’s hard, in this part of the world, to get published,” says Dasgupta.

Journals in languages other than English may not be given an impact factor rating at all, and may be ranked in other ways. For instance, Massarani, who lives in Rio de Janeiro, says that the Brazilian government has developed its own ranking system for the nation’s journals, through a program called QUALIS. Such rankings, like impact factors, are not perfect, but these types of metrics can signal that a journal is known within the research community.

If you want to know more about a journal’s reputation, it’s time to pick up the phone or send a quick email to an expert. As Sokol’s source showed, researchers have their own opinions about journals’ reputations, and their input can shed light on how a journal is regarded by experts. “There’s this informal, cultural knowledge and biases inside a field that just weren’t obvious at all to me,” says Sokol. Bothina Osama, editor of SciDev.Net’s Middle East & North Africa edition, says that she often seeks perspectives from her network of trusted sources. “When there’s a new journal we don’t know about, or it doesn’t have an impact factor, we look to our intrepid network of researchers,” she says.


Accessing and Assessing Paywalled Studies

So, you’ve found what appears to be a legitimate journal—and perhaps some intriguing scientific papers have caught your eye. Though the number of open-access journals is growing, you might find that the papers you want to read are locked behind paywalls. University libraries pay millions for annual journal subscriptions, and publishers charge users between $15 and $40 for a single paper. If you lack library access and can’t afford to shell out for every paper you need, there are a few ways to track down articles’ full text.

First, search for publicly available versions of the paper. Google Scholar search results sometimes include a link to a PDF posted on a researcher’s website or on a platform like ResearchGate. Chawla also recommends a browser plug-in called Unpaywall, which identifies freely available copies of journal articles.


If you can’t find a publicly available copy, check whether you have access to the journals through journalism organizations you belong to. Members of professional groups such as the Association of Health Care Journalists and National Association of Science Writers have access to some publishers’ journals and databases. Some publishers, like Elsevier and Springer Nature, allow journalists to apply for access to their papers, and services like EurekAlert! also provide full-text PDFs of new articles to journalists.

Journalists with access to papers through their employer or through a university library affiliation might also be willing to share articles on a one-off basis.

Some tools that journalists and others commonly use to access paywalled journal articles are controversial because they can enable copyright infringement. One is the Twitter hashtag #ICanHazPDF, a strategy that relies on the kindness of strangers willing to fulfill an article request. Another is Sci-Hub, a website that posts copies of paywalled articles for free. (Its creator is being sued for copyright infringement by multiple journal publishers.)

Finally, you can reach out to study authors directly to request their papers. Most papers list a corresponding author, and in Dasgupta’s experience, emailing that author has yielded a quick response. She says she’s taken to emailing authors requests for their paper even if she already has found access to the paper elsewhere, because it serves as a starting point for her reporting. “If they respond, I can try to convince them to talk to me about the study as well,” she says.

After vetting a journal, accessing papers, and reaching a study’s authors, the meatiest work of a journalist begins: digging into researchers’ findings and conclusions. And sometimes, after all these early steps, you’ll discover major flaws in the research you’ve found, even if it’s published in an excellent journal and the researchers seem competent. “Every journal publishes low-tier work even if they are not considered a low-tier journal,” says Oransky. “At the end of the day you really need to judge each paper based on what’s in the paper, rather than what’s in the journal.”

For instance, Osama recalls commissioning a piece at the beginning of the pandemic based on a paper she saw in a prestigious journal. For outside comment, the assigned reporter called a source, who eviscerated the paper. “He said this paper doesn’t deserve to be published—that it was not worth the ink it was written in,” says Osama. After that call, Osama re-examined the paper and agreed the source’s criticisms had merit. Ultimately, she and the writer decided to kill the piece.

At the end of the day, there’s no substitute for solid reporting, even if your initial research makes a new paper look promising. “Just because it’s in a journal doesn’t mean it’s going to hold up,” says Oransky. “Trust but verify.”




Jane C. Hu is an independent journalist, a regular contributor to Slate’s Future Tense, and a member of The Open Notebook’s board of directors. Her work often focuses on the intersection of science, culture, and technology, and her writing has recently appeared in publications like Undark, High Country News, Science, and Outside. Find her on Twitter @jane_c_hu.
