Iffy.news: An index of unreliable sources

When the conversation turns to estimating how many online sites deliver news articles with made-up facts, out-of-context assertions, and purposefully misleading information (also known as dis- and mis-information), most folks shrug and say: “Too many.”

But that wasn’t good enough for former RJI fellow Barrett Golding, who was aiming for a more exact count. That pursuit led to the creation of his “Iffy Index,” a list of, at last count, 400 online sites that feature unreliable news and information.

He’s officially launching the index today and hopes it will be a helpful tool for researchers and journalism scholars who are trying to untangle the convoluted world of news that is designed to mislead and confuse.

To learn more about the inspirations and aspirations for the Iffy Index, RJI asked Barrett a few questions:

So there are 400 sites in the Iffy Index, which seems like a lot. Did that number surprise you? Do you think you’ve captured most of them?

My guess: At any one time, about 600 fake-news sites are making millions in ad dollars by misleading the public. The index doesn’t capture them all, but it probably has about 95% of the most active and profitable.

Iffy.news only includes sites with low or very-low Media Bias/Fact Check factual-reporting levels. That MBFC rating is based on fact-checks by International Fact-Checking Network signatories. Everything’s transparent and documented: The index is a curated list of unreliable sources. Each source links to its MBFC review and to fact-checks of its stories (via Iffy’s custom Fact-check Search tool).
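For a concrete picture, one index entry boils down to something like the record below; the field names and URLs are my own illustration, not the dataset’s actual schema:

```python
# One index entry, sketched as a Python dict.
# Field names are illustrative, not the real dataset's schema.
entry = {
    "domain": "example-site.com",     # the unreliable source
    "factual_reporting": "very-low",  # MBFC level: only "low" or "very-low" qualify
    "mbfc_review": "https://mediabiasfactcheck.com/...",  # the source's MBFC review
    "fact_check_search": "https://iffy.news/...",  # fact-checks of its stories
}
print(entry["domain"], "->", entry["factual_reporting"])
```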

However, THE INDEX IGNORES BIAS. Did I say that loud enough? Political bias is not a factor in whether the Iffy.news Index of Unreliable Sources includes a site. The only consideration is whether the site fails fact-checks: whether it repeatedly wastes fact-checkers’ time by forcing them to refute its false claims.

What are your plans for keeping the site updated, both in adding new sites and deleting defunct ones?

Every month I (programmatically) pull in the most recent MBFC ratings, then (manually, for now) update the database with any new or changed records. I also check that against other credibility ratings (especially NewsGuard’s trust ratings). The different ratings agencies mostly agree, but I’ve delisted sites in the few cases where they don’t.
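A minimal sketch of that monthly cross-check, assuming both ratings sets have been exported to CSV (the file names, column names, and rating values below are hypothetical stand-ins, not the actual MBFC or NewsGuard formats):

```python
import csv

def load_ratings(path, domain_col, rating_col):
    # Map each domain to its rating from a CSV export.
    with open(path, newline="") as f:
        return {row[domain_col]: row[rating_col] for row in csv.DictReader(f)}

mbfc = load_ratings("mbfc_ratings.csv", "domain", "factual_reporting")
newsguard = load_ratings("newsguard_ratings.csv", "domain", "trust_rating")

# Candidates: sites with low or very-low MBFC factual-reporting levels.
candidates = {d for d, r in mbfc.items() if r in ("low", "very-low")}

# Delist the few sites where the agencies disagree, e.g. NewsGuard
# rates the site trustworthy (the "trusted" value is illustrative).
delisted = {d for d in candidates if newsguard.get(d) == "trusted"}
index = sorted(candidates - delisted)

print(f"{len(index)} sites indexed, {len(delisted)} delisted for disagreement")
```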

What’s your relationship with Media Bias/Fact Check?

Quite cordial: Owner Dave Van Zandt gave me permission to harvest his ratings. I try to return the favor by sending him information MBFC might find useful, such as sites not in MBFC that others have rated.

For my purposes, MBFC has the best data. They’ve been at this longer (3,000 site-reviews since 2015) and have a longer list of low-quality sites than anyone else.

And MBFC shows their work. Their reviews are detailed and public, not behind a paywall. The trust-ratings biz is stuffed with startups. Most keep their data secret or subscription-only. I come from the open-source world where data-sharing, cross-checking, and cross-pollinated methodology mashups make everyone’s business better.

You’ve said that the Iffy Index is a helpful tool specifically for journalism scholars and researchers. How do you anticipate they’ll get the most use out of it?

Mis/disinfo researchers rely on fake-news lists for their studies. But those lists are flawed. They’re undocumented and outdated, full of old 404s and missing many new, active mis/disinfo spreaders.

For my own research projects, I needed a more reliable dataset of unreliable domains. I didn’t need all the unreliables, but I did need the index to be, with a high degree of certainty, only unreliables.

Also, we judge news credibility, in part, by transparency. I felt news research should be similarly judged. So I wanted the index to transparently document the reasons for each site’s low rating.

Thus was born Iffy.news. If you, dear reader, are training your AI algos to spot fake-news or calculating fake-to-fact ratios in social media shares, Iffy might be your huckleberry.
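For example, a fake-to-fact ratio study might use the index as its label set, along these lines (a rough sketch; the index export’s file and column names and the share-log format are my assumptions):

```python
import csv
from urllib.parse import urlparse

# Load the Iffy index as a set of unreliable domains.
with open("iffy-index.csv", newline="") as f:
    unreliable = {row["domain"].lower() for row in csv.DictReader(f)}

def domain_of(url):
    # Normalize a shared URL down to its bare hostname.
    host = (urlparse(url).hostname or "").lower()
    return host.removeprefix("www.")

# shares.csv: one shared URL per row (hypothetical input format).
fake = fact = 0
with open("shares.csv", newline="") as f:
    for row in csv.DictReader(f):
        if domain_of(row["url"]) in unreliable:
            fake += 1
        else:
            fact += 1

print(f"fake-to-fact share ratio: {fake}:{fact}")
```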

Have you been able to spot any similarities between sites that feature unreliable news stories? Are there clues that might help identify the iffy sites from the reliable ones?

The culprits peddling Covid-19 cures and conspiracy theories are easy to spot. But most fake-news sites look like legit news — the same layout, sometimes the same ads. Many mix real news with the fake (or perhaps they’ve forgotten the difference).

You can’t tell at a glance. (I complained in this very venue about how newspapers could, but don’t, distinguish their websites from fake newsies.) In general, though, if many headlines are alarmist and accusatory, and few are informational, you’d best get your news elsewhere.

This gif shows some examples of unreliable news sites that are featured in the Iffy News index.

Do you think the index offers any benefits for the news industry and, specifically, for the many consumers/viewers of journalism who are trying to protect themselves from sharing disinformation and unreliable information?

Not yet, I have to admit. It’s made for news researchers. Down the road, though, this may help news writers and readers slow the spread of mis/disinfo.

The fake-news infestation is fed and sheltered by adtech dollars. High-tech companies are trying to algorithmically contain the outbreak with blockchains and neural nets.

My approach is lower-tech: Gather existing data. Share research. Build publicly available open-source tools.

Together we may be able to build a dynamic, accurate, up-to-date bad-site blocklist. One used by advertisers for brand safety, by platforms to quit being super-spreaders, and by people to fight fake-news. I firmly believe we can flatten the infodemic curve by collaborating.

(Disclosure: I’m trained neither as a researcher nor a statistician. My career was in public radio and web production. But during my year as an RJI fellow I’m pretty sure I was brainwashed. Ever since then, I’ve been obsessed with the state and fate of journalism. Future fellows beware.)

Some folks say disinformation is in the eye of the beholder and, judging from some of the selected comments on your site, you’ve heard from them in one form or another. What’s your take on how to evaluate a news site’s credibility?

We all have a built-in B.S. detector. The more it’s engaged, the better its accuracy. Before sharing an article, or when a news story seems suspect, do a casual fact-check. It usually takes only a couple of minutes. (Two minutes too long? Then try Mike Caulfield’s 30-second fact-check.)

Think of fake-news like a virus: Go the extra social distance to make sure you don’t infect anybody. Be extra skeptical of articles that confirm your beliefs/biases: That’s how the clickbait-ers hook you.

Check the source. Who wrote it? Where’d they get their info — the primary source? Does the primary source’s info match what’s in the article? Google the headline: What other outlets are carrying the story? Are those outlets legit?

Iffy’s Fact-check Search looks through 20 fact-checking sites for mentions of a topic or publisher. And I’ll soon post a tool that quickly checks a news source in Wikipedia (thanks to a WikiCred microgrant).
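The Wikipedia tool isn’t out yet, but the general idea can be sketched against the public MediaWiki search API (my illustration of the approach, not his implementation; the User-Agent string and example query are made up):

```python
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

API = "https://en.wikipedia.org/w/api.php"

def wikipedia_mentions(source, limit=5):
    # Search English Wikipedia for pages mentioning a news source.
    params = urlencode({
        "action": "query",
        "list": "search",
        "srsearch": source,
        "srlimit": limit,
        "format": "json",
    })
    req = Request(f"{API}?{params}", headers={"User-Agent": "iffy-demo/0.1"})
    with urlopen(req) as resp:
        data = json.load(resp)
    return [hit["title"] for hit in data["query"]["search"]]

print(wikipedia_mentions("Example Daily News"))
```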

Barrett Golding is an audio/web producer. He was Executive Producer of Hearing Voices from NPR, a Peabody Award-winning weekly hour. He is a WordPress.org contributor, was 2015–2016 Fellow at the Reynolds Journalism Institute, 2010 United States Artists: Media Fellow, and consults on the Teaching Hard History podcast (Southern Poverty Law Center).
