Illustration from L.M. Glackens’ The Yellow Press (1910) via The Public Domain Review

By Laura Hazard Owen, for Nieman Lab

The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

What’s coming. Nieman Lab asked 200 people in and around media to make predictions for 2019. Some misinformation-related highlights are below — along with fake-news-related predictions and preparations for 2019 from other sources, so consider this a look at the year ahead. (Warning: People do not agree about anything.)

Panic about deepfakes. Or don’t! Deepfakes — realistic videos created with AI software — have “frightening implications for journalism,” Rubina Madan Fillion, director of audience engagement at The Intercept, writes, and “while there are tricks and algorithms to help detect deepfakes, creators have already found ways to counteract them.”

Except — quit it with the deepfake fearmongering headlines, First Draft’s Claire Wardle writes. What she’s really worried about is the hyperpartisan memes shared among family and friends in closed messaging spaces like Facebook groups, WhatsApp, Snapchat, and Instagram Stories. “As we spend more time in these types of spaces online, inhabited by our closest friends and family, I believe we’re even more susceptible to these emotive, disproportionately visual messages. (Goes without saying we need much more research on this issue so we have a greater understanding of the impact of messages that travel between trusted connections.)”

Think outside the U.S. We’ll remember that misinformation isn’t solely a Western problem, writes Africa Check’s Peter Cunliffe-Jones. His focus is on the Global South. “2019 will see landmark elections in two of Africa’s most important countries, perhaps the only ones that get any consistent attention from the outside world: Nigeria and South Africa. Coming after the way misinformation was seen to affect elections in Brazil this year, that’s one reason I think the geographic focus on the impact of misinformation will shift in 2019.” (As Tshepo Tshabalala notes, general elections are taking place in more than a dozen African countries this year, as well as in India and Indonesia.)

Cunliffe-Jones adds that “in complex tinderbox societies, the potential for mis- and disinformation to sow not just social discord but real violence is very clear,” and some of the tactics attempted in the West — like “publishing evidence-based reports and leaving things there” — won’t be enough. Moreno Cruz Osório has some ideas for Brazil here.

The death of consensus. “In 2019, let the idea that we’re seeing the death of truth die,” writes An Xiao Mina. “What looks like the death of truth is actually the death of consensus, and a broader transition to a world of dissensus nudged along by a wide variety of media outlets online, on television and radio, and in other forms of media. Misinformation spreads most effectively in this environment because someone, somewhere will find information that fits an existing worldview, and it’s that deeper worldview that’s much harder to change.”

In a climate where consensus can’t be taken for granted, journalism will have to change too. Let’s be okay with uncertainty, Alberto Cairo writes. “In 2019, we’ll all learn to be less certain about our beliefs. We may even pay attention to cognitive psychologists who explain that the best way to become aware of our knowledge gaps is to try to explain our opinions to others without taking logical leaps or relying on arguments from authority. We’ll be humbled by our many failures at these attempts.”

And speaking of hackneyed topics like the “death of truth,” Mike Caulfield has another idea he wishes we’d retire: The cynical notion that media literacy is pointless.

Of course, perspectives shift. Once a person subscribes to a page or channel, what Claire Wardle calls the drip, drip, drip of radical content begins to wear at one’s worldview. But this process so often seems to begin through a series of small mistakes, little neglects that eventually lead to more permanent results. In reality, many forms of both radicalization and infiltration would be more difficult with a media literate audience — particularly if those with the most influence had better skills and habits around assessing reputation and intent.

Reporting better. Hey, influencer: Want to build those “skills and habits around assessing reputation and intent”? First Draft’s Wardle has five lessons for reporting in an age of disinformation, listing the skills she’d like to see newsrooms help their reporters develop in 2019. Among them: Don’t give disinformation additional oxygen.

Our work suggests that there is a tipping point when it comes to reporting on disinformation. Reporting too early gives unnecessary oxygen to rumors or misleading content that might otherwise fade away. Reporting too late means the falsehood takes hold and there’s really nothing to do to stop it (it becomes a zombie rumor — those that just won’t die).

There is no one tipping point. The tipping point differs by country but is measured when content moves out of a niche community, starts moving at velocity on one platform or crosses onto other platforms. The more time you spend monitoring disinformation, the clearer the tipping point becomes, which is another reason for newsrooms to take disinformation seriously. It is also a reason to create informal collaborations so newsrooms can compare concerns about coverage decisions. Too often newsrooms report on rumors or campaigns for fear that they will be “scooped” by other newsrooms, when again, this is exactly what the agents of disinformation are hoping for. Having every newsroom publish a QAnon explainer back in August after people turned up at Trump rallies with Q signs and t-shirts was exactly what the Q community had hoped would happen.

Plus two looks back. It is not possible to debunk everything. Africa Check deputy editor Lee Mwiti wrote about 10 claims that the team tried — unsuccessfully — to fact-check. Often, the data simply wasn’t available. It’s refreshing to see a list of failures from a team that is otherwise doing lots of good work, and a good reminder that fact-checking isn’t always straightforward.

And here are BuzzFeed’s Craig Silverman and Scott Pham with the 50 biggest fake news hits on Facebook in 2018. “In spite of a prediction from Facebook’s top anti-misinformation product manager that these articles would see a decline in engagement in 2018,” they write, “this year’s top-performing hoaxes generated almost as many shares, reactions, and comments as last year’s” — a combined 22 million shares, reactions, and comments.