Covid hoaxes are using a loophole to stay alive, even after content is deleted
Pandemic conspiracy theorists are using
the Wayback Machine to promote “zombie content” that evades moderators and
fact-checkers.
April 30, 2020
CrowdTangle's results for the deleted News NT story available via the Wayback Machine
While the original page failed to spread its
fake news widely, the
version of the page saved on the Internet Archive’s Wayback Machine absolutely
flourished on Facebook. With 649,000 interactions and 118,000 shares, the
Wayback Machine’s link achieved much greater engagement than legitimate press
outlets. Facebook has since placed a fact-check label over the link to the
Wayback Machine version too, but it had already been seen a huge number of
times.
There are several explanations for
this hidden
virality. Some people use the Internet Archive to evade blocking of
banned domains in their home country, but it is not simply about
censorship. Others are seeking to get around fact-checking and
algorithmic demotion of content.
Many of the Facebook shares are to
right-wing groups and pages in the US, as well as to groups and pages critical
of China in Pakistan and Southeast Asia. Most of the interactions on the News NT
Wayback Machine link come from a public Facebook group, Trump for President
2020, which is administered by Brian Kolfage. He
is best known as the person behind the controversial nonprofit We Build the
Wall. Using the technique of keyword
squatting, this group has sought to capture people looking to join
Facebook groups related to Trump. It now has nearly 240,000 members, and the
public group has changed its name several times: from “PRESIDENT DONALD TRUMP
[OFFICIAL]” to “President Donald Trump ✅ [OFFICIAL],” then “The Deplorables ✅,”
and finally “Trump For President 2020.” By claiming to be Trump’s “official”
page and using an impostor check mark, groups like this can engender trust
among an already polarized public.
When looking for more evidence of hidden
virality, we searched for “web.archive.org” across platforms. Unsurprisingly,
Medium posts that were taken down for spreading health misinformation have
found new life through Wayback Machine links. One deleted
Medium story, “Covid-19 had us all fooled, but now we might have
finally found its secret,” violated Medium’s policies on misleading health
information. Before Medium’s takedown, the original post amassed 6,000
interactions and 1,200 shares on Facebook, but the archived version is vastly
more popular—1.6 million interactions, 310,000 shares, and still climbing. This
zombie content has better performance than most mainstream media news stories,
and yet it exists only as an archived record.
Data from CrowdTangle on the original Medium post and on the archived version
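For readers curious about the mechanism, here is a minimal sketch of how one might check whether a deleted page still survives as a Wayback Machine snapshot. It uses the Internet Archive's public "availability" endpoint; the Medium URL in the example is a hypothetical placeholder, not the actual deleted post.

```python
# Minimal sketch: query the Internet Archive's public "availability" API
# to see whether a given URL has an archived snapshot that can still be
# shared even after the original page has been taken down.
import json
import urllib.parse
import urllib.request

def find_snapshot(url):
    """Return the closest archived snapshot URL for `url`, or None."""
    query = urllib.parse.urlencode({"url": url})
    endpoint = "https://archive.org/wayback/available?" + query
    with urllib.request.urlopen(endpoint) as response:
        data = json.load(response)
    closest = data.get("archived_snapshots", {}).get("closest")
    if closest and closest.get("available"):
        return closest["url"]
    return None

# Hypothetical example: the original post may be gone, but an archived
# copy can remain reachable and keep accumulating shares.
print(find_snapshot("https://medium.com/@example/deleted-covid-post"))
```

A link returned this way points at web.archive.org rather than the original domain, which is why it can slip past blocks, fact-check labels, and demotion applied to the source URL.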
Perhaps the most alarming element to a
researcher like me is that these harmful conspiracies permeate private pages
and groups on Facebook. This means researchers have access to less than 2% of
the interaction data, and that health misinformation circulates in spaces where
journalists, independent researchers, and public health advocates cannot assess
it or counterbalance these false claims with facts. Crucially, if it weren’t
for the Internet Archive’s records, we would not be able to do this research on
deleted content in the first place, but these use cases suggest that the Internet
Archive will soon have to address how its service can be adapted to deal with
disinformation.
Hidden virality is growing in places where
WhatsApp is popular, because it’s easy to forward misinformation through
encrypted channels and evade content moderation. But when hidden virality
happens on Facebook with health misinformation, it is particularly
disconcerting. More
than 50% of Americans rely on Facebook for their news, and
still, after many years of concern and complaint, researchers have a very
limited window into the data. This means it’s nearly impossible to ethically
investigate how dangerous health misinformation is shared on private pages and
groups.
All this poses a different threat than
political or news misinformation, because people do quickly change their
behavior on the basis of medical recommendations.
Throughout the last decade of researching
platform politics, I have never witnessed such collateral damage to society
caused by unchecked abusive content spread across the web and social media.
Everyone interested in fostering the health of the population should strive to
hold social-media companies to account in this moment. Social-media
companies should also create a protocol for strategic
amplification that defines successful recommendations and
healthy news feeds as those maximizing respect, dignity, and productive social
values, while looking
to independent researchers and librarians to identify
authoritative content, especially when our lives are at stake.