According to a new study, misinformation on Facebook receives far more engagement than factual news
According to The Washington Post, a forthcoming peer-reviewed study from researchers at New York University and the Université Grenoble Alpes in France found that misinformation received six times as much engagement on Facebook as factual news.
The study examined posts published between August 2020 and January 2021 on more than 2,500 news publishers' Facebook pages. The researchers found that pages that post more misinformation consistently receive more likes, shares, and comments. According to The Washington Post, this boost in engagement appeared across the political spectrum, but the study found that “right-wing publishers have a much higher propensity to share misleading information than publishers in other political categories.”
The researchers will present the study at the 2021 Internet Measurement Conference in November.
According to the Post, the study measures only engagement, not “reach,” the term Facebook uses for how many people see a piece of content, whether or not they interact with it.
Facebook does not make reach data available to researchers, however. Instead, researchers, including the study's authors, have turned to CrowdTangle, a Facebook-owned analytics tool, to understand and quantify the platform's misinformation problem.
In August, Facebook revoked this group of researchers' access to that data, along with their access to the platform's library of political ads. Facebook claimed that continuing to give third-party researchers access would violate a settlement it reached with the Federal Trade Commission after the Cambridge Analytica scandal, a claim the FTC dismissed as “inaccurate.”
CrowdTangle is the same tool New York Times tech columnist Kevin Roose used to compile regular lists of the most popular Facebook posts, a practice that reportedly irritated top Facebook executives because the lists were frequently dominated by right-wing pages that spread misinformation.
In August, Facebook released a "transparency report" detailing the most-viewed posts on the platform during the second quarter of the year, from April to June, in an effort to rebut claims that misinformation is a problem on the platform. Just days later, however, The New York Times reported that Facebook had scrapped plans for a report covering the first quarter because the most-viewed post between January and March was an article that incorrectly linked the coronavirus vaccine to the death of a Florida doctor, a post that right-wing pages widely shared to cast doubt on the vaccines' efficacy.