According to a new study announced by Université Grenoble Alpes in France and New York University, misinformation on Facebook gets six times more engagement than factual news.
For the study, researchers reviewed posts from the Facebook pages of over 2,500 news publishers between August 2020 and January 2021. They found that pages that regularly share misinformation tend to get more likes, comments, and shares, and that this increase in engagement held across the entire political spectrum. However, according to the researchers, right-wing publishers are more likely to share misinformation in the first place than publishers elsewhere on the political spectrum.
The peer-reviewed study is slated to be presented at the 2021 Internet Measurement Conference this November, though one of the researchers involved said it might be released earlier.
In response to the findings, a Facebook spokesperson pointed out that the research looked only at engagement with posts, without taking their "reach" into account. Reach measures how many people see a piece of content, regardless of whether they interact with it. That data, however, is available only to the owners of Facebook pages, so the researchers could not use it in the study.
Because that information isn't publicly shared, many researchers, including the study's authors, turn instead to tools such as CrowdTangle to understand and quantify the platform's misinformation problem. CrowdTangle is itself owned by Facebook, but a few weeks ago the company cut off some researchers' access to the tool, along with access to its library of political ads.
Facebook claimed that continuing to provide this access would have violated one of its settlements with the Federal Trade Commission (FTC), a settlement reached after the Cambridge Analytica scandal. However, in a statement, the FTC called that justification inaccurate, saying the settlement does not bar the platform from making exceptions for third-party access to data when the research serves the public interest.
Recently, New York Times tech columnist Kevin Roose used the same Facebook-owned tool, CrowdTangle, to compile lists of the posts getting the most engagement on the platform. He found that these lists were routinely dominated by right-leaning pages that shared large amounts of misinformation.
Facebook has been battling the spread of misinformation for a while now, and last month it published a transparency report listing its most-viewed posts from the second quarter of 2021. Soon after, however, the New York Times reported that the company had scrapped plans to release a similar report covering the first quarter of 2021 because the top-viewed post was an article filled with vaccine misinformation.