Far-right misinformation received highest engagement on Facebook
Content posted by news outlets rated as far-right received the highest levels of engagement on Facebook in the months surrounding the 2020 elections, according to a new study. Moreover, among far-right outlets, sources identified as spreading misinformation averaged 65 percent more engagement per follower than other far-right pages, according to the study released by New York University’s Cybersecurity for Democracy on Wednesday.
The study evaluated 8.6 million Facebook and Instagram posts published between Aug. 10, 2020, and Jan. 11, 2021, downloaded via CrowdTangle, Facebook’s own analytics tool. Using lists of U.S. news outlets and their Facebook pages from two independent data providers that rate the political leaning and quality of media, researchers identified 2,973 news and information sources.
Throughout the roughly five-month span, content from far-right sources consistently garnered higher engagement than content from sources of any other political leaning, based on the study.
The study comes as Facebook and other tech giants are under increased scrutiny over their handling of misinformation.
Facebook CEO Mark Zuckerberg is scheduled to appear alongside other top tech CEOs before the House Energy and Commerce Committee later this month.
Democrats have criticized the platform’s content moderation policies, accusing the social media giant of not taking a strong enough approach to tackling misinformation and hate speech.
Republicans, however, have leveled unsubstantiated accusations that the tech giant censors content with an anti-conservative bias. NYU’s study further undercuts those claims by finding that far-right pages received the highest levels of engagement. A report released last month by NYU’s Stern Center for Business and Human Rights likewise concluded that the anti-conservative bias claims are not supported by evidence.
Far-left sources were a distant second in earned engagement, according to the study, even on days when engagement peaked for the more politically “extreme” outlets, such as Election Day or Jan. 6.
For example, the study found that both far-right and far-left outlets saw a boom in interactions on those dates, but far-right pages drew more than 450 interactions per thousand followers while far-left pages drew just under 250 interactions per thousand followers.
The increase in engagement on the two key dates was “much less intense” for other news sources, compared to the more politically extreme outlets, based on the data.
The study also found that far-right sources did not suffer what researchers deemed a “misinformation penalty,” which they define as “a measurable decline in engagement for news sources that are unreliable.” In other words, far-right outlets identified as sources of misinformation outperformed far-right pages that were not flagged as misinformation sources.
Far-right sources of misinformation had 426 interactions per thousand followers per week, compared to the 259 weekly interactions of the far-right pages that were not identified as misinformation sources, based on the study.
In every other partisan category, however, the “misinformation penalty” held, with sources spreading misinformation receiving at least slightly fewer interactions than those that did not.
For example, far-left sources not identified as spreading misinformation had more than 140 weekly interactions per thousand followers, while far-left sources identified as spreading misinformation only earned 60 weekly interactions.
The so-called penalty was smallest among “slightly right” sources, but it was still present: misinformation sources there reached nearly 120 weekly interactions per thousand followers, compared with roughly 130 for non-misinformation sources.
The researchers acknowledged that their findings were limited by the lack of data provided by Facebook: they had access to engagement figures but not to how many users actually saw the content.
Last month, Facebook said it would be piloting ways to reduce the amount of political content users see.