Science eLetter: Social media algorithms can curb misinformation, but do they?

September 27, 2024

Filippo Menczer co-authored an eLetter questioning a study that suggests Facebook's algorithm is not contributing to political polarization on the platform. The study, “How do social media feed algorithms affect attitudes and behavior in an election campaign?” was published in Science in July 2023 and examined how the platform presented information to users around the 2020 U.S. elections.

Menczer was part of an interdisciplinary research team led by Przemek Grabowicz, then a research assistant professor at the University of Massachusetts Amherst. In the eLetter, the team argues that during the study, Meta instituted a series of changes to its algorithm to reduce the spread of misinformation, potentially undermining the study's findings.

The team said the original study analyzed a short window in which Meta had temporarily introduced a more rigorous news algorithm, and that the study's authors did not factor in this change when reporting that the algorithms were not major drivers of untrustworthy news, thus creating a widely reported misperception.

The co-authors state that Meta introduced changes to Facebook's news feed in November 2020, after the U.S. presidential election, to reduce the visibility of untrustworthy news. Those changes were successful, cutting user views of misinformation by at least 24 percent, but they were not made permanent; Meta resumed using its standard algorithm in March 2021.

The eLetter states that the original study ran from Sept. 24 to Dec. 23, 2020, and did not make clear that it was conducted while Meta's more stringent, temporary algorithm was in place. The co-authors suggested the study created the incorrect impression that Facebook's standard algorithm is good at stopping misinformation.

The team suggested that Facebook can limit untrustworthy content but that social media platforms may lack the financial incentive to modify their algorithms in this way. The implication, they argue, is that platforms put profit ahead of potential harm to the public and to democracy.

The authors of the original study reject that argument and stand by their findings, which they say are more limited and nuanced than often perceived. In an editorial, Science Editor-in-Chief Holden Thorp says the journal will not ask for corrections to the paper but will make readers aware of the critiques.