Misinformation Before and After the Election

With nearly a week until Election Day, experts on election-related misinformation and disinformation say that social media companies, including Meta, TikTok, YouTube, and X, continue to roll back their earlier content-moderation commitments.

NORA BENAVIDEZ; for interviews, contact Tim Karr at tkarr@freepress.net 
    Benavidez is Senior Counsel and Director of Digital Justice and Civil Rights at Free Press. She leads Free Press’ democracy and tech initiatives. 

In April, Free Press analyzed 12 major tech companies’ readiness to address political disinformation and concluded that none of the 12 platforms was responding adequately to inaccurate election information. Free Press also led a coalition of more than 200 groups urging those companies to implement interventions to protect platform integrity in the upcoming election; only a few companies responded, and none committed to increasing its numbers of trust and safety employees or content moderators. Several companies have also laid off content moderators in recent months.

Benavidez told the Institute for Public Accuracy: “The number one thing on my mind right now is the anxiety that voters are feeling in the lead-up to November 5 and about what might happen in the days afterward. People’s information diets are varied, but we know that most Americans get some information every day from social media. It’s important to understand what the last few years have looked like and why the disinformation landscape in 2024 is far worse than it was in 2020: the cost of producing and distributing disinformation has fallen to nearly zero, and social media platforms have been retreating from their previous commitments to take even the most basic, low-hanging-fruit measures.”

Free Press has released two reports documenting how social media platforms have rolled back policies that keep hate, lies, and toxicity at bay. “Platforms have engaged in mass layoffs on critical teams like content moderation and trust and safety,” Benavidez said, “and folks like Elon Musk are forcing X users to see and engage with misleading and toxic content. There is more junk and manipulation out there. The biggest platforms have largely turned their backs on content moderation, a real dereliction of duty. Meta has attempted to wash its hands of the problem by depoliticizing its content. YouTube has allowed lies about immigrants to spread, causing real-world harm. Instagram has admitted that its failures are due to ‘algorithmic glitches.’ This is a dangerous circumstance. But platforms have the know-how to moderate content, and they’ve done it in the past.

“I worry that in the days surrounding November 5, people will be desperate for information and will likely have their guard down. We need reporters and media professionals to help bring the temperature down for audiences right now. Our [election] institutions have integrity, and election officials are doing their jobs safely and accurately. The ‘Big Lie’ from 2020 catalyzed people’s fears and doubts that our electoral system is rigged and that officials might be corrupt and influencing the results. We’ve seen the fruits of that lie: election officials have faced more targeting and threats than ever before. But the capture of the legal system that Donald Trump and his allies have attempted has not worked, and it’s important for audiences to know that. That doesn’t mean we’re immune from attack, but it’s still very much worth voters getting out there and engaging in the democratic process.

“The highest-risk period for the election runs from November 6 through December 11, when states will be gathering, tabulating, and certifying results. During that window, reporting on preliminary or premature claims of a victor can spread like wildfire, causing people in one state to share news that the national election has been called. That is the moment when people may be at their most concerned, and when we have to worry about real-world harm and violence. That stretch, from early November through December 11, is critical.

“We saw that occur in 2020. Many of the measures that platforms had in place to keep extremist content and false election claims to a minimum were turned off after the election, and that content surged.”