Rights Groups Urge Meta, Google, TikTok, and X to Protect Voters from Disinformation and Hate Speech

An election year brings with it an influx of worries about misinformation, hate speech, and online violence. With more than half of the world's population heading to the polls across 65 countries, rights groups are concerned that tech companies aren't doing enough to protect voters.

Last month, a global coalition of 160 rights groups across 55 countries joined forces to urge major tech companies, including Meta (Facebook and Instagram), Google (YouTube), TikTok, and X (formerly Twitter), to safeguard voters and elections. The coalition argues that the companies have not done enough to prevent the spread of disinformation, hate speech, and influence operations that undermine democratic processes.

The groups say that the companies have reduced investments in content moderation and restricted data access while continuing to profit from hate-filled ads and disinformation. They are particularly concerned that tech companies are underinvesting in content moderation for non-English languages, with dangerous real-world consequences. That concern was underscored during India's national election, which began this week and where anti-Muslim hate speech has fueled a rise in communal violence.

Last year, a joint investigation by the Legal Resources Centre and Global Witness found that Facebook, TikTok, and YouTube had allowed 10 non-English ads to be published even though they violated the companies' own policies on hate speech. The companies have yet to address the problem or develop better moderation for non-English content.

The issue extends to other countries holding elections in the next year. In South Africa, where polls are scheduled for next month, online xenophobia has manifested in real-world violence against migrant workers, asylum seekers, and refugees. Observers note that social media platforms have not done enough to curb the problem.

The coalition has asked the tech companies to establish transparent, country-specific plans for election safety, including data on content moderation and details of the actions taken to prevent the spread of hate speech and disinformation. Rather than vague, generic commitments, the groups say, the plans should address each country's specific threats and language needs. They also want the companies to disclose how many content moderators work in each country and which languages and dialects they speak.

When reached for comment, TikTok and Google pointed to their public statements on election integrity, as well as some country-specific plans. Meta noted that it has provided extensive information on its preparations for elections in major countries worldwide. X did not respond to requests for comment.

The groups are also concerned that tech platforms are making things worse. Meta, Twitter, and YouTube have collectively rolled back 17 policies aimed at safeguarding against hate speech and disinformation, according to the nonprofit media watchdog Free Press. All three companies have also made recent layoffs that affected content moderation and trust-and-safety teams. Meta's decision to shut down CrowdTangle, an analytics tool used by journalists and researchers to track misinformation and viral content, has further alarmed civil society organizations.

The threat of AI-generated disinformation is also high on the list of concerns. Political deepfakes have already surfaced in countries like Slovakia and Pakistan, and with the U.S. presidential election approaching in November, more of these dangerous manipulations are likely.

The lack of preparedness from tech companies is not a new issue. Just last week, a coalition of civil society organizations, researchers, and journalists sent a letter to social media platforms urging them to take swift action against AI-driven disinformation and to reinforce content moderation, civil-society oversight tools, and other election integrity policies. Until the platforms respond to this growing chorus of concern, we are unlikely to have heard the last of this issue.