Tech Platforms Are Not Doing Enough to Stop Election Violence

Facebook, Google, Twitter, and other social media platforms have played a significant role in spreading misinformation and fueling violence in the past. With the next U.S. presidential election approaching, concerns about online election-related violence are once again at the forefront. These platforms have a duty to ensure that they are not used to incite violence or spread misinformation, but are they doing enough?

The Atlantic reached out to 13 tech companies, including Meta, Google, and TikTok, asking how they were preparing for potential violence around the election. Only seven responded, offering minimal information and pointing to their community guidelines; some highlighted investments in ongoing content-moderation efforts. But content moderation is just one piece of the puzzle. The article suggests six steps platforms should take:

  1. Enforce existing content-moderation policies: Platforms should prioritize enforcing their existing content-moderation policies consistently and effectively. They must ensure that their guidelines are transparent and that all users, including politicians, are held accountable for violating them.
  2. Add more moderation resources: Platforms should invest in expanding their content-moderation teams and bringing them in-house. They should also develop more sophisticated automated moderation tools to help monitor their platforms effectively.
  3. Consider "pre-bunking": Platforms should run public-information campaigns warning voters about potential misinformation and election-related violence, for example through short videos shown before YouTube content or on other platforms.
  4. Redesign platforms: Platforms could redesign their products to reduce the spread of misinformation and violence, for instance by adding warnings to certain posts or restructuring user feeds to throttle "frictionless virality."
  5. Plan for the gray areas: Platforms should develop policies to monitor less formalized groups and individuals that may not have a history of violence but may still pose a threat.
  6. Work together to stop the flow of extremist content: Platforms should collaborate to combat the spread of extremist content and communicate with each other to ensure they are all doing their part to prevent violence.

Overall, these suggestions are incremental checks that stop short of a fundamental restructuring of these platforms to prioritize safety over growth. Critics argue that the sheer size and centralized nature of these platforms make them dangerous, and that fixing social media may require a more radical approach. It remains to be seen whether these companies will take the actions necessary to prevent election-related violence. With the 2024 U.S. presidential election just months away, the question is whether they will use the remaining time wisely.

Unfortunately, based on their track records, one can reasonably conclude that these companies will continue to prioritize profits over the health of American democracy.

This is an ongoing story worth keeping tabs on.