Democracy Will Only Work If the Social Media Giants Grow Up  09/19/2020 13:00:00   Ann Ravel

Last weekend, the president of the United States urged Americans to vote twice in the upcoming election. This brazen (and illegal) suggestion spread quickly across social media and once again underscored the unprecedented risks of this election season: the Covid-19 pandemic, an onslaught of disinformation, and online echo chambers stoking vitriol that could turn to violence. With more Americans than ever working, going to school, and gathering online, social media platforms have an urgent responsibility to step up in order to ensure the integrity of this election. So far, they haven't done nearly enough.

As a former chair of the Federal Election Commission, I hold this issue near and dear to my heart. During my tenure I made the changing role of technology in our elections a major focus, and I know there's a road map to protect them. Unfortunately, the FEC currently lacks even a quorum, and therefore cannot act. Quorum or no quorum, the FEC has been discussing online advertising for five years and has failed to regulate the industry in any way. Protecting the 2020 election requires social media companies to act now.

To be sure, companies across Silicon Valley have taken some important steps. Facebook's Voting Information Center, Twitter's expansion of its civic integrity policy, and YouTube's crackdown on videos using hacked materials are a strong start. But, as we saw in the wake of Trump's double-voting comments, their current actions don't go nearly far enough. Unless platforms take additional, proactive steps soon, the United States will be caught flat-footed against disinformation and distrust, whether those seeds are planted by online trolls or the sitting president.

A new Election Integrity Roadmap released by the nonprofit group Accountable Tech shows that a different path is possible. Created in conjunction with leading technologists, civil rights leaders, and disinformation experts, the Roadmap outlines tangible steps that platforms can take to defend the integrity of the November elections. Because the recommendations are grounded in platforms' existing policies and technologies, they can immediately be implemented at scale to help social media companies responsibly navigate everything from early voting through the official certification of results.

During the early voting period, as the Roadmap lays out, platforms should implement an election integrity strike system to limit the reach of repeat disinformation super-spreaders. Research shows that a disproportionate amount of harmful misinformation on social media platforms can be tied back to a relatively small number of accounts, groups, and websites. By imposing a series of escalating limitations with each new infraction, platforms can crack down on these actors before the most volatile period of election season.

As Election Day nears, platforms should increase their efforts: growing their capacity to monitor electoral content, temporarily turning off harmful algorithms, such as Facebook Group recommendations, that push users toward divisive content, and creating a Platform Poll Watchers program to serve as the first line of defense against disinformation. Just as election observers are deployed at the polls, social media companies would create specialized verification labels for state election directors and nonpartisan civil society groups, allowing them to promote credible information, flag specific pieces of misleading content, and counter false narratives in real time.

Platforms' responsibilities don't end when the polls close. The spread of disinformation as ballots are being counted has the potential to cause chaos and even incite violence. The Roadmap offers a powerful idea: Just as the Voting Rights Act required certain states to pre-clear new voting laws, social media platforms should require highly influential accounts, including President Trump's, to pre-clear election-related posts. These posts would be subject to proactive detection and rapid human review, blocking content that would violate incitement-of-violence or civic integrity policies.

With less than two months until the election, social media giants have so far shown that self-regulation is not enough. Ultimately, we need Congress to step in and provide clear laws to prevent the spread of online misinformation, but that is not going to happen before November. That's why it's more important than ever for social platforms to act responsibly.
