Lights, Camera, Sedition: Why and how Trump & Co. should be removed from online platforms.

Julia Schur
7 min read · Jan 8, 2021

As armed seditionists shattered the windows of the U.S. Capitol, taking breaks from the bloodshed for photo-ops in Nancy Pelosi’s office, Americans realized the fragility of their democracy.

The Vice President, Senators, Representatives, and staff were rushed to secret locations, journalists bravely continued reporting, and Trump continued tweeting. Throughout Trump’s presidency, beginning in Charlottesville in 2017 and hopefully ending January 6, 2021, lives have been lost and people have been bruised and imprisoned, because words online have an impact in the real world. Trump’s preferred places to display his love for all people who love him back have been Twitter and Fox News. While Fox News is considered a “publisher” under Section 230 of the 1996 Communications Decency Act, Twitter is instead considered an “online intermediary,” or a “host.” The difference may seem a subtlety amid this chaos, but it is key to understanding why Trump and his enablers must have their online accounts suspended.

The danger Trump poses may diminish when he is no longer Commander in Chief, but his ability to influence extremist members of his base — and others — will remain.

On January 6, following Trump’s messages of hatred and disinformation on Twitter, in which he continued to baselessly claim the election was “stolen,” his base took over the People’s House. Twitter suspended Trump’s account for a few hours.

Later, Facebook’s Mark Zuckerberg announced that the company would block Trump’s account until President-elect Biden takes office, because Trump is “undermining the peaceful and lawful transition of power to his elected successor, Joe Biden.” Zuckerberg’s realization came much too late, as the peaceful transfer of power had already been wrecked.

While it may appear futile to block the accounts now, particularly for only hours or two weeks, it is a step in the right direction for platforms to take, but an insufficient one that will require real policy changes. It is important to block not only Trump’s account but also those of the people who encouraged the spread of disinformation, ranging from Ivanka Trump and Donald Trump Jr. to Rudy Giuliani, Senator Ted Cruz, and Senator Josh Hawley. The list of accomplices should also include the heads of social media platforms, who knew that years of passive leniency for Trump’s fighting words on their platforms would boil over into the physical world.

The social media accounts of those seditionists must be reviewed to analyze their wrongdoing, as well as the action and inaction of the platforms, from Facebook to Parler, that were on notice. Sheera Frenkel told The Daily that in the weeks leading up to January 6, “Stop the Steal” groups grew rapidly on Facebook. Journalists reported the groups to Facebook, and Facebook would then take them down, but “in its short life of 48 hours it had managed to attract 320,000 people but, more importantly, it spawned hundreds of other ‘Stop the Steal’ groups on Facebook and on Twitter and now they’ve got Reddit boards. And so, that inaction by Facebook, that two days that it took them to notice the group and shut it down was enough time to get their followers united under this banner of ‘Stop the Steal.’”

Section 230 fits into the conversation because it protects companies from being held liable for certain illegal content on their platforms. Unlike Fox News, platforms don’t have an “editorial” influence on the content they share, at least not in the traditional sense. Carrie Goldberg explains in Nobody’s Victim: Fighting Psychos, Stalkers, Pervs, and Trolls that “Section 230 was originally conceived to shield internet companies that ran online message boards — where the majority of user-generated content appeared online — from legal action traditionally lodged against publishers, like defamation and obscenity claims.”

Since 1996, the internet has changed dramatically and our dependency on it has transformed. Platforms have created new narratives, using algorithms (a form of speech) to bolster targeted advertisements and to build extremist bases by recommending new friends for people to connect with.

As reported by Wired:

“Facebook’s own research revealed that 64 percent of the time a person joins an extremist Facebook Group, they do so because the platform recommended it. Facebook has also acknowledged that pages and Groups associated with QAnon extremism had at least three million members, meaning Facebook helped radicalize two million people.”

Furthermore, tech companies coordinate seamlessly between our apps (e.g., Facebook, Instagram, Tinder) to amplify provocative posts, strengthen their impact, and extend their reach. While the internet changed, the laws intended to protect people and free speech have only emboldened tech giants to reap all the rewards of viral content and disinformation, with none of the responsibility. “No other media entity — not Fox News, or the New York Times or even the National Enquirer — is allowed to operate as a vehicle for defamation, threats, or the promotion of hate crimes,” explains Goldberg in her book. The person who tops the charts for defamation and threats just so happens to be the man with the highest national political office and the nuclear codes.

We need to remove Trump & Co.’s accounts because:

1. The algorithms used by platforms include basic features, such as recommending friends to connect with, that reinforce our bubbles of like-minded post sharing. Now that the only wall Trump succeeded in building is the one around him and his cult, we need to muffle the message.

2. Social media has changed how people experience the world, just as cars and television did, which calls for appropriate regulation. The attack on the Capitol was an attack on democracy punctuated by selfies, live streams, and posed shots inside the offices of Congressmen and Congresswomen, on the floor of the Senate, and under the Rotunda, captured while fraternizing with Capitol Police officers. “At least in their minds, the true seat of power is not actually in that building. It’s online,” tweeted Elise Thomas, a freelance journalist.

Trump encourages unpatriotic behavior and expects to see it carried out on his Twitter home page. Platforms manipulate many of our daily behaviors (checking our phones compulsively, taking out our cameras for anything and everything), and they bear some responsibility for controlling the dangers that come from misuse of the social media culture they created.

3. Trump’s disinformation spreads faster than any platform’s content moderation practices can keep up with. Platforms currently use a “post by post” approach to controlling disinformation spread by Trump and his helpers. Instead, it is critical for platforms to get those who break the rules out of the stadium, by tracking who spreads disinformation and other harmful content, and how frequently they do so, to stifle the spark before it turns into a raging fire. It would be incomplete to condemn Trump and his coup following alone.

4. Keeping Trump on social media only brings more attention to his harmful content. The current take-down practices merely add to the perception that conservative voices are being “silenced,” a myth that has been rejected by the courts, and fuel calls for a revolution. To be clear, while Section 230 does not require companies to be neutral, not all speech ought to be hosted online. In fact, because platforms are private entities, they have First Amendment protections for their own speech. They can decide to flag claims as “disputed,” take down content, and pause or terminate certain harmful accounts. These company announcements, however, end up shining a spotlight on the very content the platform was intending to remove from the public eye.

5. Allowing Trump back on social media following his attempted coup may further expose platforms to criminal charges. Section 230 is a liability shield, but it has no effect on the applicability of federal criminal statutes.

The Proposal:

Section 230 needs to return to its original purpose, and we must condition the liability shield on reasonable content moderation practices. “Doing nothing has costs[,]” explained Danielle Citron, professor at the University of Virginia and vice president of the Cyber Civil Rights Initiative. In The Internet Will Not Break: Denying Bad Samaritans § 230 Immunity, Citron and her co-author, the journalist Benjamin Wittes, brilliantly make the following proposal:

“§ 230(c)(1) should be read to apply only to Good Samaritans envisioned by its drafters: providers or users engaged in good faith efforts to restrict illegal activity.[…] A broader though still balanced approach would be to clarify the reach of § 230(c)(1), which could be revised as follows: ‘No provider or user of an interactive computer service that takes reasonable steps to prevent or address unlawful uses of its services shall be treated as the publisher or speaker of any information provided by another information content provider in any action arising out of the publication of content provided by that information content provider.’”

Platforms need to bear some of the responsibility for democratic digital speech to safely exist.

Platforms amplify certain voices, which in turn are used to gather groups of like-minded people who benefit from viral content spreading in the form of posts or advertising. It has happened before, and it will happen again unless there is a profound change.

Think of Twitter as a public meeting hall, rented by Trump for a meeting with his supporters (whom Twitter helped assemble). At that meeting, Trump directed his supporters to “walk down Pennsylvania Avenue” and “go to the Capitol” to “try to give our Republicans, the weak ones because the strong ones don’t need any of our help, […] give them the kind of pride and boldness that they need to take back our country.” For many crimes, knowing that an illegal activity is taking place at a site you control makes you liable for what happens later. This is the model that should be applied to the public meeting halls on platforms.

Twitter provided Trump with the virtual space and megaphone to start an insurrection that has killed five people at last count. Twitter knew enough about what was going on to be investigated as an accomplice. In the words of Dr. Mary Anne Franks, professor at the University of Miami School of Law and president of the Cyber Civil Rights Initiative, “custodians of virtual spaces should have similar civil rights obligations as the custodians of physical spaces.”


Julia Schur

Lawyer who writes about privacy, civil rights, intellectual property and technology.