Social Media Censorship

Several days after the Charlottesville protest, where neo-Nazis clashed violently with counterprotesters, technology companies, including Google and GoDaddy, began shutting down neo-Nazi and white supremacist pages. The Daily Stormer, a neo-Nazi and white supremacist website, was among many far-right pages removed for hateful content.

However, this censorship raises an important question: Is it a violation of the First Amendment for online platforms to monitor hate speech?

The Electronic Frontier Foundation (EFF), an organization dedicated to defending civil liberties in the digital world, believes such censorship threatens free speech.

“Protecting free speech is not something we do because we agree with all of the speech that gets protected,” the EFF wrote in an article published after the Charlottesville events. “We do it because we believe that no one — not the government and not private commercial enterprises — should decide who gets to speak and who doesn’t.”

The organization also cited concerns that companies might begin to abuse their filtering power, extending the practice beyond hateful content to the media we use every day.

Cloudflare CEO Matthew Prince agrees. Cloudflare is a company that provides content delivery, internet security and domain name services. Prince said censoring hateful content sets a precedent that could be used to justify any type of censorship.

Unlike GoDaddy and Google, Cloudflare initially delayed removing The Daily Stormer from its services.

Seeing that Cloudflare had not intervened, The Daily Stormer began to suggest that Cloudflare approved of the site. On Cloudflare’s blog, Prince recalled that The Daily Stormer claimed Cloudflare employees were “secretly supporters of their ideology.”

Angry, annoyed and frustrated, Prince called his legal team and gave the go-ahead to take down The Daily Stormer. He later publicly apologized and criticized himself for doing so.

“Literally, I woke up in a bad mood and decided someone shouldn’t be allowed on the Internet,” Prince said in an email to Cloudflare staff. “No one should have that power.”

In saying that he decided someone “shouldn’t be allowed on the Internet,” Prince was referring to his removal of The Daily Stormer from Cloudflare’s services.

In a blog post later that day, Prince said, “You, like me, may believe that the Daily Stormer’s site is vile. You may believe it should be restricted. You may think the authors of the site should be prosecuted. Reasonable people can and do believe all those things. But having the mechanism of content control… subverts any rational concept of justice.”

Like the EFF, Prince opposes taking down these hateful pages because he believes removing content for the ideas it contains is a violation of free speech.

Google has also taken steps to monitor hateful content. In February, the company unveiled a new system that uses an algorithm to filter content based on its level of profanity rather than on the unpopularity of the ideas it contains.

Its message was clear: anything can be said, as long as it is done professionally.
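
The article does not describe how such a filter works internally, but a minimal sketch of profanity-level scoring, as opposed to filtering on ideas, might look like the following. The word list, threshold and function names here are hypothetical illustrations, not Google’s actual system.

# Hypothetical sketch of profanity-level filtering (not Google's actual system).
# A comment is scored only on flagged words, not on the opinions it expresses.
PROFANE_WORDS = {"damn", "hell"}  # placeholder word list
THRESHOLD = 0.2  # hypothetical fraction of profane words that triggers filtering

def profanity_score(text: str) -> float:
    """Return the fraction of words in the text that appear on the profanity list."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    if not words:
        return 0.0
    return sum(w in PROFANE_WORDS for w in words) / len(words)

def should_filter(text: str) -> bool:
    """Filter based on profanity level alone, regardless of the idea expressed."""
    return profanity_score(text) >= THRESHOLD

Under this kind of rule, a politely worded post expressing an unpopular idea passes, while a profanity-laden post expressing a popular one does not.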

Once the Charlottesville protest happened, Google started censoring hateful content in addition to profanity.

Google’s original intent was to keep users from posting excessive profanity. However, once other companies began cracking down on hateful content, Google joined in.

Many Paly students disagree with the EFF’s and Cloudflare’s reasoning and with those who believe that hateful speech should be preserved.

Junior Bridget Leonard believes that online content should be monitored by social media platforms.

“Hate speech…[is] offending and harming [to] other people,” Leonard said. “It should be restricted.”

Since the rapid censorship of hate speech began, the limits of free speech have been called into question. Organizations like the American Civil Liberties Union of Northern California (ACLUNC), which works to defend free speech, would be expected to take a position. However, the ACLUNC has not issued a stance since the protests.

When asked about its perspective on the banning of neo-Nazi and white nationalist pages across online platforms, the ACLUNC did not provide anybody to speak on the subject and declined to be interviewed by phone.

Senior Nicholas Blonstein believes that since social media platforms are private corporations, they should have the right to filter content. He believes that digital free speech is different, because the social media companies set their own terms.

“[These platforms] are private corporations and therefore they have their right to restrict content on their websites,” Blonstein said. “I agree that it’s their right to speak, but that’s only guaranteed in public, and because it’s a private company, that right is surrendered when they’re using their product. That company sets those rights themselves and [these corporations] have the right to [set those rules].”

Paly United States history teacher Justin Cronin agrees and said that users who get censored still have free speech, just not on the digital platform they are using.

“If you agree to those terms of service and then break your terms of service they should have the right [to remove the content because] they are a business,” Cronin said. “I think people still have free speech; they just can’t tweet out to their one million users if Twitter decided to shut them down.”

The debate over private corporations’ right to censor content raises the related issue of net neutrality: the idea that internet service providers (ISPs) should provide fair and equal access to all products and websites. Net neutrality rules are currently in place, but government officials have recently pushed to overturn them.

Should a version of net neutrality apply to these private platforms? If internet service providers are replaced with social media platforms and products, and websites are replaced with ideas, it is, in essence, the same issue.

The entire debate rests on the question of what counts as free speech, something that has been debated for centuries.

Now, however, the stakes are higher. With new and growing platforms for sharing ideas, spreading hateful content no longer requires rallying hundreds of people, lighting things on fire or giving hateful speeches.

This way of spreading hate is at everybody’s fingertips: social media. It comes in the form of Facebook. It comes in the form of Google. In the form of Instagram, Twitter, Reddit and YouTube.

This increasing ability to share ideas and content has also given a few people the power to monitor what millions of others say.

As users’ power to spread content, hateful or not, has increased, so has media companies’ power to restrict it, should they choose to do so.
