Facebook and a lack of tech regulation are responsible for the insurrection on Capitol Hill
Far-right extremism has always been prevalent in America, but the platforms extremists use to organize and disseminate misinformation are relatively new.
On Wednesday afternoon, supporters of President Donald Trump stormed the Capitol in a violent insurrection in which four people died and many more could have. The insurrection followed a Trump rally that took place blocks away, where Trump told thousands of loyal participants exactly what he wanted them to do, saying:
"And after this, we're going to walk down there, and I'll be there with you, we're going to walk down... to the Capitol and we are going to cheer on our brave senators and congressmen and women. And we're probably not going to be cheering so much for some of them. Because you'll never take back our country with weakness. You have to show strength and you have to be strong."
For anyone who has been paying attention to Trump’s rhetoric for the past four years (“stand back and stand by”), particularly following the recent election that he claims was fraudulent, this act of domestic terrorism was, if not inevitable, a very likely outcome. But Trump is not the only one to blame for this.
Behind Trump is an online apparatus of social networks — with Facebook at the center — that enables right-wing extremism to proliferate through disinformation and organization. There has always been a massive far-right presence embedded in the U.S. state machine, including the police. Trump has energized it and given it a focus, while the social networks have given it a platform.
Facebook (and platforms like it, including the Facebook-owned Instagram) has enabled extremists to gather and organize both at home and abroad. It has become a gathering place, if not a promoter, of violent dissension and extremism. And despite suspending Trump’s account until at least inauguration day in reaction to the insurrection, the company still has no coherent plan to limit the spread of misinformation and violence it enables.
Unfortunately, this is not the first time Facebook has gotten blood on its hands. Far from it.
The day Kyle Rittenhouse shot three people and killed two of them during anti-police-brutality protests in Kenosha, Wisconsin in August 2020, a Facebook event page created by a group calling itself Kenosha Guard urged people to come to the city with guns to protect property in what one of its founders called a “general call to arms.” Despite multiple people reporting the page for promotion of violence, it wasn’t taken down until after the shooting.
Even after Rittenhouse’s arrest and indictment for double murder, people are still celebrating and promoting his acts through Facebook. According to The Guardian’s Julia Carrie Wong, a fundraiser for Rittenhouse was shared more than 17,700 times on Facebook, in violation of the network’s rules that prohibit praise or support of mass shooters.
Abroad, the effects of the social network have been even more concerning. As Jacob Silverman writes in The Baffler:
By now Facebook’s ability to catalyze violent behavior—and to help perpetrators of violence organize, find their targets, and broadcast their massacres—should be well known. In Myanmar, Facebook served essentially as the Rwandan radio of the Rohingya genocide. In India, it’s been used to foment pogroms, while frenzies of forwarded misinformation on WhatsApp lead to periodic lynchings of innocent travelers mistaken for child abductors. (The Q Anon-style fear of rampant child endangerment is a prominent feature of Facebook-borne misinformation.) In some predominantly Muslim countries, Facebook has been used to target religious and sexual minorities. In the Philippines, President Rodrigo Duterte has used the platform to propagandize against and persecute his political enemies. In New Zealand, a white supremacist streamed his mass shooting of Muslim worshipers live on Facebook. In a few Middle Eastern war zones, arms dealers used private Facebook pages as a virtual bazaar.
In Canada and America, Facebook is threatening our democracies through the company's massive scale, its targeted advertising capabilities, its function as an all-seeing surveillance platform, and the presence of state-driven disinformation campaigns. But it is now also a place where people are becoming indoctrinated to far-right extremism through misinformation and conspiracy theories, and where those same people go to organize violent events such as the insurrection on Capitol Hill.
As New York Times tech reporter Kevin Roose pointed out on Twitter, the mob that stormed the Capitol is an extremely online one. It is one that has been feeding on conspiracy theories for the past few years; conspiracy theories (including election fraud) pushed by Trump through platforms like Facebook and Twitter, which not only allow them to exist but promote them through algorithms that favour radical content. As the Wall Street Journal reported in May 2020, Facebook’s recommendation algorithms tend to push users toward more extreme right-wing pages and topics, which produce heavy engagement. In the United States, Fox News and the Daily Wire are two of the most popular publishers on Facebook, where their Republican rhetoric and beliefs mix with those of white supremacists, anti-Semites, and violent conspiracy theorists.
All of these platforms — Facebook, Twitter, YouTube — have suspended Trump’s accounts following the insurrection. “For attempting to start a civil war you are banned from our website for 12 hours” is how the joke goes. But will this be a turning point for the platforms to enact stricter measures that discourage misinformation and extremism at home and abroad? If not, will the newly elected Democratic Congress take steps to regulate these companies? Will they at least flag these online groups as the terrorist threats that they are? After all, if these groups weren’t made up of white people, they would have been flagged years ago.
Donald Trump will soon no longer be president, but the ideologies that propelled him forward are still being spread, and the people who believe in them are still organizing. There will likely be more violence, and there will likely be more blood. Trump is responsible, but so too are the social platforms that enable violent extremists to gather, organize, and disseminate misinformation and propaganda.
Until the platforms do something drastic, our democracy is in peril. If it’s not already too late, it will be soon.
Notes
https://thebaffler.com/the-future-sucked/a-most-violent-platform-silverman
https://abcnews.go.com/Politics/trump-told-supporters-stormed-capitol-hill/story?id=75110558
https://techcrunch.com/2021/01/06/facebook-stormthecapitol-hashtag-dangerous-organizations-policy/
Question (please respond in the comments)
Are you in favour of regulating what people can do and say on these social platforms? Or do you believe these spaces should favour freedom of speech?
I’m always looking for writing work! If you have any leads, email: orenweisfeld@gmail.com or DM me on Twitter: @orenweisfeld. My published work can be found here.