Does Social Media Censorship Cause Extremism? Talking To The Black Musician Who Makes KKK Members Rethink Racism

Why are we so divided?! Whether it’s the war in Ukraine or Covid or the 2020 U.S. election or Black Lives Matter or abortion, it feels like there have never been such great divisions in society.

Recently, I had the opportunity to meet Daryl Davis. He is a swing, blues and rock musician who played with Chuck Berry for 32 years. He’s also a Black man who has convinced 200 members of the KKK that racism just doesn’t make sense. Davis and Minds CEO Bill Ottman had some thoughts about how extremism can thrive.

“It’s when the conversation ceases that the ground becomes fertile for violence,” Davis says on the TechFirst podcast. “A missed opportunity for dialogue is a missed opportunity for conflict resolution … if you spend five minutes with your worst enemy, you’ll find something in common. This chasm will begin to shrink. Spend another five minutes, you find more in common and it closes in more.”

There’s a strong perception among people who identify with the right side of the political spectrum that the major social platforms from big tech companies censor or limit their political speech. Former president Donald Trump launched a class action lawsuit against Facebook, Twitter, and YouTube in 2021, with tens of thousands of Americans providing examples as evidence. Elon Musk slammed Twitter’s alleged “strong left wing bias.”

Whether they’re right or not, there’s no doubt that Facebook and other social media giants are intervening more and more in the content they publish, whether Second Amendment posts about gun ownership or information about accessing abortion pills in a world after Roe v. Wade.

A Facebook friend who doesn’t seem insane regularly shares instances of where Facebook deletes or hides her content.

In many cases the reasons seem silly or arbitrary, like an AI that doesn’t really understand the content or get the joke. One example shows a floating tent, captioned “Floating tent sleeps 4 and offers a cool new way to die while camping.” Other deletions seem more understandable, like a thumb with a face drawn on it and a string tied around it in the shape of a noose: it’s not explicitly about lynching, but it’s clearly intended to evoke that imagery. It’s a poor joke and likely to offend. But should it be banned?

Facebook often gets things wrong.

“My account has been restricted,” another friend recently posted. “Someone posted how cockroaches were under the benches in HB and I wrote ‘Burn them all down.’ I meant the bugs, but okay Facebook. Lol.”

But while there’s the mistaken and the comical, there’s also the Covid deniers and the anti-vaxxers and the election conspiracy theorists. Deciding when to censor is difficult, if not impossible.

Elon Musk, whose deal to “save free speech” and hunt the bots on Twitter by buying the platform has fallen through thanks to — according to Musk — the bots on Twitter, has a different standard. As the legal wrangling over the terms of his extrication from his obligations begins, it’s worth considering that standard: the law.

That’s persuasive to a degree, but it also has risks. One of the reasons Facebook introduced its Covid misinformation policy was to save lives, and misinformation can cost lives, as the Highland Park shooting and the January 6 violence show. Misinformation can also spread far faster than any law could be codified or enforced. So it’s understandable that social media networks have felt it necessary to take action.

The question remains: Does social media censorship encourage extremism?

Or, in other words, are the large social media platforms making the problem worse by banning dangerous or false content, creating, perhaps, a gated community: an island of privilege within an ocean of poverty?

Bill Ottman believes so, even though he agrees that some content, such as illegal content, should be censored.

“What do you expect if you throw someone off a website, where do they go?” the CEO asks. “Well, you just have to follow them and you see that they go to other smaller forums with less diversity of ideas, and their ideas get reinforced and they compound.”

This makes intuitive sense.

People are inherently social, most of the time, and if they can’t speak their minds on Twitter or Facebook or YouTube, they’ll find Truth Social or Rumble or Gab or Gettr. Or a Telegram channel that can’t easily be censored, or any of dozens of right-wing or conservative outlets … or left wing, if that’s their persuasion.

Problem is, once they do get there they might just find themselves in an echo chamber of like-minded ideas, one that leads them further down the rabbit hole to more extremism.

“On Minds, we do have pretty strong diversity of thought,” Ottman says. “And so we are an alternative forum where people do go sometimes when they get banned. But I wouldn’t say their views are necessarily amplified when they come because we do have diversity of opinion.”

I believe that’s the goal, but I have to say I haven’t personally seen it on Minds.

In trending tags around #humor, I see a meme about why Biden hasn’t been assassinated yet: “In case you wondered why someone shot Shinzo Abe but not Sleepy Joe … Professionals have standards.” A recommended account has a meme about Trump Tower being the new Florida Guidestones, offering suggestions about how to depopulate government, playing on the recent destruction of the Georgia Guidestones monument. And I have found that anything other than pro-Trump sentiment is met with anger and invective.

Perhaps that is proof of the point.

Still, it might make sense to keep people who are different from you, offensive, or just plain wrong on Facebook, YouTube, and Twitter. It gives everyone a chance for communication and a glimpse of alternate realities. Particularly if social media platforms’ algorithms are changed: still showing more of what we love so we stay engaged and they make more revenue, but also giving us other viewpoints.

Which runs the risk, of course, of making the platforms a living hell for those who don’t want to be confronted by extremist, nasty, or just ill-informed opinions all the time. (Anyone else noticeably decrease their Facebook time around the 2020 U.S. election?)

Davis suggests that maybe the discomfort can be a worthwhile sacrifice, if we are able to adjust our perspective on what offends us.

“I’m of the mindset that I cannot offend you. You can only allow yourself to be offended,” he says. “People say a lot of offensive things. And whether I want to be offended by it or not is up to me.”

Are we willing to tolerate speech that offends us if doing so helps heal some of the divisions in our society?

Davis suggests that it could at least help decrease extremism.

“I don’t think kicking people off of Twitter or Facebook, whatever, causes extremism. What it actually does is cause them to take a route that might lead to extremism. The extremism already exists, and they’re on different platforms and different areas. It’s not uncommon to get kicked from something and move somewhere else. And it’s quite possible that you might go in that direction to somewhere where it already exists, and it embraces you and welcomes you and amplifies you.”

A full transcript of the TechFirst conversation is now available.
