Content moderation is the hard part of any online social platform. It was a big problem early on with email and email lists, with Usenet, and with any chatroom. It’s a problem on websites, particularly discussion sites, and lately with social media. Each platform has its own challenges, but they all share the same core issue: handing jerks a big megaphone to be jerks to others. (Human sinfulness, etc.)
There have been all sorts of ideas about how to manage people’s bad behavior online. Some have tried to solve it with computer algorithms. Some have used community-style policing. Some just turn off comments altogether. (I did that last one for a number of years when stuff got out of hand on this blog.) Some have tried all three. All of them, though, require some sort of norm or moral guidepost to decide what’s acceptable. Most, in an attempt to remain above the fray, tend to default to a model that privileges free speech above safe speech.
The Mozilla Foundation, makers of the Firefox web browser, is trying something kinda different with its planned Mastodon server:
Mozilla Social Private Beta Launch:
You’ll notice a big difference in our content moderation approach compared to other major social media platforms. We’re not building another self-declared “neutral” platform. We believe that far too often, “neutrality” is used as an excuse to allow behaviors and content that’s designed to harass and harm those from communities that have always faced harassment and violence. Our content moderation plan is rooted in the goals and values expressed in our Mozilla Manifesto — human dignity, inclusion, security, individual expression and collaboration. We understand that individual expression is often seen, particularly in the US, as an absolute right to free speech at any cost. Even if that cost is harm to others. We do not subscribe to this view. We want to be clear about this. We’re building an awesome sandbox for us all to play in, but it comes with rules governing how we engage with one another. You’re completely free to go elsewhere if you don’t like them.
I’d love to say those rules will be perfect on day one, but they won’t and we know they never will be. But we’re going to create a way to have a dialogue with the community we want to form, and to be open about what we’re learning.
There’s a good write-up about this over on The Verge.
I like this idea. I’m somewhere in the middle on a lot of divisive topics, but I also recognize that there are numerous strategies being used in politics and online to exploit the fact that you can skew a conversation to one side simply by dragging the goalposts far to one edge, forcing the “middle” to move as a result. (See American journalism and its repeated failures over the past two decades as an example.)
This idea of giving up a stance of “neutrality” and instead basing moderation decisions on mission values is, to my mind, the best of the available options. I guess it’s like democracy… the most inefficient method of making decisions, but the best compared to all the alternatives.