Balanced Content Moderation: Why So Hard?

A follow-up to one of my Twitter videos, prompted by a comment response.
Balanced moderation sounds simple, and simply "applying the same rules consistently" sounds "fair", but only until we consider how rulesets can constitute indirect discrimination and, conversely, how equal application can neglect unequal circumstances.
TL;DR: it is nowhere near as straightforward as people make it sound.
Just a few thoughts, anyway!

My Patreon account for anyone motivated to support my videos here on YouTube:
Or PayPal (incl. recurring payments)

My twitter account, in case any of you wish to follow or engage with me there:
Comments

Imagine if the outraged mob dictated what could be said when you and I started using this platform fifteen years ago. With that in mind, I think you're right.

Mikey-KF

Some years ago Noel said, in a similar context, that there are different modalities of fairness and everyone will prefer the modality which "just happens" to favor them most. So a good heuristic that a moderator is being fair isn't that everyone is satisfied with their decisions, but rather that everyone is equally pissed off and equally convinced that they are being screwed over. I liked this observation so much that I've adopted it as my own, and it's really helped when I've had to adjudicate disputes.

bendrasin

Do not lie about or slander people, do not threaten people, and do not post material that is illegal where the server is located. Things that are "perceived" as threats, in the way certain activists frame it, are not threats. Things that are "considered harmful", in the way certain activists frame it, are just opinions, and opinions are what a main square in the shape of, for example, Twitter is supposed to be used for.

Offer users the tools to moderate their own experience, as this is what adults are supposed to do: control their own lives and be responsible for what they themselves subject their senses to. If you venture out in the world you *will* experience things you dislike, or even abhor, but the onus is on you, as an adult, to manage your own feelings and learn what to stay clear of.

Expecting to not have to experience anything but what you agree with is the position of a child and the tantrums we see online are just further evidence that many lack the maturity to actually partake in the free exchange of opinions. The content is very seldom the problem, the immature users are the problem.

Joorin

I wrote this in a Discord Server where I am a mod:

Let me explain the rules of this server to you:

Rule #1 The mod is always right

Rule #2 If the mod is ever wrong then Rule #1 automatically applies

In the event of a dispute between mods, the Zeroth Rule applies

Rule #0 All bow before the Server Manager.

These rules resulted from extensive prototyping, as can be seen from the discussion process below.

Rules of the server:

Rule #1: We couldn't agree on what Rule #1 should be, so we decided not to have one

Rule #2: People kept on insisting that the debate about Rule #1 wasn't over, so we decided to abandon discussing it.

Rule #3: We haven't gotten around to it, and for God's sake don't suggest any - we still have enough hassle with the other two Rules

😃

michaelnager

Does it matter if people are unhappy? A compromise is where no one is happy. You don't apply rules evenly to make people happy; you apply them evenly because it's the right thing to do.

gemore

Transparency of the rules and processes does more than anything else. Without that transparency of clearly defined rules, "balanced" moderation is impossible. What needs to be done (and I think most people would accept it) is to spell out everything considered a violation. Also, clearer explanations of how the content in question went against the rules would help most people.

For example, let us say I was Elon Musk and I changed Twitter's rules to be the following:

1) No unlawful content.

If someone posts unlawful content (CP, conspiracy to commit a criminal act, etc.), then moderation occurs. The user is sent a message explaining that what they posted was identified as whatever type of unlawful conduct. If this report was as detailed as possible (including timestamps for videos and the like), I don't think too many people would complain if they were acting in good faith. If this moderation also ignored the political or ideological positions of the individuals' accounts (which hasn't been the case in recent history), it would likely be accepted by good-faith actors. That would be balanced moderation and easy to do. People could complain about the guidelines, but so long as the guidelines were clear, people couldn't say the moderation wasn't fair and balanced. So long as mute/block/filter options are given to users for their individual accounts, the end user can further filter their experience to avoid topics they might find objectionable.

glyyytch

I'll add a bit to what you were saying at the end there: the more you intervene on any particular topic, the more you take ownership of and endorse a given viewpoint. Beyond the problem of tying yourself in knots, which is a huge problem in and of itself, the more it becomes the case that you are actively pushing a particular worldview, as opposed to merely facilitating discussion.

macedindu

I empathise with your dilemma. I'd expect all 'right-minded' (uncomfortable with that rather subjective adjective, but it will have to do for now) people to have similar misgivings, no matter on which side of whatever aisle they reside. If not, they just haven't thought about it sufficiently.

At present, I tend towards treating social media platforms as open noticeboards, which should only censor that which is illegal. In this way, the only front on which we have to fight is the law, with which we may disagree. However, whose laws... where? Even this, it seems to me, is not a simple solution for the platforms to police. One would imagine it would be applied to the laws of the poster's region, but content may be legal there whilst illegal in the reader's country. If this is easier to organise than I suspect, and I'm missing something, please let me know. Nevertheless, it seems better than the current illogical debacle.

I anticipate one 'answer' that will be proposed is the ability to identify users. Thereby, the authorities can bypass the platforms (and divest them of responsibility) and target the sources. How people expect that to be implemented and to operate, on worldwide platforms, is beyond my current understanding.

As you may suspect, I have no entirely coherent plan, either.

AnthonyKellett

I actually think even-handedness is probably less important than clarity and transparency. People on the whole accept that these platforms are operated privately, for profit, and that it's not necessarily their job to be guardians of free expression. The business has the right to protect its brand, even if that means excluding some prospective content creators.

What seems to grind the gears of a lot of content creators is that the terms of service are often vague, that they change frequently and that they're applied with no semblance of due process. From there, the inconsistency of application is inevitable. The vaguer the terms, the more interpretation they require and, therefore, the more the application is open to the personal biases of the interpreter.

Even if the terms of service are such that they do indirectly discriminate against some prospective content creators, at least they can be knowable with a sufficient degree of confidence from the outset. Should a conflict arise, furthermore, at least there can be an open and fair process for resolving it. People will accept the terms of a contract, even if they don't like all of the terms. What they don't accept is agreeing to a contract, only for the terms to be twisted, changed or re-defined arbitrarily down the track.

HarrynJessie

In the end, it is hard because people are different and tribal...and you will never please everybody. So... 1) Never assign to malice that which can be explained by ignorance/stupidity. 2) Realize that echo chambers are not productive places (diversity of thought is good and the only way we grow/improve. So even when upsetting...try to allow and encourage it). 3) If you can't let a comment go, be logical, rational, and level-headed in your response. Educating along the way. ("Because I said so" or "The god of content has spoken" are invalid and ineffective tools.) 4) Moderation will never be "perfect" or "fair" or "balanced"...it can only strive to be so. (Accept that and work to get better at encouraging substantive discussion.)

esc

Right! We all say we want even-handedness, as we know it's what we're supposed to say, BUT what we really mean is content moderation of which my socio-political views approve.

stevecass

In my opinion the rules should be very, very simple. E.g. snuff and child p**n are not allowed, and everything else is not a problem as long as no local laws are broken. If someone has a problem with a post, they can send it to the police, who have to decide if there is something to investigate. Because we are talking about so many countries with so many different laws, this might be difficult, but posters usually live in a country and have to abide by that country's laws.
Most importantly, governments like the EU have to roll back regulations which turn platform operators into deputy sheriffs. Also, platforms of a certain size (YouTube, Twitch, Facebook, Twitter and maybe 2-3 more) should not be allowed to have private rules, because they have become too powerful and can easily control public opinion.

hamanime

Yeah, I fully agree with you in that I don't know either. I feel that this is a logical consequence of a more libertarian approach to life. You can't demand freedom and yet want to be taken care of, or push the responsibility you have to take care of your own onto someone else. I am sure there is a logical argument somewhere in there, but I can't yet put my finger on it.

MartinPHellwig

People have the right to be assholes and idiots, as long as they don't force you to listen or directly advocate for direct harm.

thomaslacroix

It's not F'ing Rocket Surgery (a pun on Rocket Scientist and Brain Surgeon):

1: Clear and precise rules.
2: If you are in violation of a rule, you will be informed E.X.A.C.T.L.Y in what way, how, and where: if it was a video, with a timestamp; if it was a post, the offending part will be highlighted.
3: After the post or video is edited so it is no longer in violation of the terms of service/the rules, it instantly gets re-uploaded to the site.
4: There will be no arbitrary enforcement of the rules; the rules will be enforced at all times, always, regardless of who was in breach of them or how.
5: A post that is in no way in breach of any of the rules should never be censored in any way.

Temuldjin

At the end of the day it's hard because different people have different opinions.

You cannot have it balanced so that everyone is okay with it, unless you simply don't moderate at all.

And, with few exceptions, that is really how it should be, just as it is in real life.

You have a right to shut your door and expect not to hear it all night and that's about it.

Consistency is better than what we currently have but it is not the ideal and never will be.

TessaBain

Social media should be run like a nightclub as far as I can tell

Keep the girls happy and the guys will show up

Why worry about what guys want?

alisonnatasha

My views on this have changed in recent years. I was originally in favor of very little moderation and allowing people to argue or block and move on. To a large extent I still lean towards little moderation when it comes to political and social issues, where people have differences of opinion.

Your example of black/white lives matter can be both consistent and even-handed if you have manual review to judge not just the words, but the context and likely intent, because that's what makes the difference in this example, right? One side is using a slogan for the cause of helping a disproportionately affected demographic achieve equal treatment and the other is either (a) misunderstanding the point, or (b) intentionally provoking/offending. The problem is that the social media companies are trying to make money, and the resources necessary for this level of review and moderation are not commercially viable. If they can't automate the moderation, they're way less interested in applying it.

Where my position has more recently changed is in an area that I think is way more problematic: the posting of objectively false or unverifiable information. This includes political misinformation, medical disinformation, conspiracy theories, and other false propaganda. Having seen the effects this type of social media activity can have on what I thought was a stable democracy, and how it can be leveraged by foreign powers and others intent on causing harm and disruption, I think there has to be moderation across mainstream social media platforms against false information.

I like the way some sites linked to fact checkers and/or flagged content as false or questionable, but it came a little late in many cases. I also like the idea of banning repeat offenders from those platforms.

The major problems as I see it are (1) there are some conspiracy theories that are just unfalsifiable, making them impossible to disprove even though there is no good reason to believe they're true, and (2) some of the people who have been convinced by false information are also convinced that the fact-checking organizations, the social media sites, the government, and almost every other entity most people consider credible are in collusion to lie to us, and that only their conspiracy sources are providing the truth.

woolvey

Evenhanded from the perspective of where society is, evenhanded from the perspective of what you find acceptable, or evenhanded from a word-for-word interpretation of what the rules say? Though that last one can also be interpreted through the lens of the first two, so it seems mostly redundant.
Because I'm pretty sure most people want the extremes to be banned and want an evenhanded approach; they just disagree on what the extremes are and what is 'wrong'.

All perspectives seem to have their failings. Because if you approach things from where society is, there will be groups that will try to change where things stand and stretch the window of what is 'acceptable'. That will make things more and more extreme on both sides.
And if you just stick to your own ideas and beliefs, you're basically stuck in your own bubble and you'll de facto be discriminating against people with other beliefs. Moreover, this too will create a divided society.
I don't really believe the last one is possible. If people, for example, define concepts or weigh things differently, you can't evenhandedly maintain those rules. But you go into this yourself in the video and I mostly agree with that. Suffice to say: it's an issue.

I think we as a society should establish a broad set of ideas which most people agree on, which don't hinder/damage society and try to approach things from there. So yes, we should be able to discriminate against specific ideas.
That broad set of ideas however doesn't exist -- at least not in practice. It's all just loose ideas that constantly keep clashing and that are constantly interpreted differently. I also don't really see this changing any time soon without any serious governmental intervention.

And sure, you can take a step back and take a very hands-off approach. That is much easier. However, you're then going to get in trouble in the long run. Because what do you think will happen if things go wrong (again), and you get some terrorist who got all his ideas off your platform? You think governments (and the public at large) are going to like that? I for one think this is not a maintainable strategy for a platform that has gone mainstream.

hjge

Neither white nor black lives matter - I'm a humanist 🤣

michaelnager