Hacker News

Well, I prefer to be skeptical of any corporation, non-profit or not, until it proves otherwise with substantial transparency about its methods of moderation and control.

There is a lack of transparency on Wikipedia. The rules are nebulous and prone to abuse by veteran users and the oligarchs congregating on political articles.

Hold on, their moderation methods are as transparent as could possibly be. Every article has a dedicated page where every decision has a reason and more often than not an overwhelming amount of discussion. Their overall policy is similarly debated publicly.

Is it overwhelming? Oh yes. Tough to change? Probably also yes, without dedication and sound reasoning. But opaque? It certainly doesn't fail that criterion.

It certainly becomes opaque when it is a labyrinth of links and documents that you need to read and follow. It does not help that these same rules can be abused to death by veteran users.

At a certain point, no one really knows the devil's dance happening at the top of the moderation ladder, and you end up wasting a lot of your life on these dead talk pages.

It is a bureaucratic nightmare.


I don't really see how it's possible to be transparent about decisions for something this large without the paper trail being somewhat complex to follow. If enough of a discussion is written down, the record is going to be complex. If the discussions aren't recorded in a publicly accessible way, that's clearly even less transparent. And if the scope of the discussions hadn't scaled in proportion to the amount of content being discussed, most decisions would not have been discussed at all. Either individuals would make them on their own, which is not particularly transparent since there's no visibility into how they reached those decisions, or the decisions would be automated, which is at best only as transparent as putting the human who implemented the automation directly in charge, and in practice is often the least transparent option of all, because automated moderation usually relies on ML or something similar.


