Wikipedia plans to crack down on harassment and other “toxic” behavior with a new code of conduct and moderation process. On Friday, the Wikimedia Foundation Board of Trustees, which oversees Wikipedia among other projects, voted to adopt a more formal moderation process, the details of which will be drafted later this year. Until then, the foundation will enforce stopgap anti-harassment policies.
The board said that “harassment, toxic behavior, and incivility in the Wikimedia movement are contrary to our shared values and detrimental to our vision and mission.”
The board said it has not made enough progress toward creating a welcoming, inclusive, harassment-free space where people can contribute productively and debate constructively.
The board gave the foundation four specific directives: draft a binding minimum set of behavioral standards for its platforms, with input from the community; ban, sanction, or otherwise limit the access of people who break that code; create a review process that involves the community; and significantly increase support for and collaboration with community functionaries during moderation.
Beyond those directives, the foundation is also supposed to put more resources, including more staff and better training tools, into its Trust and Safety team.
The board’s stated goal is to develop sustainable practices and tools that eliminate harassment, toxicity, and incivility; promote inclusivity; cultivate respectful discourse; reduce harm to participants; protect the projects from misinformation and bad actors; and promote trust in Wikimedia projects.
Wikimedia Hasn’t Done Enough to Fight Harassment
Wikipedia’s volunteer community is highly dedicated but can be intensely combative, launching editing wars over controversial topics and enforcing editorial standards so harshly that new users may be driven away.
The foundation has cited harassment as one factor behind its relative lack of female and gender-nonconforming editors, who have complained of being singled out for abuse.
The project grew out of a freewheeling, community-focused ethos, and many users object to the kind of top-down enforcement found on commercial web platforms.
These problems came to a head last year when the foundation suspended a respected but abrasive editor whom other users had accused of relentless harassment. The intervention bypassed Wikipedia’s normal community arbitration process, and several administrators resigned in the ensuing backlash.
The board of trustees didn’t mention that controversy, saying only that the vote formalizes long-standing efforts by individual volunteers, Wikimedia affiliates, foundation staff, and others to stop harassment and promote inclusivity on its projects.
On a discussion page, however, one editor cited the suspension to argue that the foundation shouldn’t interfere with Wikipedia’s community moderation, while others said a formal code of conduct would have reduced the widespread confusion and hostility around the incident.
Wikipedia has become one of the most widely trusted platforms on the internet. YouTube, for example, uses Wikipedia pages to rebut conspiracy videos.
That has raised the stakes and created a huge incentive for disinformation artists to target the site, and Friday’s vote suggests the Wikimedia Foundation will take a more active role in moderating the platform.