In its 18 years of existence, Wikipedia has become a fixture in our lives: It sits at the top of Google’s search results and provides answers to the questions we ask Alexa and Siri.
For Wikipedia’s editing community, the website is even more. It is a kind of social network where users debate the minutiae of history and modern life, climb the editorial hierarchy and even meet friends and romantic partners.
It is also a place where editors can experience relentless harassment. In 2016, Pax Ahimsa Gethen, a trans male Wikipedian, was persistently hit with personal attacks over several months. Mx. Gethen, 49, who uses the pronouns “they” and “them,” said the anonymous harasser posted that they were “insufferable” and “unloved,” that they belonged in an internment camp and that they should kill themself.
Mx. Gethen said the user also publicly posted their deadname, the name they used before transitioning.
“It was devastating to me because of the personal nature of it,” Mx. Gethen said.
Unlike at social networks such as Facebook and Twitter, the people who handle reports of harassment on Wikipedia are largely unpaid volunteers.
In response to complaints about pervasive harassment, the Wikimedia Foundation, the San Francisco-based nonprofit that operates Wikipedia and supports its community of volunteers, has promised new strategies to curb abuse. In recent months, the foundation has rolled out a more sophisticated blocking tool that it hopes can better control the harassment plaguing some users.
Sydney Poore, a community health strategist with the foundation, said that when the free encyclopedia was established in 2001, it initially attracted lots of editors who were “tech-oriented” men. That led to a culture that was not always accepting of outside opinions, said Ms. Poore, who has edited Wikipedia for 13 years.
“We’re making strong efforts to reverse that,” she said, “but it doesn’t happen overnight.”
A ‘barrier’ to gender equity
A few informed clicks on any Wikipedia article can reveal the lengthy discussions that shape a published narrative. According to interviews with Wikipedians around the world, those digital back rooms are where harassment often begins. A spirited debate over a detail in an article can spiral into one user spewing personal attacks against another.
“If you out yourself as a feminist or L.G.B.T., you will tend to be more targeted,” said Natacha Rault, a Wikipedia editor who lives in Geneva and founded a project that aims to reduce the gender gap on the website.
On French-language Wikipedia, where Ms. Rault does much of her editing, discussions about gender can often spark vitriol. Ms. Rault said there were six months of heated debate about whether to label the article on Britain’s leader, Theresa May, with the feminine version of “prime minister” (première ministre), rather than the masculine one (premier ministre).
Another controversy has been simmering over the article “femme,” the French word for woman, Ms. Rault said. At issue is whether the first paragraph should refer to gender in addition to biological sex, and whether transgender women should be included in the definition of woman. This debate devolved into an “edit war,” a heated back-and-forth in which Wikipedians continuously edit an article to overwrite the other side’s changes and reflect the language they want.
“Sometimes it can be so aggressive that you give up and run away from the article,” she said.
The idea that cisgender women and transgender editors could be repelled from Wikipedia by online abuse is of serious concern to the Wikimedia Foundation.
On its website, the foundation lists pervasive harassment as a barrier to gender equity. Sometimes, the harassment is explicitly sexual: According to anonymous interviews described by the foundation, users have had pornography posted on their personal Wikipedia userpage and emailed to them. Camelia Boban, an editor on Italian-language Wikipedia, said another user once publicly used language to suggest she was a prostitute.
Studies on Wikipedia’s contributor base from several years ago estimated that fewer than 20 percent of editors were women. This research backed up an existing awareness in the Wikipedia community that female editors were seriously underrepresented, galvanizing activists who set out to recruit more women to write and edit articles.
Groups like Art+Feminism were established to increase the representation of women and nonbinary individuals on Wikipedia. Its organizers held sessions in which experienced editors taught aspiring ones the ways of Wikipedia, explaining how to navigate a website where editors sometimes appear to be communicating in code.
Wikipedians also began to discuss the “content gender gap,” which includes an imbalance in the gender distribution of biographies on the site. The latest analysis, released this month, said about 18 percent of 1.6 million biographies on the English-language Wikipedia were of women. That is up from about 15 percent in 2014, partially because of activists trying to move the needle.
The perils for L.G.B.T. editors
Claudia Lo, a foundation researcher on a team that is building anti-harassment tools, said there was a pattern of harassment on Wikipedia stemming from debates over L.G.B.T. issues. When a celebrity comes out as transgender — notably, Chelsea Manning in 2013 and Caitlyn Jenner in 2015 — Wikipedians have extensively debated whether the individual’s self-declared pronouns should be used.
Articles about transgender or nonbinary individuals are often subject to vandals who revert their pronouns back to their gender assigned at birth. But Wikipedia’s guidelines make clear that editors should use the gender that the subject of the article most recently stated in a dependable source.
In countries where it is more dangerous for L.G.B.T. individuals to be open about their identities, harassment on Wikipedia can be particularly virulent. Once, an administrator on a Wikipedia page blocked an editor simply because their username suggested that the editor could be gay, said Rachel Wexelbaum, a Wikipedian who works to improve L.G.B.T. content on the website. Eventually, she said, Wikimedia’s Trust and Safety Team got involved, and the administrator was blocked for those actions.
In some spaces, the environment for L.G.B.T. users has improved. Amir Sarabadani, a 25-year-old Iranian software developer with Wikimedia Foundation’s chapter in Germany, said that in his 12 years of editing Persian-language Wikipedia, users were often hostile while editing articles related to homosexuality.
About six years ago, Mr. Sarabadani started talking openly about being gay in conversations with other Wikipedians. He said other editors often accused him of having a “homosexual agenda,” and anonymous users posted lewd images of male genitals on his userpage.
But Mr. Sarabadani, who left Tehran for Berlin two years ago, said he thinks his work as an administrator on Persian-language Wikipedia has made the community of editors there less tolerant of abuse.
“It was easy to say things and get away with that,” he said. “Now if someone says something homophobic, they’ll get blocked for several months.”
Identifying and punishing harassers
On English-language Wikipedia, one of roughly 300 languages with its own site, users are asked to report conflicts with one another on online notice boards, where they are expected to post links to abuse so an administrator can decide whether to take action.
That method problematically forces complaints into the public sphere, said Ms. Lo, the foundation researcher.
“If you’re being harassed,” Ms. Lo said, “the last thing you want to do is tell your harasser that you’re about to tell on them.”
In some cases, situations are handled directly by Wikimedia Foundation staffers or resolved through private emails with volunteer administrators, she said.
If a volunteer administrator finds the allegations of abuse credible, the user can be banned from editing anywhere on the site. Administrators for some Wikipedias, such as the English-language site, can also declare a “topic ban,” a socially enforced measure in which other editors are responsible for making sure the offending user does not edit articles that mention prohibited subjects. Violating a topic ban can result in a sitewide ban.
The new tool, which the foundation calls “partial blocks,” allows administrators to restrict users from editing particular pages on which they have proved to be a problem. Those developing the tool hope it will be used more liberally to block editors from specific topics without entirely banning users who are productive in other areas of the site.
“The idea is to provide volunteer administrators with a more targeted, more nuanced ability to respond to conflicts,” Ms. Lo said.
Partial blocks are active on five Wikipedias, including those in Italian and Arabic, and foundation staffers expect the tool to be introduced to English-language Wikipedia this year. The foundation is also in the early stages of developing a private system through which users could report harassment, Ms. Lo said.
But there are limits to how effective institutional change can be in curbing harassment on Wikipedia. In the case of Mx. Gethen, their harasser kept posting from different IP addresses, making it difficult for a blocking tool to be effective.
Although the abuser no longer haunts their internet presence, Mx. Gethen said the sometimes hostile culture on Wikipedia has reduced their editing on the site.
“I’m not getting paid for this,” they said. “Why should I volunteer my time to be abused?”