A team at Riot Games, the maker of League of Legends, paired up with cross-disciplinary scientists and game designers to study the League of Legends community and the toxicity that sometimes blooms there. They discovered some interesting, and telling, things about online interactions and how to combat unpleasantness in online communities.
League of Legends has 67 million players overall. As we all know, even one negative interaction or abusive individual can make an online experience pretty unpleasant, especially if no one comes to your defense. The team's most interesting finding, to me anyway, was that toxicity didn't often come from individuals who'd previously been identified as abusive. The researchers found that the vast majority of negative behavior came from neutral or positive-ranked players rather than from the “persistently negative citizens.” Some of these players may have just had a bad day; others may have been set off by someone else's negativity.
The team realized that solving this problem wouldn't be as easy as simply banning persistently negative players. Pairing negative players exclusively with one another wouldn't work either. They would have to change the entire community, including expectations about what was normal and acceptable.
This entailed increasing the responsiveness and clarity of feedback when players’ behaviors were flagged as unacceptable. They also created publicly accessible case files for individual players who were flagged. That allowed the community to discuss what they were willing to put up with. On top of that, players were able to make note of positive actions—called “honoring” other players. Every time a positive or negative interaction was noted, the machine-learning system would recognize it. Most importantly, interactions were rated by other players, rather than on the basis of a set of rules. The community was in charge of agreeing on what they found acceptable or unacceptable in their own society.
“It turns out that people just need a voice, a way to enact change,” said Jeffrey Lin, the lead game designer of social systems at Riot Games. “As a result of these governance systems changing online cultural norms, incidences of homophobia, sexism, and racism in League of Legends have fallen to a combined 2% of all games. Verbal abuse has dropped by more than 40%, and 91.6% of negative players change their act and never commit another offense after just one reported penalty.”
“In the office, I still have a copy of a letter a boy wrote me after receiving in-game feedback from his peers about his use of racial slurs. [The boy wrote,] ‘Dr. Lyte, this is the first time someone told me that you should not say the “N” word online. I am sorry and I will never say it again.’”
Lin remembers forwarding the letter to the entire team, “because this was the moment we realized that we had started a journey that would end beyond games.”