Let’s face it: we’ve probably all encountered a foul-mouthed player in an online game shouting racist, hateful, or just plain obscene remarks. Let’s also fess up that most of us have, at one point or another, been in that person’s place as well, yelling at teammates or opponents for no other reason than because we can. I’ll admit to it, at least, and it’s a behavior I’ve never been proud of.
With toxicity in gaming becoming a topic of open dialogue, and more importantly a call to action to change it, we not only have the opportunity to look inward and modify our behavior as individuals, but also to change the ways in which our gaming environments encourage or allow that behavior.
However, wanting toxicity in gaming to change and actually doing something to change it are two completely different things. One is easy. The other requires time, patience, and planning, especially when you are also trying to grow a game and its community into a major e-sports contender, as Riot is with League of Legends.
For those unfamiliar with League of Legends, back in 2011, well before the conversation around gaming toxicity reached the mainstream attention it has today, Riot introduced the Tribunal system. Players who met certain in-game level requirements were given the option to help self-regulate the community by voting on whether a given player should be punished or pardoned for reported offenses. Using a combination of chat logs and other metrics, players made their decisions, or moved on to a different case if they couldn’t reach a verdict on a given situation. While not a perfect policing system, especially in the face of League of Legends’ rising popularity and user base, it was a start, and it gave Riot something to build on.
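To make that review flow concrete, here is a minimal sketch of how a crowd-voting case like the Tribunal’s might be modeled. The class names, the skip behavior, and the vote threshold are all my own assumptions for illustration, not Riot’s actual implementation.

    from collections import Counter
    from dataclasses import dataclass, field


    @dataclass
    class TribunalCase:
        case_id: int
        chat_logs: list[str]                  # evidence shown to reviewers
        votes: Counter = field(default_factory=Counter)

        def cast_vote(self, vote: str) -> None:
            # Reviewers who can't decide simply skip the case;
            # only "punish" and "pardon" votes are recorded.
            if vote in ("punish", "pardon"):
                self.votes[vote] += 1

        def verdict(self, min_votes: int = 20) -> str | None:
            # Resolve by simple majority once enough reviewers have weighed in.
            total = sum(self.votes.values())
            if total < min_votes:
                return None                   # case is still pending
            return "punish" if self.votes["punish"] > self.votes["pardon"] else "pardon"

A real system would weigh votes by reviewer accuracy, require far more evidence, and route punishments through a separate escalation path; the point here is only the punish/pardon/skip decision loop the Tribunal asked players to perform.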
Fast forward a few years, and Riot’s Lead Social Systems Designer Jeffrey Lin, known by his handle “Lyte,” reports that 95% of players have not had punishments like chat restrictions or bans placed upon them. While this sounds like a very positive number, one commenter aptly pointed out that in a game built largely around 5v5 matches, it means that, statistically, a toxic player will be present roughly every other game.
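That commenter’s math is easy to check as a back-of-the-envelope calculation, assuming the 5% figure applies uniformly across the player base and the ten players in a match are drawn independently, assumptions the report itself doesn’t spell out:

    # Rough check of the "one toxic player every other game" observation, assuming
    # 5% of players have been punished and 5v5 matchmaking draws players independently.
    punished_rate = 0.05
    players_per_match = 10                                          # 5v5

    expected_per_match = punished_rate * players_per_match          # 0.5
    p_at_least_one = 1 - (1 - punished_rate) ** players_per_match   # ~0.40

    print(f"Expected punished players per match: {expected_per_match:.2f}")
    print(f"Chance a match includes at least one: {p_at_least_one:.0%}")

That works out to half a punished player per match on average, or one every other game, and roughly a 40% chance that any given match includes at least one, which is why the 95% figure is less reassuring than it first sounds.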
Still, it is nice to see that somebody is making their community their problem. This got me thinking about why we don’t see this more often. Every system that requires accepting a Terms of Service can, and should, include a clause prohibiting violent, racist, sexist, or homophobic speech, and more importantly, act on those who violate it.
Even political debates have moderators to make sure things stay on point and civil. Why, then, would we expect gamers to exhibit a better sense of self-control, given the anonymity the internet provides? The onus of cleaning up the gaming community lies not only with us as individuals, but also with the companies and administrators who help run the systems we play and engage on.
Human beings are social creatures that adapt to their environment. If that environment is filled with the toxicity that has become all too frequent, and is still escalating, then others will adopt those behaviors in an attempt to assimilate and find belonging. For a good introductory look at this real psychological phenomenon, I’d recommend checking out this video series by Crash Course.
This idea of toxic gaming environments skewing what the general population deems acceptable was even acknowledged, specifically in regard to League of Legends, by Lin in an interview he gave to Kotaku, in which he explained:
“Our research shows that players do understand what’s crossing the line, and most players believe it’s unacceptable. The behavior we’re hoping to address with our systems is that some players understand what’s crossing the line and believe it is ok, because other games never punished it in the past.”
I’m not advocating that games become a completely PG environment, devoid of confrontation or adult language. What I am saying is that both the individuals who play and the companies that administer these online communities need to take equal responsibility for moderating acceptable behavior and for making it clear that toxicity will not be tolerated.