
To Combat Toxicity, Game Developers Are Censoring Users — Should They?

Discussing the elephant in the game chat.
September 1, 2018
6 mins read

Doritos. Fourteen-year-old kids. Screaming. Banging your mom. These are images typically associated with the Xbox Live and “Call of Duty” communities. The truth is, though, that these visuals could describe any game with an online mode or any gaming platform. In recent years, online functionality has become so fundamental to the video game experience that most major releases now ship with an online multiplayer mode or are strictly online multiplayer experiences.

“Call of Duty: Black Ops 4,” for example, which is scheduled to hit shelves in October, will completely eschew any single-player campaign. So, with the gaming market pushing for everyone to play together, it’s become more important than ever that everyone plays nice, which has been easier said than done.

Indeed, the gaming community has developed a noxious reputation for tactlessness, and the lawless nature of game chats has only exacerbated that tendency. If you are, say, a black man like me and have ever been called a “nigger,” chances are the encounter occurred during an online gaming session. Even if you’re not black, a deep enough voice means you’ve probably been called one, too. It goes without saying, of course, that the harassment is 10 times worse if the other players decide you’re a woman, whether you are or not. So, what can we do about this abuse?

In a perfect world, because most online games come with a “report player” function, every player would use the feature properly and games would be played in harmony. In reality, many game publishers are slow to act on reports, so players learn that reporting accomplishes little and stop bothering. As a result, online games, especially competitive ones, become home to unsportsmanlike behavior and vulgar language. The toxicity can become so unbearable that users stop playing competitively, only play with friends or drop the game altogether. It’s a sad reality, but for years it’s been the status quo.

Recently, however, some game publishers have begun taking a more proactive approach to improving their gaming climate. The popular team-based first-person shooter “Overwatch” became an instant hit when Blizzard released it in 2016, but once it added a competitive mode, the game became the new face of toxicity in just a few months. In January 2018, however, game director Jeff Kaplan released a developer update saying that Blizzard had been able to ban more accounts thanks to players using the report function more frequently.

Kaplan also said that Blizzard would actively search social media platforms, such as YouTube and Twitter, for toxic behavior, trace offenders back to their Blizzard accounts and ban them. While banning players for exhibiting toxic behavior outside the game sounds like a problematic practice, the “Overwatch” community was so bad that the procedure was largely welcomed; so far, no major “invasion of privacy” scandals have arisen.

Nor is “Overwatch” the only popular online shooter to crack down on its less friendly players. “Tom Clancy’s Rainbow Six: Siege” has had toxicity problems since the game became popular, about a year or so after its initial launch. On console, rampant teamkilling makes casual mode basically unplayable unless you’re in a squad, and the report function does little to stop it. On PC the teamkilling is less severe, but there players can use text chat to hurl racist and homophobic slurs at the enemy team.

In July, Ubisoft activated a chat filter that bans players who type words it deems inappropriate, including terms like “nibba.” The filter is active in all game modes, and first-time offenders are banned for a half-hour. The measure received backlash for not warning players before banning them, which became an issue because players would trick others into typing a banned word to get them kicked out mid-game. Overall, though, the change was largely well received, partly because players were too busy complaining about updates to the game itself and partly because most people don’t enjoy seeing hate speech when they’re playing video games. Who would’ve thought?
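To make that exploit concrete, here is a minimal sketch, in Python, of how a naive chat filter of this kind might work. The word list, ban bookkeeping and function names are illustrative assumptions, not Ubisoft’s actual implementation; the half-hour first-offense ban is the only detail taken from the reporting above.

    import time

    # Illustrative stand-in list; the real filter's word list is not public.
    BANNED_WORDS = {"nibba"}

    FIRST_OFFENSE_BAN = 30 * 60  # first offense: a half-hour ban

    # player name -> epoch time at which that player's temporary ban expires
    ban_expiry: dict[str, float] = {}

    def is_banned(player: str) -> bool:
        """True while a player's temporary ban is still in effect."""
        return ban_expiry.get(player, 0.0) > time.time()

    def submit_chat(player: str, message: str) -> bool:
        """Deliver the message if it is clean; otherwise ban the sender.

        The filter only looks at who typed the banned word, which is
        exactly the weakness described above: a player tricked into
        repeating a banned term gets banned mid-game, while the
        provocateur goes unpunished.
        """
        if any(word in BANNED_WORDS for word in message.lower().split()):
            ban_expiry[player] = time.time() + FIRST_OFFENSE_BAN
            return False
        return True

A filter like this punishes the typist, not the instigator, which is why warning players before the ban takes effect, as critics demanded, matters more than it might first appear.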

Gaming companies have to be careful how they handle toxicity, and they have to remain open to criticism about the systems they implement. However, if gaming is to become a space in which people from all walks of life interact, gamers need to learn that they can’t just say whatever they want. If they need the gaming companies to teach them that lesson through censorship, then so be it.

Christian Nelson, Eastern Michigan University
