Hate and bullying drive one in four moderation actions on Minecraft servers, study finds

One in four moderation actions on three private servers for the popular video game Minecraft is in response to online hate and bullying, according to a study published today by the ADL (Anti-Defamation League) Center for Technology and Society, in collaboration with Take This, GamerSafer, and the Center on Terrorism, Extremism, and Counterterrorism at the Middlebury Institute.

“As with many online games, we found that a large number of Minecraft users experience hate speech and harassment while using the platform,” said Jonathan Greenblatt, CEO of ADL. “From this snapshot, it is clear that Minecraft and the gaming industry in general need to do more to ensure that their online spaces have strong community guidelines and that they give researchers access to more data and information on their servers.”

The study found that of all content that elicited a moderator response (such as a warning, mute, or ban), 16 percent was the result of harassment and 10 percent was the result of identity-based hate.

Throughout this study, ADL found that:

  • Many offenders are repeat offenders. Nearly a fifth of offending users had multiple actions taken against them during data collection.
  • Hate is more common in public chats than in private chats. Identity-based hate messages were 21 percent more common in public chats than in private chats.
  • Servers with detailed community guidelines were associated with more positive social spaces. Of the three servers reviewed, the server with the most extensive community guidelines and the highest moderator-to-player ratio had the lowest frequency of sexually explicit, hateful, and severely toxic behavior among users, suggesting the positive impact of strong guidelines.
  • Temporary bans proved effective at curbing bad behavior. Preliminary evidence shows that temporary bans are more effective than muting in reducing the rate of offensive behavior by the moderated player.
  • Hateful rhetoric is common in gaming spaces. The presence of slurs previously associated only with white nationalism and hate groups suggests the normalization of extremist language in gaming spaces.

A previous survey published by ADL last year revealed that extremist messaging continues to be a concern in online gaming: one in 10 young gamers and 8 percent of adult gamers were exposed to white supremacist ideologies in online multiplayer games.

To better address hate and bullying on its platform, the ADL recommends that Minecraft take the following steps:

  • Invest in strong community guidelines and content moderation efforts. Active and effective human moderation and community guidelines are critical to reducing sexually explicit, hateful, and severely toxic behavior in gaming spaces, as the server with the most staff and the most extensive guidelines had the fewest incidents of this type of behavior. Industry leaders must continue to invest in moderator training to better understand and respond to toxic behavior.
  • Increase researchers’ access to data. Without providing researchers with access to raw data, the games industry cannot identify and address the challenges of hateful, harassing, and toxic behavior.
  • Conduct additional research on content moderation and complementary tools and techniques. Moderator intervention appears to reduce harmful behavior in the short term and at the individual level, but it is unclear if this is sustained over time and across the server. Future research should focus on determining the long-term and aggregate effects of moderator intervention.
  • Standardize reporting categories. To better understand the frequency and nature of hate in online spaces, we recommend an industry-wide standardization of moderation reporting, including defined categories and offenses with clear descriptions. This would help facilitate future research, particularly with regard to documenting how moderation actions change user behavior over time. ADL's Disruption and Harms in Online Gaming framework could be used as a basis for this effort.

Building on ADL’s century of experience in building a world without hate, the Center for Technology and Society (CTS) serves as a resource for technology platforms and develops proactive solutions to combat hate both online and offline. CTS works at the intersection of technology and civil rights through education, research, and advocacy.

