Have you ever braved the chaotic realm of public voice chats in online shooters? If so, you’re well aware of the toxic environment that often prevails. Call of Duty, in particular, has gained notoriety for its vulgar and off-putting voice chat, which can ruin the gaming experience for many players. However, Activision is now taking a stand against this behavior by partnering with Modulate, a company that utilizes AI technology to monitor voice chats and identify bad behavior in games.
Introducing ToxMod: More Than Just a Word Filter
The AI-powered voice chat monitor, known as “ToxMod,” goes beyond simply filtering out hate speech, discriminatory language, and harassment. According to Modulate, ToxMod can discern tone, intent, and context in human speech. This means it can identify not only explicit language and rude behavior but also more subtle and insidious conduct, such as recruitment by online extremists or attempts to solicit sexual contact with minors. ToxMod has so far been used primarily in smaller VR titles like Among Us VR, which makes Call of Duty by far Modulate’s largest client to date.
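To make that distinction concrete, here is a rough, purely illustrative Python sketch. None of these names or fields come from Modulate; they simply contrast what a plain word filter does (match banned terms) with the kind of richer, context-aware flag a system like ToxMod is described as producing, where tone, intent, and surrounding context travel with the flag for later review.

# Hypothetical illustration only -- not Modulate's API. All names, fields,
# and scores below are invented to contrast a bare word filter with a
# context-aware flag.
from dataclasses import dataclass

BANNED_WORDS = {"slur_a", "slur_b"}  # placeholder terms


def word_filter(utterance: str) -> bool:
    """A regular word filter: flags only exact banned-word matches."""
    return any(word in utterance.lower().split() for word in BANNED_WORDS)


@dataclass
class ToxicityFlag:
    """What a context-aware flag might carry to human review (assumed fields)."""
    utterance: str
    category: str        # e.g. "harassment", "extremist_recruitment", "grooming"
    tone_score: float    # 0.0 (neutral) to 1.0 (hostile), per a hypothetical model
    intent_summary: str  # short description of the inferred intent
    context_window: list[str]  # surrounding lines that informed the decision


def context_aware_flag(utterance: str, context: list[str]) -> ToxicityFlag | None:
    """Stand-in for a model call; here it only demonstrates the richer output shape."""
    if word_filter(utterance):
        return ToxicityFlag(
            utterance=utterance,
            category="harassment",
            tone_score=0.9,
            intent_summary="Targeted insult aimed at another player",
            context_window=context[-3:],
        )
    return None  # a real system would also catch toxicity that uses no banned words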
Maintaining a Human Touch
Despite its advanced capabilities, ToxMod does not hand out punishments on its own. The system records and flags potentially problematic speech, but every flag still requires review by Activision’s human moderators. These individuals have the challenging task of determining whether the speech violates Call of Duty’s official Code of Conduct, which prohibits derogatory comments based on race, gender identity or expression, sexual orientation, age, culture, faith, mental or physical abilities, or country of origin. Once a violation is confirmed, moderator actions ranging from temporary suspensions to lifetime bans are taken against the offending players.
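For readers curious how that division of labor might look in code, here is a simplified, hypothetical Python sketch of the flow described above: the AI records and flags, a human confirms, and the penalty scales with repeat offenses. The enum values, thresholds, and function names are all invented for illustration and are not Activision’s or Modulate’s actual code.

# Hypothetical sketch of the flag-then-review flow described above.
from enum import Enum, auto


class ModeratorAction(Enum):
    NO_ACTION = auto()
    TEMPORARY_SUSPENSION = auto()
    LIFETIME_BAN = auto()


# Categories protected by the Call of Duty Code of Conduct, as listed above.
PROTECTED_CATEGORIES = {
    "race", "gender identity or expression", "sexual orientation", "age",
    "culture", "faith", "mental or physical abilities", "country of origin",
}


def review_flag(targeted_category: str, confirmed_by_moderator: bool,
                prior_offenses: int) -> ModeratorAction:
    """A human moderator makes the call; the AI only records and flags."""
    # An AI flag alone never triggers punishment, and neither does speech that
    # falls outside the Code of Conduct's protected categories.
    if not confirmed_by_moderator or targeted_category not in PROTECTED_CATEGORIES:
        return ModeratorAction.NO_ACTION
    # The escalation threshold is made up; the Code of Conduct only says actions
    # range from temporary suspensions to lifetime bans by severity and frequency.
    if prior_offenses >= 2:
        return ModeratorAction.LIFETIME_BAN
    return ModeratorAction.TEMPORARY_SUSPENSION


# Example: a confirmed repeat offender targeting a protected category.
print(review_flag("race", confirmed_by_moderator=True, prior_offenses=3))
# ModeratorAction.LIFETIME_BAN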
Current and Future Implementation
At present, ToxMod is being utilized in the multiplayer matches of Call of Duty: Modern Warfare II and Warzone as part of an initial beta phase. The full rollout of ToxMod is scheduled for the launch of Modern Warfare III in November. While the system currently monitors only English chats, Activision has plans to expand its language capabilities in the future, ensuring a safer and more inclusive gaming experience for players worldwide.
With the introduction of ToxMod, Activision is taking a firm stance against toxic behavior in Call of Duty voice chat. The goal is a more enjoyable and respectful gaming environment for all players, with AI surfacing potential violations and human moderators ensuring the Code of Conduct is upheld and that appropriate action is taken against those who break it.
Frequently Asked Questions

How does ToxMod differ from a regular word filter?
ToxMod is more than just a word filter. It has the ability to analyze tone, intent, and context in human speech, enabling it to identify not only explicit language but also more subtle forms of toxicity.
Who reviews the flagged speech recorded by ToxMod?
Activision’s human moderators are responsible for reviewing the speech flagged by ToxMod and determining whether it violates the Call of Duty Code of Conduct.
What are the consequences for violating the Code of Conduct?
Violations of the Code of Conduct can result in various punishments, ranging from temporary suspensions to lifetime bans, depending on the severity and frequency of the offenses.
Which Call of Duty games currently utilize ToxMod?
The beta phase of ToxMod is currently implemented in Call of Duty: Modern Warfare II and Warzone, with the full rollout planned for Modern Warfare III’s launch in November.
Will ToxMod monitor voice chats in languages other than English?
Although ToxMod currently monitors only English chats, Activision plans to expand its language capabilities in the future to cater to a wider range of players.