Call of Duty begins using AI tools to curb toxic voice chat

4 September 2023

In online gaming, toxicity is a major issue, and voice chat is one of the worst offenders. Players can use voice chat to hurl insults, make threats, and even spread hate speech.

Activision, the publisher of the Call of Duty franchise, is taking steps to combat toxicity in voice chat. Call of Duty: Modern Warfare III will feature an AI-powered voice chat moderation tool called ToxMod, developed by Modulate.

ToxMod uses machine learning to identify toxic speech in real-time. This includes hate speech, harassment, bullying, and discrimination. When ToxMod detects toxic speech, it will submit a report to Activision, which will then take action against the offending player. This could include suspending or banning the player from the game.

The AI voice chat moderation tool began an initial beta rollout in North America on August 30. It will be rolled out worldwide (excluding Asia) on November 10, the same day Call of Duty: Modern Warfare III is released.

In addition to the AI voice chat moderation tool, Activision is taking other steps to combat toxicity in Call of Duty. These include:

  • Text-based filtering in 14 languages for in-game text (chat and usernames)
  • A robust in-game player reporting system
  • Education and awareness campaigns to teach players about toxicity and how to report it
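Activision has not published how its text filters work; as an illustration only, per-language text filtering of the kind described above is often implemented as normalisation plus a blocklist lookup. The word lists and languages below are placeholders, and a production filter (covering 14 languages) would also handle obfuscation such as leetspeak and spacing tricks:

```python
# Illustrative sketch of per-language text filtering for chat/usernames.
# Blocklists here are hypothetical placeholders, not Activision's actual lists.

BLOCKLISTS = {
    "en": {"badword"},
    "es": {"palabrota"},
}

def censor(message: str, lang: str) -> str:
    """Replace blocked words with asterisks for the given language."""
    blocked = BLOCKLISTS.get(lang, set())
    return " ".join(
        "*" * len(word) if word.lower().strip(".,!?") in blocked else word
        for word in message.split()
    )

print(censor("that was a badword move", "en"))  # that was a ******* move
```

A real filter would normalise more aggressively before lookup (case, accents, character substitutions), but the overall shape, tokenise, normalise, check against a per-language list, is the same.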

Activision says it is committed to providing a safe and inclusive environment for all Call of Duty players. The AI voice chat moderation tool is a significant step towards that goal.

How ToxMod Works

ToxMod uses machine learning to identify toxic speech in real-time. The tool first analyses the words and phrases used in voice chat. It then compares these words and phrases to a database of known toxic speech. If there is a match, ToxMod will flag the speech as toxic.
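Modulate has not published ToxMod's internals, and the real system uses machine-learning models on audio rather than plain string lookup. Purely to illustrate the matching step described above, here is a deliberately naive sketch with a hypothetical phrase database:

```python
# Illustrative sketch only: a naive phrase-matching flagger.
# ToxMod's actual pipeline uses ML models, not simple string matching.

KNOWN_TOXIC_PHRASES = {  # hypothetical database of known toxic speech
    "example slur",
    "example threat",
}

def flag_toxic(transcript: str) -> bool:
    """Return True if any known toxic phrase appears in the transcript."""
    text = transcript.lower()
    return any(phrase in text for phrase in KNOWN_TOXIC_PHRASES)

print(flag_toxic("gg, well played"))          # False
print(flag_toxic("you are an example slur"))  # True
```

In the real tool, a flagged match would then trigger a report to Activision for human review and possible enforcement, rather than an automatic penalty.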

ToxMod is still under development and is not yet perfect. It may sometimes generate false positives, identifying non-toxic speech as toxic. However, Activision says that ToxMod is constantly improving and becoming more accurate.

Benefits of ToxMod

The AI voice chat moderation tool has several benefits. It can create a safer and more inclusive environment for all Call of Duty players. Also, it can be used to reduce hate speech, harassment, and bullying on the game's servers.

ToxMod is also a cost-effective way to combat toxicity in online gaming. Developing and deploying an AI tool can be far cheaper than hiring enough human moderators to review voice chat at the same scale.


Activision's AI voice chat moderation technology will help make Call of Duty a safer and more welcoming community for all players. Although the programme is still under development, it has the potential to be extremely helpful in the battle against online gaming toxicity.


What is the difference between ToxMod and other AI voice chat moderation tools?

ToxMod is one of the first AI voice chat moderation tools deployed in a major video game, and Modulate, its developer, describes it as one of the most accurate tools available.

How does ToxMod handle false positives?

ToxMod is designed to minimise false positives. However, it is still possible for the tool to incorrectly identify non-toxic speech as toxic. If this happens, the player who was flagged can appeal the decision.

Is ToxMod available in other games?

ToxMod is currently only available in Call of Duty: Modern Warfare III. However, Activision says that it is considering making the tool available in other games in the future.
