this post was submitted on 02 Sep 2023
Technology
you are viewing a single comment's thread
This is the best summary I could come up with:
Publisher Activision said the moderation tool, called ToxMod, which uses machine-learning technology, would be able to identify discriminatory language and harassment in real time.
Activision's chief technology officer Michael Vance said it would help make the game "a fun, fair and welcoming experience for all players".
The issue is exacerbated in popular multiplayer games due to the sheer number of players, with around 90 million people playing Call Of Duty each month.
Activision said its existing tools, including player reporting and automatic monitoring of text chat and offensive usernames, had already led to communications restrictions on one million accounts.
Call Of Duty's code of conduct bans bullying and harassment, including insults based on race, sexual orientation, gender identity, age, culture, faith, and country of origin.
Mr Vance said ToxMod allows the company's moderation efforts to be scaled up significantly by categorising toxic behaviour based on its severity, before a human decides whether action should be taken.
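The workflow described there, an ML model flags and ranks incidents by severity, and a human makes the final call, can be sketched as a simple priority queue. This is a hypothetical illustration only: the class names, severity labels, and scores below are invented, and ToxMod's real categories and thresholds are not public.

```python
from dataclasses import dataclass, field
import heapq

# Hypothetical severity scores for flagged voice-chat clips.
SEVERITY = {"slur": 3, "harassment": 2, "insult": 1}

@dataclass(order=True)
class Flag:
    priority: int
    clip_id: str = field(compare=False)
    label: str = field(compare=False)

class TriageQueue:
    """Orders ML-flagged clips by severity so human moderators review
    the worst offences first; the model itself never takes action."""
    def __init__(self):
        self._heap = []

    def flag(self, clip_id, label):
        # Negate the score so heapq (a min-heap) pops the highest severity first.
        heapq.heappush(self._heap, Flag(-SEVERITY[label], clip_id, label))

    def next_for_review(self):
        return heapq.heappop(self._heap) if self._heap else None

q = TriageQueue()
q.flag("clip-1", "insult")
q.flag("clip-2", "slur")
worst = q.next_for_review()
print(worst.clip_id, worst.label)  # clip-2 slur
```

The key design point matches the article: the automated side only sorts the human reviewers' queue, so scaling up detection does not mean automated punishment.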
The original article contains 357 words, the summary contains 160 words. Saved 55%. I'm a bot and I'm open source!