- English
- Spanish
- Portuguese
- French
- German
- Italian
- Polish
- Dutch
- Swedish
- Mandarin and Cantonese
- Japanese
- Korean
- Russian
- Turkish
- Hindi
- Bengali
- Arabic
- Filipino and Tagalog
Can you imagine only training AI off of COD voice chat?
Well it's training to detect speech and presumably "intent", not to generate anything.
*shudders*
> everyone in COD about to create their own language. it'll be interesting how well this works

Now this would actually be incredible. People developing an entirely new language to get around moderation. There's already tons of gaming-specific lingo and terms; might as well go all the way.
Lol, I hope they don't use that data for training because then we are all doomed.
> All it'll take is a couple thousand misfire bans and this whole thing will collapse like a house of cards.

Why do people always assume these systems take automatic actions when they flat out say it's meant to auto-flag users?
My only slight worry: if a person says "Let's go kill that player", how does it know the difference between obviously talking about the game and an actual threat?
200 sps (slurs per second)
> this is why people dont talk in game chat anymore. I expect tons of bans of cod players lmfao

People don't talk in game because they're in party chats.
It would be way worse than that.
How exactly would you have human moderators listening in on every game?
I'm pretty sure this AI only understands English, and that's going to be a problem.

People speaking languages other than English might get falsely banned due to this AI.

Something like this can happen:

https://gamerant.com/apex-legends-japanese-players-ban-run-word/

Imagine if two Norwegian dudes are playing CoD and talking to each other. Then they say something the AI thinks is racist, even though it is not. That could easily lead to false bans.

There's also a chance that people with speech impediments might get falsely banned.

However, it seems the current system is just flagging at the moment, but knowing Activision they might eventually get rid of human moderators and automate their jobs with AI.

I don't know why people in this thread are saying "great" when the risk of players getting falsely banned is high.

There's no need to speculate. The site for the service they use literally lists all the languages it supports or partially supports:
Modulate
www.modulate.ai
Seems most are in beta (Spanish) or alpha (everything that isn't English or Spanish).
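The false-ban worry is easy to illustrate with a toy example. Below is a hypothetical naive keyword filter (the function name and word list are made up; this is not how Modulate's system actually works) showing how matching words without language or game context flags innocent speech, like the Apex Legends "run" incident linked above:

```python
# Hypothetical sketch of a naive keyword-based voice-chat filter.
# This is NOT Modulate's method; it only illustrates why word
# matching without language or game context causes false positives.

BLOCKLIST = ["die", "kill"]  # hypothetical English keyword list

def is_flagged(utterance: str) -> bool:
    """Flag an utterance if any blocklisted word appears in it."""
    words = utterance.lower().split()
    return any(term in words for term in BLOCKLIST)

# German for "give me the ammo" contains the article "die":
print(is_flagged("gib mir die munition"))   # True, a false positive

# A benign in-game callout is flagged too; the filter can't read intent:
print(is_flagged("kill that player on B"))  # True

print(is_flagged("nice shot"))              # False
```

A real system would need per-language models and conversational context to tell the German article "die" from an English threat, which is exactly why per-language support status matters.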
> Lol, I hope they don't use that data for training because then we are all doomed

They definitely should use it for training, just not for content generation. What better source would you have to detect toxic behavior?