by Diya Barmecha
The internet has allowed people to hide behind a screen. This anonymity has given people, especially teenagers, a newfound confidence that they can do whatever they want without facing consequences. Knowing that usernames conceal their true identity in a game, players can be far nastier than they would be when meeting someone face to face. This confidence born of anonymity is also a main driver of cyberbullying. The language used in games has long been a problem: with live text chat and sometimes voice chat, players can talk to complete strangers.
This issue has sparked the interest of Amazon, a company always looking for a new venture. It is currently working on an algorithm that aims to pair such "toxic" players with like-minded people. Some players are comfortable with bad language, while others are not. Many would prefer to play with friends rather than complete strangers, largely because of the comfort of knowing that your friends are like you and, to some extent, think like you. This algorithm would aim to recreate that comfort among strangers.
Amazon filed a patent for this in December 2017; however, it was only approved in the past few months. The patent states, "One mechanism for dealing with such players is to isolate all 'toxic' players into a separate player pool, such that one toxic player is paired only with other toxic players."
This is similar to DOTA 2 (Defense of the Ancients 2), a MOBA title by Valve. In that game, players have a "behaviour score" attached to their other player information. This score fluctuates depending on a player's conduct in matches, and players with a low behaviour score are paired with others like them. The score is influenced by, among other things, how often a player uses words from a flagged list and other trigger markers.
One of the hardest parts of this would be defining a "toxic" player versus a "non-toxic" one. It is easy for a human to assess who could be compatible with whom; it is much harder for a computer algorithm to understand and implement this, because "toxic" behaviour is a very broad category. Some players find it toxic when someone quits mid-match, while others reserve the label for swearing and abusive language.
The patent further describes that players will be able to choose behaviour preferences so that their games go smoothly. For example, a player who swears a lot may be matched with other players who swear, whereas players who quit mid-match will be lobbied with other mid-match quitters. In doing so, Amazon is letting a computer label some players as toxic when they might not be as toxic in real life as they seem behind a screen. All in all, while this new venture seems exciting to Amazon, it may not be as attainable as it appears.
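The preference-based pooling the patent describes can be sketched as grouping players by the set of behaviours they declare they tolerate. The player names and preference labels below are invented for illustration; the patent does not specify an implementation.

```python
# Hypothetical sketch of preference-based lobby pooling: players who
# tolerate the same behaviours (e.g. swearing, mid-match quitting) are
# placed in the same pool. Labels and names are assumptions.

from collections import defaultdict

def group_by_preferences(players):
    """Pool players whose declared tolerated-behaviour sets match exactly."""
    pools = defaultdict(list)
    for name, tolerated in players:
        pools[frozenset(tolerated)].append(name)
    return dict(pools)

players = [
    ("alice", ["swearing"]),
    ("bob", ["swearing"]),
    ("cara", ["quitting"]),
    ("dev", []),          # tolerates nothing: matched with similar players
]
pools = group_by_preferences(players)
# alice and bob end up in the same pool; cara and dev get their own pools
```

Exact-set matching is the simplest possible rule; a production matchmaker would more likely use similarity scores and relax constraints as queue times grow.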