Following the major controversy surrounding Electronic Arts' practices involving Star Wars Battlefront 2's loot boxes and microtransactions, there's no doubt that most avid gamers are now acutely aware of the growing number of companies using insidiously manipulative tactics to goad players into spending even more money on games. With this being the case, many fans have been optimistic that publishers and developers will begin reducing the presence of loot boxes and microtransactions in future titles. However, a dump of newly leaked documents suggests that the situation could get much worse before it gets any better.
Just yesterday, a trove of pictures purportedly taken from an unnamed digital marketing firm's draft presentation was uploaded to Imgur, with slides detailing how developers can use both algorithms and artificial intelligence to prod players into purchasing microtransactions and thereby increase in-game revenue streams. According to the documents, the nameless "Data Broker" has refined the ability to have AI implement "persistent bait-and-switching" methods that can "alter a player's game as a whole," a practice it brazenly dubs "social engineering." The company also points to how AI can modify "the player's individual gameplay experience" in real time by way of "psychological manipulation tactics."
Additionally, the leaked documents go into an incredible amount of detail regarding how developers can use artificial intelligence to essentially spy on players and learn almost anything they want about them. Apparently, all of the information gathered on a person is then used to build a unique psychological profile designed to drive them toward microtransactions.
The information dump is chock full of ways in which advertisers can potentially vacuum up data on people and use it against them, such as siphoning data from mobile phones, using cameras to gauge facial cues and reactions, using Wi-Fi signals to 3D map homes, and more. However, one of the creepiest and most disturbing tidbits explains how AI can actually hear, interpret, and analyze audio cues picked up by smartphones and chat headsets. Whatever one says within earshot of a nearby mic can potentially serve as learning material for the AI, which can detect race, gender, mood, and even menstrual cycles. That information can then be used to influence players through emotional or logical appeals to buy microtransactions, or to persuade them to neglect their personal responsibilities and focus on the game.
Should these documents be legitimate and not the meticulous mock-up work of a truly dedicated troll, almost all of the details contained in the leak make it easy to feel upset, and perhaps even frightened, about how developers can wield complex algorithms and artificial intelligence systems as financial weapons to exploit people's minds into spending money. After all, one would be naive to think that these kinds of practices would remain exclusive to video game firms trying to wring every penny out of players. If companies like Activision and EA can use AI in these ways to successfully spur folks into spending their cash on cosmetics and loot boxes, it's not hard to imagine multiple geopolitical scenarios in which the technology is put to a much more malevolent use.