
How Gamers Exposed a Glaring Issue With Actors’ Likeness and AI Through ‘Fortnite’s’ Darth Vader
Creatives are increasingly using AI in storytelling. From films and TV shows to video games and online content, AI has become a key tool for many creatives, despite the controversies that arise from using technology to imitate human creativity. We’ve seen AI used to de-age an actor’s face or replicate their voice in specific roles, and video games have applied it similarly when adapting popular characters from movies.
Most recently, Fortnite introduced an AI Darth Vader NPC chatbot that players can talk to using in-game speech. Players can have lengthy discussions with Darth Vader, voiced by an AI recreation of James Earl Jones, about any number of topics, and the character has drawn a lot of online attention. However, Fortnite developer Epic Games was not entirely transparent about its intentions for the famous Star Wars villain when acquiring the rights, prompting a legal response from SAG-AFTRA.
Gamers Quickly Figured Out How To Make Darth Vader Offensive
After striking a deal with Jones’ estate, Epic Games used a mixture of archival audio and AI to generate a non-playable character (NPC) that can mimic his Vader voice, speaking any word in any language. While the estate did agree to let Epic use his likeness in Fortnite’s new Star Wars season, it could never have imagined how players would misuse it.
Initially, the Vader chatbot was a hilarious source of entertainment for Fortnite players. Its ability to generate real-time responses allowed players to interact with an iconic movie character like never before. However, things quickly turned sour after gamers tricked the AI into repeating hateful language. Clips began emerging online of Darth Vader using offensive words, profanities, and racial slurs, portraying the character in a very negative light, and Jones by association.
Epic responded quickly, releasing hotfixes that prevented the AI from being manipulated into repeating offensive words. It also established new parameters for Darth Vader’s behavior and what he could say, limiting his range of language. Still, Epic should have implemented proactive measures that prevented this from happening in the first place; its lack of foresight about how players might misuse the chatbot is alarming. While Epic never had any malicious intentions for the chatbot, the situation has broader implications for using AI in creative industries.
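For context on what "new parameters" of this kind can look like in practice, here is a minimal, purely hypothetical sketch of a blocklist-style output filter that screens a chatbot’s reply before it ever reaches the player. The names (`BLOCKED_TERMS`, `screen_reply`), the term list, and the fallback line are illustrative assumptions, not a description of Epic’s actual safeguards.

```python
# Purely illustrative sketch of a blocklist-style output guardrail for a
# character chatbot. This is NOT Epic's implementation; the term list,
# names, and fallback line are hypothetical placeholders.
import re

BLOCKED_TERMS = {"exampleslur", "exampleprofanity"}  # placeholder entries only
FALLBACK_LINE = "We will not speak of such things."

def screen_reply(reply: str) -> str:
    """Return the chatbot's reply only if it contains no blocked terms."""
    words = set(re.findall(r"[a-z']+", reply.lower()))
    if words & BLOCKED_TERMS:
        return FALLBACK_LINE
    return reply

print(screen_reply("The Force is strong with this one."))
```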
Epic Games Has Made Actor Likenesses Accessible
SAG-AFTRA has taken a strong stance on AI in recent years. Concerns around fair compensation for digital likenesses and the replacement of talent with technology have plagued the industry for several years, and AI makes it much harder for actors to earn a living, so the union quickly responded to Epic’s situation. It filed an unfair labor practice claim against the developer, alleging that Epic “refused to bargain in good faith with the union by making unilateral changes to terms and conditions of employment, without providing notice to the union or the opportunity to bargain.” Epic supposedly did not use Jones’ likeness rights in the way it initially claimed it would, setting a dangerous precedent for other film and gaming studios. Should Epic avoid any repercussions after this lack of transparency with the Screen Actors Guild, other studios could follow suit.
The Darth Vader case is the first of its kind. In the past, game developers have used actor likenesses in tandem with the actor’s own performance. CD Projekt Red used Keanu Reeves’ and Idris Elba’s likenesses in conjunction with their mo-cap performances in Cyberpunk 2077, a fairly common practice in gaming nowadays. In cinema, Lucasfilm uses FRAN (Face Re-Aging Network), an AI tool, to de-age actors like Mark Hamill in The Mandalorian. Once again, the technology works in tandem with the actor’s performance, whereas James Earl Jones’ likeness is entirely separate from the performer and used posthumously.
Giving Audiences Creative License Is Dangerous
Fortnite puts an actor’s likeness in the hands of players, and giving the audience control of a respected, well-known performer has dangerous implications. Even though his estate signed off on the rights, Jones’ family won’t want his likeness associated with offensive language and behavior that could tarnish his legacy. Generally, it’s best not to leave creative control to gamers or an online community. While this may sound cynical, gamers routinely exploit bugs and manipulate game mechanics to cheat. Roblox is an infamous example: players are given the creative liberty to develop any game or character they choose, leading to countless issues around inappropriate content. The problem isn’t exclusive to gamers, either; any audience can misuse creative AI tools.
Tay is another worrying example. In 2016, Microsoft released an AI chatbot on Twitter, and users promptly flooded it with hateful content, causing the chatbot to regurgitate the same sentiments. Individuals create this kind of content all the time; there’s an endless pool of online videos that use AI to copy a famous person’s likeness and place it in a different context. However, an individual manipulating an actor’s likeness isn’t as harmful as a corporation-backed AI. Not only does the misuse of a chatbot damage the reputation of companies like Epic Games and Microsoft, but a corporate product also puts that chatbot in front of a global audience. If this issue bleeds into the film industry, it could have serious ramifications.
Ultimately, studios need to understand the risks of allowing consumers to influence AI characters, particularly those based on a real person, and they must implement strict limitations when making chatbots or AI character recreations accessible to the public. The simple solution would be to hire a voice actor; James Earl Jones never performed for any Star Wars video games, so various voice actors have been mimicking his performance for decades with great success. However, the entire appeal of an AI chatbot is that there are no limits to the scope or topic of conversation. The user can tailor their NPC experience, which would be impossible for a live actor unless they recorded every known word in the English language. Hopefully, Fortnite’s current situation will serve as a case study for future creatives attempting similar projects, helping to prevent the misuse of actors’ likenesses moving forward.