In my article Innovation Trends 2023 in the Entertainment Industry last month, I predicted more “AI-created content”. The latest AI-generated project to catch my eye is a Seinfeld spoof that recently popped up on Twitch and earned thousands of fans in just a few days. The “TV show”, called Nothing, Forever, mimics the format of the 1990s sitcom Seinfeld, in which a group of friends discuss the goings-on of their lives. In its heyday, the popular sitcom was known as “the show about nothing”, and the AI-generated web show pays homage to this with its name. So, will AI-generated TV shows be the future of entertainment? Well, AI is already helping to create and produce movies and TV shows all over Hollywood.
AI and Hollywood
Hollywood has always been at the forefront of technological advancements, and the integration of Artificial Intelligence (AI) into the entertainment industry is no exception. From scriptwriting and pre-production to post-production and special effects, AI is transforming the way movies and TV shows are created and distributed. With the ability to analyze vast amounts of data and make predictions, AI is helping filmmakers and producers make informed decisions, resulting in more engaging and captivating content. This article will explore the impact of AI on Hollywood, and how it is changing the way we experience and consume entertainment. Whether it’s through advanced special effects or personalized recommendations, AI is changing the way we interact with the big and small screens.
I let ChatGPT write this “intro for article about AI and Hollywood”, and it was spot on. One of the aspects I would like to dive into is, in fact, advanced special effects.
The Uncanny Valley Effect
In 2008, I finished my thesis on “Animated Soap – a new Genre?” in cooperation with Europe’s leading daily drama producer, UFA Serial Drama. It revolved around one question: could we produce a daily soap opera in a video game? My hypothesis was based on what was happening with machinima at the time. “Machinima” (machine + cinema) is the use of real-time computer graphics engines to create cinematic productions.
Machinima appeared on TV, starting with the series Portal on the video game-oriented channel G4. During its two-season run from 2002 to 2004, it was one of the most popular shows on the network. Written, produced, and hosted by Dave Meinstein, Portal was a sketch comedy and news show focused on the genre of massively multiplayer online games (MMOs), blending satire and stylistic elements. MTV’s Video Mods in 2003 re-created music videos using characters from video games such as The Sims and Need for Speed: Most Wanted. And in 2006, Blizzard Entertainment helped to produce part of Make Love, Not Warcraft, an Emmy-Award-winning episode of the animated comedy series South Park, inside its massively multiplayer online role-playing game (MMORPG) World of Warcraft. One year later, in 2007, HBO became the first TV network to buy a work created entirely in a virtual world, purchasing the broadcast rights to Douglas Gayeton’s machinima documentary Molotov Alva and His Search for the Creator.
Even though there were plenty of successful examples at the time, my conclusion for the soap opera genre was that, because of the costs and the uncanny valley effect, it wouldn’t be possible. The “uncanny valley” is a concept by Japanese robotics professor Masahiro Mori suggesting that humanoid objects which imperfectly resemble actual human beings provoke uncanny, strangely familiar feelings of uneasiness and revulsion in observers. In other words, the uncanny valley is the region of negative emotional response towards robots that seem “almost” human. And since soap operas are mainly about drama and emotions, the uncanny valley was an issue for an animated soap opera.
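Mori described the valley qualitatively, as a curve of emotional affinity against human likeness that rises, collapses sharply just before full realism, and then recovers. The function below is purely my own illustrative sketch of that shape (not Mori’s actual data or model): a linear rise with a Gaussian dip centered at an arbitrarily chosen 85% likeness.

```python
import math

def affinity(likeness: float) -> float:
    """Toy uncanny-valley curve (illustrative only, not Mori's model).

    Affinity rises roughly linearly with human likeness, but a
    Gaussian 'valley' centered near 85% likeness models the sharp
    drop in comfort for almost-but-not-quite-human characters.
    """
    valley = 2.0 * math.exp(-((likeness - 0.85) ** 2) / (2 * 0.05 ** 2))
    return likeness - valley

# A stylized cartoon (50% likeness) feels fine, an almost-human
# character (85%) falls into the valley, and a near-perfect replica
# (100%) climbs back out:
for x in (0.5, 0.85, 1.0):
    print(f"{x:.0%} likeness -> affinity {affinity(x):+.2f}")
```

The shape, not the numbers, is the point: an animated soap opera aiming for realistic human drama would have been operating exactly in that dip.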
AI + VFX = Vanity AI
However, 15 years later, Matt Panousis, COO of Monsters Aliens Robots Zombies (MARZ) – the best company name in Hollywood, in my opinion – suggests we’re two years away from overcoming the uncanny valley effect.
His company has developed a VFX tool called Vanity AI that has been used by the MARZ internal VFX team on over 27 Hollywood productions, saving clients roughly $8 million in costs and shaving nearly 100 weeks off production schedules. The technology was used to deliver work for shows including Wednesday, Spider-Man: No Way Home, Stranger Things (S4), Gaslit, First Ladies, Being the Ricardos, and others for Hollywood studios such as Marvel, Disney, Apple TV, Netflix, AMC, and NBC Universal. It’s a production-ready solution that empowers VFX teams and Hollywood to deliver large volumes of high-end 2D aging, de-aging, cosmetic, wig, and prosthetic fixes.
What I found interesting is that he said on The Town podcast with Matt Belloni that we’re two years away from bringing deceased actors such as James Dean back to the screen as digital versions, while Vanity AI can already rejuvenate actors – allegedly Tom Cruise in Top Gun: Maverick – by 10-20 years at low cost today.
Another AI company in this space called Metaphysic recently revealed that it has entered a strategic partnership with CAA to develop generative AI tools and services for talent. In a statement, Joanna Popper, chief metaverse officer at CAA, said the agency looks forward to working with Metaphysic, as its tech “combined with their ethics-first approach and thought leadership” could “unlock an incredible opportunity for the entertainment industry and beyond.”
Metaphysic is already known for the parody TikTok account @DeepTomCruise and the deepfakes featured on America’s Got Talent. And now their new generative AI-driven tool Metaphysic Live will be used on Tom Hanks and Robin Wright, along with additional cast members, to de-age them for Robert Zemeckis’s upcoming Miramax movie Here, an adaptation of Richard McGuire’s graphic novel that is set in a single room and follows its inhabitants over many years.
The Metaphysic Live tool can be used to create high-resolution photorealistic faceswaps and de-aging effects on top of actors’ performances live and in real time without the need for further compositing or VFX work.
What does that all mean for the future of actors? Will AI eventually replace actors? The answer is not a simple yes or no.
Will AI replace Actors?
On one hand, AI has the potential to revolutionize the way special effects are created in movies and TV shows, enabling filmmakers to create realistic digital characters that can be used in place of human actors. This could reduce the need for actors to perform physically demanding stunts or scenes in dangerous conditions. In addition, AI can also be used to create virtual doubles of actors, allowing filmmakers to create shots that would otherwise be impossible to film.
On the other hand, AI is still in its early stages and lacks the emotional depth and nuance that human actors bring to their performances. The art of acting involves more than just physically performing a scene; it also involves bringing emotions and depth to a character. This is something that AI has yet to master, and it’s unlikely that it will be able to replicate the level of emotional intelligence and human connection that actors bring to their performances anytime soon.
While AI may be able to enhance and augment the role of actors in movies and TV shows, it is unlikely to replace them entirely – even if we overcome the uncanny valley effect. Human actors will always be an important part of the entertainment industry, and their unique ability to connect with audiences and bring characters to life will continue to be valued and sought after. However, with its ability to enhance the viewing experience and improve the efficiency of the production process, AI is poised to play an even more significant role in the entertainment industry in the years to come. In the end, AI is not just changing the face of Hollywood (actors) but also the way we create, produce, and interact with movies and TV shows.