The $50 Million Movie ‘Here’ De-Aged Tom Hanks With Generative AI

On Friday, TriStar Pictures released Here, a $50 million Robert Zemeckis-directed film that used real-time generative AI face transformation techniques to portray actors Tom Hanks and Robin Wright across a 60-year span, marking one of Hollywood’s first full-length features built around AI-powered visual effects.

The film adapts a 2014 graphic novel set primarily in a New Jersey living room across multiple time periods. Rather than cast different actors for various ages, the production used AI to modify Hanks’ and Wright’s appearances throughout.

The de-aging technology comes from Metaphysic, a visual effects company that creates real-time face-swapping and aging effects. During filming, the crew watched two monitors simultaneously: one showing the actors’ actual appearances and another displaying them at whatever age the scene required.

Metaphysic developed the facial modification system by training custom machine-learning models on frames of Hanks’ and Wright’s previous films. This included a large dataset of facial movements, skin textures, and appearances under varied lighting conditions and camera angles. The resulting models can generate instant face transformations without the months of manual post-production work traditional CGI requires.

Unlike previous aging effects that relied on frame-by-frame manipulation, Metaphysic’s approach generates transformations instantly by analyzing facial landmarks and mapping them to trained age variations.
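To make the landmark-mapping idea concrete, here is a deliberately toy sketch in Python. It is purely illustrative and does not reflect Metaphysic’s actual system: the landmark positions and per-age displacements below are hand-written stand-ins for what a real pipeline would detect from video frames and learn from a trained model.

```python
import numpy as np

# Toy illustration of landmark-based age transformation.
# NOT Metaphysic's system: real pipelines detect hundreds of 3D landmarks
# per frame and learn age-conditioned mappings from large datasets.

# Hypothetical 2D landmarks for a neutral face (normalized coordinates).
landmarks = np.array([
    [0.30, 0.40],  # left eye
    [0.70, 0.40],  # right eye
    [0.50, 0.60],  # nose tip
    [0.50, 0.80],  # mouth center
])

# Hypothetical learned displacement fields per target age. In a real
# system these would come from a model trained on archival footage;
# here they are hand-written numbers purely for illustration.
age_displacements = {
    30: np.zeros((4, 2)),       # reference age: no change
    70: np.array([
        [0.00, 0.02],           # eyes drift slightly lower
        [0.00, 0.02],
        [0.00, 0.01],
        [0.00, 0.03],           # mouth sags a bit
    ]),
}

def transform_landmarks(points: np.ndarray, target_age: int) -> np.ndarray:
    """Map neutral landmarks to their positions at the target age."""
    return points + age_displacements[target_age]

aged = transform_landmarks(landmarks, 70)
```

Because the per-age mapping is a precomputed lookup plus cheap arithmetic rather than frame-by-frame manual retouching, the same structure hints at why such systems can run in real time on set.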

“You couldn’t have made this movie three years ago,” Zemeckis told The New York Times in a detailed feature about the film. Traditional visual effects for this level of face modification would reportedly require hundreds of artists and a substantially larger budget, closer to that of a standard Marvel movie.

Here isn’t the first film to use AI techniques to de-age actors. ILM’s approach to de-aging Harrison Ford in 2023’s Indiana Jones and the Dial of Destiny used a proprietary system called Flux with infrared cameras to capture facial data during filming, then used older images of Ford to de-age him in post-production. By contrast, Metaphysic’s AI models process transformations without additional hardware and show results during filming.

Rumbles in the Unions

Here arrives as major studios explore AI applications beyond visual effects. Companies like Runway have been developing text-to-video generation tools, while others create AI systems like Callaia for script analysis and pre-production planning. However, recent guild contracts place strict limits on AI’s use in creative processes like scriptwriting.

Meanwhile, as we saw with the SAG-AFTRA union strike last year, Hollywood studios and unions continue to hotly debate AI’s role in filmmaking. While the Screen Actors Guild and Writers Guild secured some AI limitations in recent contracts, many industry veterans see the technology as inevitable. “Everyone’s nervous,” Susan Sprung, CEO of the Producers Guild of America, told The New York Times. “And yet no one’s quite sure what to be nervous about.”

Even so, The New York Times says that Metaphysic’s technology has already found use in two other 2024 releases. Furiosa: A Mad Max Saga employed it to re-create deceased actor Richard Carter’s character, while Alien: Romulus brought back Ian Holm’s android character from the 1979 original. Both implementations required estate approval under new California legislation governing AI recreations of performers, often called deepfakes.

Not everyone is pleased with how AI technology is unfolding in film. Robert Downey Jr. recently said in an interview that he would instruct his estate to sue anyone attempting to digitally bring him back from the dead for another film appearance. But even with controversies, Hollywood still seems to find a way to make death-defying (and age-defying) visual feats take place on screen—especially if there is enough money involved.

This story originally appeared on Ars Technica.
