The Avengers need AI: VFX and CGI

AI is being applied to many different use cases across all industries, and Hollywood filmmakers are leveraging it particularly well to revolutionise the visual effects behind their blockbusters. AI, machine learning and deep learning have transformed the VFX (visual effects) and CGI (computer-generated imagery) used to bring many movies in the Marvel saga to life. Capturing an actor's emotions can be very difficult, especially when a team of animators has to render that performance as Thanos, a 9 ft purple alien. He is the most powerful villain in the Marvel Cinematic Universe and he feels genuine emotion, so after 19 movies' worth of build-up, when he finally entered the limelight in Avengers: Infinity War, he had to look perfectly realistic and natural.

"We knew Thanos had to work, or the movie doesn't work." - Kelly Port of the VFX company Digital Domain

Getting his expressions to look real was imperative. Josh Brolin's performance was captured and presented to the audience in the form of Thanos using machine learning software called Masquerade. This spares the animators the trouble of tweaking every single facial expression manually, saving time. Other labour-intensive tasks, such as rotoscoping, compositing and animation, were often outsourced to cheaper foreign studios; with advances in deep learning models, many of these previously laborious tasks can now be largely automated at a fraction of the cost. This has improved the quality of the work, as expert animators can now focus on the bigger picture with an artistic eye, since they only have to refine the effects the software yields. The technology has transformed how animation studios run, allowing for more predictable resourcing and on-time delivery.

"The change in pace, the greater predictability of resources and timing, plus improved analytics will be transformational to how we run a show." - Simon Robinson

A scene being shot for Avengers: Infinity War between Thanos and Iron Man. Source: Marvel Studios

Masquerade needs a scan of Josh Brolin's face as he performs to bring Thanos to life. The scan is produced by placing over 100 tracking dots on his face, captured by two vertically oriented HD cameras; only a low-resolution rendering is required.

"[Masquerade] takes that low-resolution mesh and it figures out what high-resolution face shape would be the best solution for that. […] Then it gives you a solution, and then we would look at that result. If it didn't feel quite right, we would make a little tweak in modelling to adjust ... let's say this has more lip compression or the brows need to be higher. We feed that back into the system and it would learn from that via a machine learning algorithm."

The machine learning model improves over time, delivering results ever closer to what the artists want. That is why Josh Brolin kept the tracking dots on during breaks and between takes: to maximise the amount of raw data being fed into the model, helping it learn more about his facial expressions and, by extension, his style of acting.
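Masquerade itself is proprietary and its internals are not public, but the correct-and-retrain loop described above can be sketched in miniature. In this hypothetical toy, the "solver" is a simple nearest-neighbour lookup over stored example pairs (tracking-dot mesh → high-resolution shape), and an artist's correction is fed back by adding a new pair to the library; all names, array shapes, and the lookup method are invented for illustration.

```python
# Illustrative sketch only: the real Masquerade pipeline is proprietary,
# so this class, its data shapes, and the nearest-neighbour "solve" step
# are assumptions made for demonstration.
import numpy as np

class FaceSolver:
    """Maps a low-res tracking-dot mesh to a plausible high-res face shape,
    and learns from artist corrections fed back into its example library."""

    def __init__(self, low_res_meshes, high_res_shapes):
        # Paired training examples: dot positions -> dense face shape.
        self.low = [np.asarray(m, dtype=float) for m in low_res_meshes]
        self.high = [np.asarray(s, dtype=float) for s in high_res_shapes]

    def solve(self, low_res_mesh):
        # Return the high-res shape of the stored example whose dot
        # layout is closest (Euclidean distance) to the query mesh.
        q = np.asarray(low_res_mesh, dtype=float)
        dists = [np.linalg.norm(q - m) for m in self.low]
        return self.high[int(np.argmin(dists))]

    def feedback(self, low_res_mesh, corrected_shape):
        # An artist tweaked the proposed result (e.g. more lip
        # compression); store the corrected pair so future solves
        # for similar meshes reflect that note.
        self.low.append(np.asarray(low_res_mesh, dtype=float))
        self.high.append(np.asarray(corrected_shape, dtype=float))

# Usage: solve a mesh, correct the result, and solve again.
solver = FaceSolver([[0.0, 0.0], [10.0, 10.0]], [[1, 1, 1], [9, 9, 9]])
first = solver.solve([0.1, 0.1])          # closest example wins
solver.feedback([0.1, 0.1], [2, 2, 2])    # artist's corrected shape
second = solver.solve([0.1, 0.1])         # now returns the correction
```

The point of the sketch is the shape of the workflow, not the model: a low-quality capture goes in, a high-quality proposal comes out, and every artist correction becomes training signal, which is why keeping the tracking dots on between takes adds value.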