Final Pixel offers virtual production R&D guidance | Infrastructure | News | Rapid TV News
Global creative studio Final Pixel, which specialises in end-to-end virtual production for film, TV and advertising, has published an in-depth study based on a research project looking at the use of motion capture in virtual production.
Final Pixel 13 Dec 2021
The case study, At The Edge Of The Metaverse: Live Body And Facial Motion Capture for LED Wall Virtual Production, with Rendering of High-Quality Digital Characters in Real-time, is based on a research project devised by Final Pixel that incorporated live-action body and facial motion capture of a detailed creature animation into the company's current virtual production workflow.

This enabled the team at Final Pixel to understand the limits of the software and workflow, giving future clients the opportunity to incorporate detailed motion capture digital characters into their virtual productions. Final Pixel added that its team achieved live facial and body motion capture streamed to Unreal Engine and played back through Disguise, using cluster rendering to deliver a high-quality bespoke 3D character built with a traditional CG pipeline at an extremely high level of detail. It was also able to create real-time interactions between the characters in-camera, with no noticeable latency for the viewer.

Final Pixel sees a number of potential uses for this approach, many of which it describes as significant. For example, it says the virtual production pipeline for film, TV and advertising allows live interactions between digital and human characters, all filmed in real time and in-camera. Live-action mocap creatures and characters can then be replaced by full-scale CG in post, capturing more 'natural' actor reactions and engagement versus working against a green screen. It also sees applications in creating higher-fidelity augmented reality plates, in particular for live broadcasts and when using the enhanced stage management provided by Disguise.

The project was shot at the Digital Catapult’s Virtual Production Test Stage (VPTS), a joint venture with Target3D.

“As a company specialising in virtual production for film, TV and advertising, we are excited by the opportunities working in real-time game engines can provide for the creative process when everything can be captured in-camera while shooting live-action,” explained Final Pixel CEO Michael McKenna.

“The next evolution of this technology is to look at the elements which are still considered too heavy or complex to move out of the post-production workflow. Digital character and creature work is a big area for this, and also extremely important for narrative storytelling. The recent Unreal Engine 5 release from Epic Games showcasing Keanu Reeves in The Matrix is a great example of the powerful technology we can now use in virtual production. We are excited to share our findings with the rest of the industry to help us collectively move the use of virtual production forward.”