AR motion capture character content utilises real-time ray tracing and real-time facial animation, live-to-air
Pixotope, provider of live photo-realistic virtual production systems, today announced that RIOT Games used Pixotope (AR graphics), together with Cubic Motion (real-time facial animation), Animatrik (motion capture) and Stype (camera tracking), to deliver the first live broadcast containing real-time ray tracing and real-time facial animation at the ‘League of Legends’ Regional Finals on Sunday 8th September in Shanghai.
Ray tracing consumes huge quantities of rendering power to achieve its photo-realistic results, which is why to date it has only been used for non-real-time visual effects in film and television. With the release of Nvidia’s RTX series graphics cards, real-time ray tracing has become possible, but it is not guaranteed: the challenge in the live broadcast TV world is harnessing the power of ray tracing whilst also maintaining standard video frame rates. Pixotope’s unique native Unreal™-based architecture and single-pass render pipeline provide a very low rendering overhead, enabling ray tracing whilst maintaining video playback frame rates. It is this architecture that enabled the Pixotope team to deliver the world’s first live ray-traced broadcast.
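To put the frame-rate constraint in concrete terms, here is a minimal sketch of the time budget a real-time renderer must meet at standard broadcast frame rates. The figures are simple arithmetic for illustration, not Pixotope benchmarks:

```python
# Illustrative only: the per-frame time budget a live renderer must hit.
# All ray tracing, compositing and facial-animation work must complete
# within this window, every frame, to stay live-to-air.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at the given frame rate."""
    return 1000.0 / fps

for label, fps in [("PAL/50p", 50.0), ("NTSC/59.94p", 59.94)]:
    print(f"{label}: {frame_budget_ms(fps):.2f} ms per frame")
```

At 50 fps the entire pipeline has just 20 ms per frame, which is why a low-overhead, single-pass render path matters for fitting ray tracing into a live broadcast.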