01 November 2021 · 3 min read

Pixotope and Motion Capture integration

Creating virtual studios and characters

Pixotope is designed to work seamlessly on live projects, providing an intuitive interface for users who want to generate realistic CG graphics and animations in real time using the Unreal Engine. To ensure that Pixotope users can create the best possible content in any medium, Pixotope offers direct software and hardware integrations with the world’s leading technology solutions.

One of Pixotope’s key integrations is with Xsens’ motion capture technology. Xsens’ inertial motion capture suits track the full-body movements of performers using on-body sensors rather than cameras, so the suits can be used indoors or outdoors without a dedicated performance capture studio. Pixotope integrates fully with Xsens, allowing users to track a performer’s full-body movement and use that data to drive in-engine animations of digital humans or CG avatars in real time. A virtual studio or environment can be designed in Pixotope to house these CG avatars, so users can create both live and pre-recorded content. Because the Xsens suits work in almost any environment, users can capture motion data in a television studio or at an event and generate live CG graphics directly on a green screen or LED wall using Pixotope.
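At a high level, the integration is a simple loop: the suit streams a full-body skeletal pose every frame, and the engine receives that stream over the network and retargets it onto the avatar’s skeleton. The sketch below illustrates the shape of that loop in Python; the UDP port, joint count, and packet layout are assumptions made for illustration and are not the actual Xsens or Pixotope streaming protocol.

```python
# Hypothetical sketch of a real-time motion capture consumer.
# The port, joint count, and packet format below are illustrative
# assumptions, not the actual Xsens or Pixotope network protocol.
import socket
import struct

MOCAP_PORT = 9763          # assumed UDP port for streamed pose data
NUM_JOINTS = 23            # assumed joint count for a full-body suit
FLOATS_PER_JOINT = 7       # position (x, y, z) + rotation quaternion (w, x, y, z)

def parse_pose(packet: bytes):
    """Unpack one streamed frame into per-joint (position, rotation) tuples."""
    values = struct.unpack(f"<{NUM_JOINTS * FLOATS_PER_JOINT}f", packet)
    pose = []
    for j in range(NUM_JOINTS):
        base = j * FLOATS_PER_JOINT
        position = values[base:base + 3]
        rotation = values[base + 3:base + 7]
        pose.append((position, rotation))
    return pose

def apply_to_character(pose):
    """Stand-in for the engine step that retargets the pose onto a CG avatar's skeleton."""
    for joint_index, (position, rotation) in enumerate(pose):
        # In a real engine this would set the bone transform for the matching joint.
        print(f"joint {joint_index}: pos={position} rot={rotation}")

def run():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", MOCAP_PORT))
    expected_size = NUM_JOINTS * FLOATS_PER_JOINT * 4  # 4 bytes per float32
    while True:
        packet, _ = sock.recvfrom(65535)
        if len(packet) == expected_size:       # ignore malformed frames
            apply_to_character(parse_pose(packet))

if __name__ == "__main__":
    run()
```

In practice this per-frame retargeting happens inside Pixotope’s Unreal Engine pipeline rather than in user code; the sketch is only meant to show why the integration can stay live, since each frame of suit data is applied to the avatar as soon as it arrives.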

For example, IKINEMA demonstrated real-time, Xsens-driven digital characters at NAB, with both the characters and the virtual environment generated using Pixotope.


The ease with which live, animated characters and environments can be generated in Pixotope gives users unique and creative ways to enhance their own bespoke content. Xsens recently demonstrated this integration with its very own virtual studio, built to host webinars.

Xsens Virtual Studio

Xsens decided to create a virtual studio to host its own webinar series on Xsens product releases and news. The company started by creating a physical space inside the Xsens Enschede office in the Netherlands, equipping the room with green screens, Sennheiser G4 wireless microphones, and one Sony BRC-H800 camera. The virtual studio itself was designed entirely in Pixotope’s engine.

Footage recorded by the camera in the room is sent to Pixotope’s engine, allowing the user to see the virtual CG studio in real time, with real people and CG elements displayed together. Using the green screens, Xsens can place a real presenter inside the virtual studio and add CG-animated avatars alongside them. The movements of these avatars can be driven by data tracked from the Xsens suits, and the Pixotope integration makes this simple and straightforward.
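Conceptually, each camera frame is keyed against the green backdrop and composited over the rendered virtual studio before it is output. The snippet below sketches that per-frame step with OpenCV; the HSV thresholds and the placeholder frames are assumptions for illustration, not Pixotope’s actual keyer or compositor.

```python
# Minimal per-frame chroma key sketch (illustrative only; a production
# keyer and compositor such as Pixotope's are far more sophisticated).
import cv2
import numpy as np

# Assumed HSV range for the green backdrop; real setups tune this per studio.
GREEN_LOW = np.array([40, 80, 80], dtype=np.uint8)
GREEN_HIGH = np.array([80, 255, 255], dtype=np.uint8)

def composite(camera_frame: np.ndarray, virtual_frame: np.ndarray) -> np.ndarray:
    """Key out the green screen and place the presenter over the rendered virtual studio."""
    hsv = cv2.cvtColor(camera_frame, cv2.COLOR_BGR2HSV)
    green_mask = cv2.inRange(hsv, GREEN_LOW, GREEN_HIGH)   # 255 where backdrop is visible
    foreground_mask = cv2.bitwise_not(green_mask)          # 255 where the presenter is
    presenter = cv2.bitwise_and(camera_frame, camera_frame, mask=foreground_mask)
    backdrop = cv2.bitwise_and(virtual_frame, virtual_frame, mask=green_mask)
    return cv2.add(presenter, backdrop)

if __name__ == "__main__":
    # Placeholder frames: a solid green "camera" image and a gray "virtual studio" render.
    camera = np.full((720, 1280, 3), (60, 200, 60), dtype=np.uint8)   # BGR green
    virtual = np.full((720, 1280, 3), (120, 120, 120), dtype=np.uint8)
    out = composite(camera, virtual)
    cv2.imwrite("composite_preview.png", out)
```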

Using motion capture integration 

New and existing Pixotope users can easily tap into the software’s motion capture integration for a broad range of creative projects. Pixotope’s engine comes with built-in support for generating live AR characters inside virtual studios and environments using the Unreal Engine, enhancing projects in entertainment and education.


Companies looking for innovative ways to launch new products, explain new features, or engage with their audience in new and immersive ways can use Pixotope to generate professional content. As shown in Xsens’ virtual studio, it’s possible to create high-end virtual environments without the need for a high-budget, dedicated studio – Pixotope’s interface makes this straightforward without compromising on quality. 

Educational institutions and platforms can also take advantage of this same technology to generate virtual, hands-on educational experiences, using Pixotope to sculpt realistic CG environments and motion capture technology to provide physical interactions. This could be useful for simulated physical experiences, such as medical training.

Contact a member of the Pixotope team today to find out more about how your company or studio can utilize Pixotope’s powerful engine and heighten your creativity. 
