Pixotope is an open, software-based solution for rapidly creating virtual studios, augmented reality (AR), and on-air graphics, and for driving LED Volumes/extended reality (XR). It utilizes powerful commodity hardware and is specifically designed to connect with partner technologies and external data sources.

[Image: a man in a hoodie standing in front of a green screen]

The future of visual storytelling

The ARVS Edition of Pixotope offers a highly accessible path to embracing virtual production, combining all the tools you need for Virtual Studios and Augmented Reality.
Pixotope enables media creators to bring the quality and visual impact of high-end feature films to all types of production, big and small. 
[Image: digital avatar Night Journey performing in front of a live audience]

Drive Viewer Engagement

Virtual Production enables storytelling that would otherwise be physically impossible, bringing concepts and data to life with interactive virtual elements. Virtual Sets can adapt and change “on the fly” to stay relevant to the content and keep it interesting. Meanwhile, avatars, virtual talent, and animated characters can be brought to life, sparking the imagination of viewers in a stadium, at home, or wherever they may be consuming content.
[Image: two digital avatars from the TV show 2060 dancing around a digital tree]

No compromise on performance or quality

To achieve this graphical fidelity while offering excellent value, Pixotope developed a unique technology that builds on top of Unreal Engine, enabling us to render and composite video and graphics in one pass. Unlike other Unreal-based systems, we do not need to render shadows, reflections, refractions, bloom, and translucency in separate passes, which means we have zero performance overhead for integrating video with such effects. This allows us to put more models, textures, dynamic lights and shadows, and effects on the screen at the same time while still retaining high frame rates.
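To make the single-pass idea concrete, here is a minimal, purely illustrative sketch (this is not Pixotope's actual renderer, and the function and variable names are ours): a chroma-keyed camera frame and a rendered graphics frame are combined with one blend per pixel, rather than being merged across several intermediate passes.

```python
# Toy illustration of single-pass compositing with NumPy; this is not
# Pixotope's renderer, only the general "one blend per pixel" idea.
import numpy as np

def composite_single_pass(video_rgb, matte, cg_rgb):
    """Composite keyed camera video over rendered graphics in one pass.

    video_rgb : (H, W, 3) float32 camera frame
    matte     : (H, W, 1) float32 talent matte from the keyer, 0..1
    cg_rgb    : (H, W, 3) float32 rendered graphics frame
    """
    # Standard "over" blend, evaluated once per pixel; no separate
    # passes are produced and merged afterwards.
    return video_rgb * matte + cg_rgb * (1.0 - matte)

# Dummy 1080p buffers, just to show the call.
h, w = 1080, 1920
video = np.ones((h, w, 3), dtype=np.float32)
matte = np.zeros((h, w, 1), dtype=np.float32)
cg = np.zeros((h, w, 3), dtype=np.float32)
out = composite_single_pass(video, matte, cg)
print(out.shape, out.dtype)
```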

To ensure optimal video quality and performance, all video input, processing, and output runs in a proprietary pipeline. By separating video processing from graphics rendering, we make sure the integrity of the video is always preserved.
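As a rough sketch of that separation (the structure and names below are illustrative assumptions, not Pixotope's actual pipeline), video capture can run in its own loop and hand frames to the renderer through a small bounded queue, so a slow render never stalls video I/O:

```python
# Illustrative sketch only: a dedicated capture loop feeds frames to the
# render loop through a bounded queue, keeping video I/O timing independent
# of render timing. Names and structure are ours, not Pixotope's.
import queue
import threading
import time

frames = queue.Queue(maxsize=3)  # small buffer keeps latency bounded

def capture_loop(stop):
    """Video side: deliver frames at a fixed cadence, never block on rendering."""
    n = 0
    while not stop.is_set():
        frame = f"frame-{n}"          # stand-in for a real video frame
        try:
            frames.put(frame, timeout=0.01)
        except queue.Full:
            pass                      # drop rather than stall the capture side
        n += 1
        time.sleep(1 / 50)            # e.g. 50 fps input

def render_loop(stop):
    """Graphics side: consume whatever the video pipeline delivers."""
    while not stop.is_set():
        try:
            frame = frames.get(timeout=0.1)
        except queue.Empty:
            continue
        print("composited", frame)    # rendering and compositing would go here

stop = threading.Event()
threads = [threading.Thread(target=capture_loop, args=(stop,)),
           threading.Thread(target=render_loop, args=(stop,))]
for t in threads:
    t.start()
time.sleep(0.2)
stop.set()
for t in threads:
    t.join()
```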

A modern microservice architecture means that Pixotope runs rendering, video I/O, the chroma keyer, camera tracking, and other components in separate processes on the machine. These processes do not interact with each other directly, so they cannot interfere with one another, and each can fail individually without directly affecting other parts of the system.
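A heavily simplified sketch of that isolation model follows (the service names and supervisor logic are hypothetical placeholders, not Pixotope's actual components): each service runs as its own OS process, and a crashed service can be restarted without touching the others.

```python
# Hypothetical sketch of process isolation; the service names below are
# placeholders, not Pixotope's actual components. Each service is its own
# OS process, so a crash in one cannot stop the others.
import multiprocessing as mp
import time

def service(name: str) -> None:
    """Stand-in for a long-running service such as rendering or video I/O."""
    while True:
        time.sleep(0.1)  # real work would happen here

SERVICES = ["render", "video-io", "chroma-key", "camera-tracking"]

def supervise(duration_s: float = 2.0) -> None:
    procs = {n: mp.Process(target=service, args=(n,), daemon=True) for n in SERVICES}
    for p in procs.values():
        p.start()
    deadline = time.time() + duration_s
    while time.time() < deadline:
        for name, proc in list(procs.items()):
            if not proc.is_alive():   # one service died: restart only that one
                procs[name] = mp.Process(target=service, args=(name,), daemon=True)
                procs[name].start()
        time.sleep(0.2)
    for proc in procs.values():
        proc.terminate()

if __name__ == "__main__":
    supervise()
```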

Features

  • Unique WYSIWYG live feedback editor
  • Single panel config and operations for multi-camera/render productions, with auto-discovery
  • On-the-fly configuration
  • On-the-fly switching of levels
  • Premade, easy-to-use adjustment panels for all objects in the level, color correction, and image effects
  • Unique drag-and-drop control panel builder for mobile and desktop devices via a web browser on the local network
  • Procedural high-quality text generation for live animated 3D text, flying logos, etc.
  • Timecode triggering from external LTC or timecode embedded in the video
  • Automatic Render API for data integration and automation, giving remote access to any part of the Unreal Engine
  • Datahub, a highly efficient data bus that enables extremely low-latency interaction and synchronization in multi-camera systems