

Backstage is our high-performance media server software for real-time video playback, GPU-accelerated pixel processing, and multi-output mapping, engineered for show-critical applications. It enables real-time multi-layer compositing and simultaneous multi-surface output mapping across displays, LED walls, and projection systems, including edge-blended multi-projector setups, adaptable to arbitrary screen geometries and complex output topologies. Outputs can be configured as straight GPU feeds, rectangular splits, or custom pixel-mapped layouts to match demanding LED and projection pipelines.
Backstage operates with a future-proof 8 / 10 / 12-bit processing pipeline, optimized for uncompressed image sequences to deliver uncompromising image quality and high-bandwidth performance. It also supports selected modern codecs such as NotchLC, HAP, H.264/H.265/H.266, AV1, VP9, and MPEG2, plus common encryption (CENC) for protected media workflows. Supported sources include still images, multichannel audio, Notch Blocks, HTML5, NDI (HQ/HX), Spout/Unity textures, and capture input through dedicated hardware.
For synchronized multi-server deployments, Backstage provides frame-accurate playback synchronization via PTP v2 (self-generated or external) and supports LTC timecode input/output through audio or dedicated hardware. The render architecture supports multiple output refresh rates via render groups and can align GPU output timing to the media frame rate, with optional frame blending for frame rate conversion.
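As a sketch of the frame-blending idea mentioned above: when the output refresh rate differs from the media frame rate, each output frame can be formed by linearly mixing the two nearest source frames. The function names and the flat-list frame representation below are illustrative only, not Backstage's actual API.

```python
def blend_frames(frame_a, frame_b, alpha):
    """Linearly blend two equally sized frames (flat lists of pixel
    values); alpha=0.0 yields frame_a, alpha=1.0 yields frame_b."""
    return [(1.0 - alpha) * a + alpha * b for a, b in zip(frame_a, frame_b)]

def source_positions(output_fps, source_fps, n_output_frames):
    """For each output frame, compute the index of the preceding source
    frame and the blend factor toward the following one."""
    positions = []
    for i in range(n_output_frames):
        t = i * source_fps / output_fps  # position in source-frame units
        idx = int(t)
        positions.append((idx, t - idx))
    return positions
```

For example, converting 24 fps media to a 60 Hz output, output frame 3 lands at source position 1.2, i.e. a 20 % blend of source frames 1 and 2.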
Color management is handled per media and per output, supporting SDR and HDR pipelines through configurable color primaries (BT.601, BT.709, BT.2020, DCI-P3, AdobeRGB, custom RGBW) and transfer functions (Gamma, sRGB, ST.2084/PQ, HLG, Log, Linear). Backstage provides passthrough when compatible and conversion when required to ensure consistent reproduction across heterogeneous systems.
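The ST.2084/PQ transfer function listed above is fully specified by SMPTE; as a point of reference, a minimal Python implementation of its EOTF and inverse looks like this (constants taken from the standard — this is reference code, not Backstage's implementation):

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.8438
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.8516
C3 = 2392 / 4096 * 32    # ~18.6875

def pq_eotf(signal):
    """Decode a nonlinear PQ signal value in [0, 1] to absolute
    luminance in cd/m^2 (0 to 10000)."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def pq_inverse_eotf(luminance):
    """Encode absolute luminance in cd/m^2 back to a PQ signal value."""
    y = (luminance / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2
```

A full-scale signal of 1.0 decodes to exactly 10,000 cd/m², the PQ peak.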
For show control and integration, Backstage includes a node-based visual programming environment interfacing with internal playback/render components (sequencer, assets, mappings, surfaces, mixer, outputs) and external systems via UDP, TCP, ArtNet, and sACN. It supports headless operation with UI streaming to Windows and macOS clients, enabling multi-user access and custom operator control panels.
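As an illustration of driving such a node graph from an external system, a controller might fire a one-shot UDP message at the server. The address, port, and command string below are hypothetical; the actual messages are whatever the node graph has been built to parse.

```python
import socket

def send_trigger(host, port, command):
    """Fire a one-shot ASCII command over UDP at a show-control
    endpoint (fire-and-forget, no delivery guarantee)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(command.encode("ascii"), (host, port))

# Example (hypothetical address and command string):
# send_trigger("192.168.1.50", 9000, "sequencer/1/play")
```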
The output subsystem supports GPU display pipelines, NDI, ArtNet, shared memory/texture exchange, audio routing, and recording to DPX, SBSM, H.264, and AV1. Pixel processing includes real-time keying (color/chroma) and advanced blend/compositing modes.
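A minimal sketch of the chroma-keying idea: each pixel's alpha is derived from its color distance to a key color, with a soft tolerance band for clean edges. The function and parameters below are illustrative, not Backstage's keyer.

```python
def chroma_key_alpha(pixel, key_color, tolerance, softness):
    """Compute an alpha value for one RGB pixel: pixels within
    `tolerance` of the key color become transparent (0.0), pixels
    beyond tolerance + softness stay opaque (1.0), with a linear
    ramp in between."""
    dist = sum((p - k) ** 2 for p, k in zip(pixel, key_color)) ** 0.5
    if dist <= tolerance:
        return 0.0
    if dist >= tolerance + softness:
        return 1.0
    return (dist - tolerance) / softness
```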
Backstage also supports stereoscopic workflows via stereo containers and multiple 3D formats including side-by-side, top/bottom, interleaved, passive/active 3D, and whiteline code.
Backstage streams its GUI over the network to the Backstage Client, a tool that ships with the server and is available for PC and Mac. It is fully multi-user capable: one user can set up warping while another programs logic and a third edits media sequences. After startup, the client lists all servers available on the network.

Double-clicking one of the servers opens the GUI for the selected server.

The GUI consists of freely arrangeable, dockable windows; the layout is fully adjustable. Layout presets can be saved and recalled individually for each user.
Backstage allows you to rebuild your real-world environment in its powerful 3D editor. A setup can be as simple as a single display or as complex as a multi-channel (3D) projection onto arbitrary screen surfaces. This sample scene shows a panorama projection with two projectors.
To send a projector's view to a GPU output, simply route the projector in the Video Routing window. The Output View shows the actual image being sent to that output.

If you are not using our camera-based auto-calibration software, output warping can be applied manually.

Now load your media through the asset manager and drag and drop it into the sequencer. A video channel in a sequencer can be routed to one or more screens and mappings in the 3D scene.

Triggers can be defined for external events or for freely configurable GUI elements; in the example below, a sequencer starts playing with a fade-in when a button is pressed.

Contact us with your project requirements; we are happy to assist in planning and realizing your project with our servers.