Rendering to Multiple Displays with nDisplay

Describes how to use the nDisplay system to render your Unreal Engine Project simultaneously on multiple displays.


Interactive content isn't limited to being displayed on a single screen, or even a single dual-screen device like a VR headset. An increasing number of visualization systems aim to immerse the viewer more effectively in the game environment by rendering real-time content through multiple simultaneous displays. These systems may be made up of multiple adjacent physical screens, such as a Powerwall display; or they may use multiple projectors to project the 3D environment onto physical surfaces like domes, tilted walls, or curved screens, such as in a Cave virtual environment.

The Unreal Engine supports these usage scenarios through a system called nDisplay. This system addresses some of the most important challenges in rendering 3D content simultaneously to multiple displays:

  • It eases the process of deploying and launching multiple instances of your Project across different computers in the network, each rendering to one or more display devices.

  • It manages all the calculations involved in computing the viewing frustum for each screen at every frame, based on the spatial layout of your display hardware.

  • It ensures that the content being shown on the various screens remains exactly in sync, with deterministic content across all instances of the Engine.

  • It offers passive and active stereoscopic rendering.

  • It can be driven by input from VR tracking systems, so that the viewpoint in the displays accurately follows the point of view of a moving viewer in real life.

  • It is flexible enough to support any number of screens in any relative orientation, and can be easily reused across any number of Projects.

nDisplay was an integral part of the visuals for Childish Gambino's award-winning 2018 Pharos show.

nDisplay System Overview

Every nDisplay setup has a single master computer, and any number of additional computers.

  • Each computer in the network runs one or more instances of your Project's packaged executable file.

  • Each Unreal Engine instance handles rendering to one or more display devices, such as screens or projectors.

  • For each of the devices an instance of Unreal Engine handles, it renders a single viewpoint on the same 3D scene. By setting up these viewpoints so that their location in the 3D world matches the physical locations of the screens or projected surfaces in the real world, you give viewers the illusion of being present in the virtual world.

  • The master node is also responsible for accepting input from spatial trackers and controllers through Virtual-Reality Peripheral Network (VRPN) connections, and replicating that input to all other connected computers.

nDisplay network overview

The image above shows a possible nDisplay network. Like all nDisplay networks, one of its PCs acts as the master node. This master node accepts input into the system from a VRPN server, which relays signals that come from spatial tracking devices and other controller devices. The network also contains several other PCs that run other instances of the Unreal Engine Project. Each of these cluster nodes drives one or more display projectors.

One instance, one device

One application instance that renders to one display device.
This is the most straightforward way to set up a host for nDisplay. For each projector or display device you need, you set up one computer to handle rendering to that device. On that computer, you run one instance of the Unreal Engine. Typically, in this scenario, you'll set up that application instance to render a single rectangle of 3D space into a single viewport.

Multiple instances, multiple devices

Multiple application instances, each rendering to one display device.
If you have computers in your network that have multiple graphics cards and that can handle driving multiple display devices, you can run multiple instances of the Unreal Engine on those computers. You'll direct each instance of the Unreal Engine to render a different rectangle of 3D space, and dispatch each to a different graphics card.
This approach can work well if the CPU and memory requirements of your Project are light enough that you can run multiple instances on the same computer.

One instance, multiple devices

One application instance that renders to multiple display devices.
With this option, you run a single instance of your Unreal Engine application on your computer, but you set it up to render multiple separate rectangles of the scene's 3D space into different areas within a large window. You then use a technology such as NVIDIA Mosaic or NVIDIA Surround to split up that single large window and render each separate area on a different display device.
In this scenario, the benefit of using nDisplay in conjunction with Mosaic or Surround is that you can render viewports that accurately match the physical arrangements of your different monitors, even when those monitors do not lie along the same plane. However, all viewports are being rendered by the same instance of Unreal Engine, so you can't benefit from having multiple graphics cards installed on the same computer.

nDisplay Components

nDisplay adds several components to the usual Unreal system architecture:

  • A Plugin that works inside the Unreal Engine. It communicates and synchronizes information between all the application instances that make up the cluster, makes sure all instances render the same frame at the same time, makes sure each display device renders the correct frustum of the game world, and more.

  • A network provisioning and management application, called nDisplayLauncher. You run this application on a single computer in your network to automatically launch and quit your packaged Unreal Engine application on all the computers in your network.

  • A separate listener application, called nDisplayListener, that runs on each computer. This application listens for incoming requests from the nDisplayLauncher, and processes those requests on the local computer.

  • A shared configuration file that contains all the settings nDisplay needs to start up the right number of instances on the right computers, each rendering the right points of view on the game's 3D world to produce the illusion of a seamless rendering across all display screens or projectors. See About the nDisplay Configuration File below.

Getting Started

This section describes how to get up and running with nDisplay for the first time.

Prerequisites:

  • Make sure you have your physical equipment — screens, projectors, and so on — set up and working correctly.

  • Make sure that the Windows account you intend to use on the master computer has administrative rights on all computers you intend to use in the nDisplay network.

  • Make sure that all computers you want to use in the nDisplay network can receive TCP/IP communications over ports 41000, 41001, 41002, and 41003. (You can use different ports instead; see Changing Communication Ports below.)

Step 1 - Set up Your Project for nDisplay

The easiest way to get a Project set up to use nDisplay is to create it from the nDisplay Template Project:

Create an nDisplay Project from the Template

This automatically enables the nDisplay Plugin, adds some sample configuration files to your Project, and provides a default Level that is already configured with default settings.

If you have an existing Project that you want to use with nDisplay, you can do the same configuration by hand. See Add nDisplay to an Existing Project below.

Step 2 - Set up the Configuration File

You need to tell nDisplay about the different computers you want to use in your network, the size and resolution of the screens or projectors those computers will render to, the spatial relationships between those screens in 3D space, and more. To do this, you create a configuration file that expresses all of this information in a series of settings.

Creating this configuration file is likely to be the trickiest part of your nDisplay setup, so you should approach it with care. For details, see About the nDisplay Configuration File below.

Typically, once your configuration file is set up, you only have to modify it when the topology of your network changes: for example, when you need to change the computers you are rendering to, or if you change the physical arrangement of your screens and displays in the real world.

Save your configuration file inside your Project's Content folder. You'll use it in the next step.

Step 3 - Package and Deploy

Every time you change the content in your Project, you'll need to package your game and deploy it to all of the computers that you've identified in your configuration file.

  1. In the Unreal Editor, package your game for Windows. For details, see Build Operations: Cook, Package, Deploy & Run.

  2. Find the Engine\Binaries\DotNET\nDisplayListener.exe application under your Unreal Engine installation folder. Copy this application to the folder that contains the .exe file you've packaged for your game.

  3. Copy the folder that contains the packaged .exe file for your Project and the nDisplay Listener application to each computer that you have identified as a cluster node in your configuration file.
    You must place the folder at exactly the same path on all computers.

  4. On each computer, launch the nDisplayListener.exe file.
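
If your cluster contains more than a handful of computers, you may prefer to script the copy in step 3 rather than doing it by hand. The sketch below is one possible approach, assuming Windows administrative shares are reachable from the machine holding the packaged build; the host list and paths are hypothetical placeholders, not values from this page.

    # deploy_build.py -- hypothetical helper that copies a packaged nDisplay build
    # to the same path on every cluster node over Windows administrative shares.
    import shutil

    CLUSTER_NODES = ["192.168.0.1", "192.168.0.2", "192.168.0.3"]   # placeholder hosts
    LOCAL_BUILD = r"D:\Builds\MyProject\WindowsNoEditor"            # packaged build plus nDisplayListener.exe
    REMOTE_SUBPATH = r"D$\Builds\MyProject\WindowsNoEditor"         # must be the same path on every node

    for host in CLUSTER_NODES:
        destination = rf"\\{host}\{REMOTE_SUBPATH}"
        print(f"Copying build to {destination} ...")
        # dirs_exist_ok (Python 3.8+) lets the script overwrite an earlier deployment in place.
        shutil.copytree(LOCAL_BUILD, destination, dirs_exist_ok=True)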

Now you have everything nDisplay needs deployed to each computer in your cluster: the packaged version of your application, the configuration file that defines the setup of your nDisplay cluster, and a running nDisplay Listener that is waiting for incoming instructions from the nDisplay Launcher application. In the next section, you'll use the nDisplay Launcher to instruct each listener to launch your Project on its own cluster node.

Step 4 - Launch all Cluster Nodes

Once you have your Project deployed successfully to all the computers you've identified in your configuration file, you can use the nDisplayLauncher application to start the Project on all computers simultaneously.

  1. If you don't already have the nDisplayLauncher application running, start it.

  2. Add your packaged Project .exe file to the Applications list.
    Click Add under the Applications list, then browse to and select the .exe file you packaged for your Project. The nDisplay Launcher will add your new application to the list. Click its name to select it.

  3. Specify your configuration file.
    Click Add to the right of the Config Files control, then browse to and select your configuration file.

  4. Click Run.

The nDisplay Launcher dispatches a message to the nDisplay Listener on each cluster node in your configuration file, instructing it to launch the packaged Project. You should see each nDisplay Listener display the command in its status window:

nDisplay Listener receiving the Run command

Then, the nDisplay Listener on each host should launch your Unreal Engine application, which begins rendering the viewport or viewports that you've configured for its main window.

When you're done, click Kill to automatically shut down all instances of the Unreal Engine on all computers, or simply shut down the instance of Unreal Engine that is running on the master computer.

About the nDisplay Configuration File

The best way to get started understanding the nDisplay configuration file, and creating your own, is to start from the example configurations provided by the nDisplay plugin. If you've created your Project from the nDisplay template, you'll find these files in your Project folder, under Content/ConfigExamples. If not, you can find these files in the Unreal Engine installation folder, under Templates/TP_nDisplayBP/Content/ConfigExamples.

The structure of the nDisplay configuration file is directly tied to the different types of components it uses to render the visualization.

  • Each different type of component that you configure has its own line in the file, and is identified by a string ID that you assign. You use these string IDs when one configuration section needs to refer to another.

  • Many of the components that you configure in this file have defined positions (and often rotations) in virtual 3D space. Each object's position and rotation is relative to that object's parent. By default, the parent of all objects is the VR space origin: an arbitrary point in 3D world space where VR space is considered to start. You can also configure specific named transforms in 3D space, called scene_nodes, which can act as parents for one or more components. This can help simplify the spatial layout of your screens, cameras and other components.
    To see how you can use scene_nodes to build a hierarchy of 3D transforms that all start from the same point in virtual space, see the Configuration File Example section below.

  • All parameters that refer to measurements in virtual 3D space or real-world physical space expect values in meters and degrees, unless otherwise indicated. This includes screens, scene nodes, cameras, and so on.

  • All parameters that refer to measurements in screen space expect values in pixels. This includes windows and viewports.
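
Before looking at the individual section types below, it can help to see how these pieces fit together. The following minimal, hypothetical configuration drives a single computer rendering one viewport to one screen; every ID, address, and measurement is an illustrative placeholder, and each parameter is explained in the sections that follow.

[cluster_node] id=node_main addr=192.168.0.10 window=wnd_main master=true sound=true
[window] id=wnd_main fullscreen=false WinX=0 WinY=0 ResX=1920 ResY=1080 viewports=vp_main
[viewport] id=vp_main X=0 Y=0 width=1920 height=1080 screen=scr_main
[screen] id=scr_main loc="X=1,Y=0,Z=0" rot="P=0,Y=0,R=0" size="X=1.7,Y=1" parent=display_root
[camera] id=cam_default loc="X=0,Y=0,Z=0" parent=display_root
[scene_node] id=display_root loc="X=0,Y=0,Z=1.5" rot="P=0,Y=0,R=0"
[general] swap_sync_policy=1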

Cluster Node Configurations

For each different instance of your Unreal Engine application that you'll use in your nDisplay network, you need to define a cluster_node configuration. Each cluster_node configuration must contain a reference to a window configuration section that defines the properties of the main application window.

The cluster_node configuration also defines the hostname or IP address of the computer that will run that application instance. You may set up a different physical computer for each cluster_node configuration, or you may have multiple cluster_node configurations that run on the same host.

Example configuration:

This example configures a master node (one per network):

[cluster_node] id=node_front addr=192.168.0.1 window=wnd_LT sound=true port_cs=41001 port_ss=41002 master=true

This example shows a non-master cluster node:

[cluster_node] id=node_left addr=192.168.0.2 window=wnd_large sound=false

Parameters:

Parameter

Description

id

A unique name for this cluster node configuration.

addr

The IP address of the computer that will run this instance of Unreal Engine. This must be an IPv4 address. IPv6 is not supported.

window

The name of the window configuration that defines the size and position of the main window for this instance of your Unreal Engine application.

sound

Determines whether this instance of Unreal Engine plays sound. Optional; default value is false.

port_cs port_ss port_ce

Ports that the master node uses to communicate with other nodes in the cluster. port_cs is for cluster synchronization; port_ss is for swap synchronization; port_ce is for cluster events. Optional; default values are 14001, 14002, and 14003.

master

Determines whether this instance of Unreal Engine is the master node of the cluster. Only one cluster_node section can have this parameter set to true. Optional; default value is false.

eye_swap

Determines whether or not the images generated for the left and right eye are swapped. Optional; default value is false.

Window Configurations

Each window configuration defines a set of properties for the main window of an instance of your Unreal Engine application. You use it to configure things like the starting size and placement of the window when nDisplay launches your application, and whether or not the window should take up the full screen.

You also provide one or more viewport configurations, which identify specific areas within the main application window that nDisplay will fill with renderings of your scene.

Example configuration:

This example configures an application window that contains a single viewport:

[window] id=wnd_one fullscreen=false WinX=0 WinY=0 ResX=640 ResY=480 viewports=vp_LT

This example configures an application window that contains four separate viewports:

[window] id=wnd_four fullscreen=false WinX=0 WinY=0 ResX=640 ResY=480 viewports="vp_LT,vp_RB,vp_LB,vp_RT"

Parameters:

Parameter

Description

id

A unique name for this window configuration.

fullscreen

Determines whether this window should run in fullscreen mode or not. If you set this value to false, you must provide the WinX, WinY, ResX, and ResY settings described below.

WinX WinY

Specifies the position of the top left corner of the application window on the desktop, in pixels of screen space.

ResX ResY

Specifies the size of the application window in pixels of screen space.

viewports

Refers to one or more viewport configuration sections that define the areas of the main application window that nDisplay should fill with rendered views of the scene.
If you specify more than one viewport, you must use a comma-separated list of viewport configuration section names, enclosed in quotes. See the wnd_four example under the Example configuration section above. The order of this list does not affect the visual order or placement of the viewports; the placement of each viewport within the parent window is defined in its named viewport configuration.

Make sure that the viewport definitions you use do not exceed the size of the window.
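
For example, the following hypothetical window is 1600x800 pixels and is split into two side-by-side viewports of 800x800 pixels each, so together they exactly fill the window without exceeding it. (Viewport sections are described in the next section; the IDs shown here are placeholders.)

[window] id=wnd_dual fullscreen=false WinX=0 WinY=0 ResX=1600 ResY=800 viewports="vp_left,vp_right"
[viewport] id=vp_left X=0 Y=0 width=800 height=800 screen=scr_left
[viewport] id=vp_right X=800 Y=0 width=800 height=800 screen=scr_right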

Viewport Configurations

Each window configuration described above refers to one or more viewport configurations, each of which defines a rectangular area of the game window that nDisplay should fill with a rendered view of the scene.

Usually, a viewport starts at the upper left corner of the application window, and its width and height are set so that they fill the parent window. However, in some cases you may need to offset the viewport within its parent application window. For example, you might want to do this if you need to set up two projectors that partially overlap, or if you need one application window to host multiple separate viewports at different positions.

Example configuration:

[viewport] id=vp_LT X=0 Y=0 width=300 height=220 screen=scr_LT

Parameters:

Parameter

Description

id

A unique name for this viewport configuration.

X Y

The coordinates of the top left corner of the viewport, in pixels, within the screen space of the main application window. Note that these values are relative to the top left corner of the application window, not relative to the top left corner of the screen itself.

width height

The width and height of the rendered frame, in pixels. This should not be larger than the size of the game window, which is set by the ResX and ResY parameters of any window configuration that uses this viewport.

screen

The name of the screen configuration that defines the frustum of 3D space that the Unreal Engine application should render into this viewport.

Screen Configurations

Each different output display renders the scene from the current camera's position, using a frustum that is defined by a rectangle with a defined size and placement in the 3D VR space. Each of these rectangles is defined by a screen configuration. Usually, each of these projection screens has the same dimensions in VR space as the physical screen that you'll use to render it.

The pivot point of a screen is always in its exact midpoint.

Example configuration:

This definition describes a screen that is 3 meters by 3 meters, directly in front of its parent. Because the pivot point of the screen is at the center of the rectangle defined by the size parameter, we add a 1.5-meter offset on the Z axis to move the screen upward by half its height.

[screen] id=screen_front loc="X=1.5,Y=0,Z=1.5" rot="P=0,Y=0,R=0" size="X=3,Y=3" parent=screens

To define a screen on the left side of the viewer, we move it to the left (a negative value on the Y axis) and rotate it by -90 degrees of yaw:

[screen] id=screen_left loc="X=0,Y=-1.5,Z=1.5" rot="P=0,Y=-90,R=0" size="X=3,Y=3" parent=screens

Parameters:

Parameter

Description

id

A unique name for this screen configuration.

loc

The location of the center of this screen in VR space, relative to its parent.

rot

The pitch (P), yaw (Y) and roll (R) angles of the screen's facing direction, in degrees.

size

The total size of the screen rectangle along its local X and Y axes, in meters.

parent

The name of a scene_node configuration that you want to act as the parent for this object. This parameter is optional. If you specify a parent, the values you set in the loc and rot parameters will be relative to the position of that parent. If you omit the parent, the values you set in the loc and rot parameters will be relative to the VR root.

Camera Configurations

Every instance in the nDisplay cluster renders the scene from the same camera position in the virtual world. Each camera configuration line defines one of these potential viewpoints.

You can switch between these viewpoints at runtime. Each camera viewpoint can also be driven by a tracking device.

Example configuration:

[camera] id=camera_static loc="X=0,Y=0,Z=1.7" tracker_id=VRPNTracking tracker_ch=0

Parameters:

Parameter

Description

id

A unique name for this camera configuration.

loc

The location of this camera in VR space, relative to its parent.

tracker_id

The name of the input configuration that defines the VR device you want to drive the position of the camera over time. Optional. If you omit this parameter, the camera's position will be static in VR space.

tracker_ch

When you provide a tracker_id, this parameter specifies the channel of that device that nDisplay will read tracking data from.

parent

The name of a scene_node configuration that you want to act as the parent for this object. This parameter is optional. If you specify a parent, the values you set in the loc parameter will be relative to the position of that parent. If you omit the parent, the values you set in the loc parameter will be relative to the VR root.

Scene Node Configurations

In your configuration file, you can define a hierarchy of scene nodes, each of which represents a transform in 3D space. Anything that you set up in the configuration file that requires a position and rotation in 3D space, such as a camera or a projection screen, can use one of these scene_node configurations as its parent. This can help you to define the spatial relationships between all of the different components of the visualization system.

Like cameras, scene nodes can also be driven by VR tracking devices.

Example configuration:

The following lines define a hierarchy of two nodes, where the child node has an offset of 2 meters in front of its parent.

[scene_node] id=vr_space_root loc="X=0,Y=0,Z=0" rot="P=0,Y=0,R=0"
[scene_node] id=walls_front_group loc="X=2,Y=0,Z=0" rot="P=0,Y=0,R=0" parent=vr_space_root

The following line shows a scene node that is configured to be driven by a VR tracking device:

[scene_node] id=cave_wand loc="X=0,Y=0,Z=1" tracker_id=CaveTracking tracker_ch=1

Parameters:

Parameter

Description

id

A unique name for this scene node configuration.

loc

The location of this scene node in VR space, relative to its parent.

rot

The pitch (P), yaw (Y) and roll (R) angles of the scene node's facing direction, in degrees.

parent

The name of another scene_node configuration that you want to act as the parent for this scene node. This parameter is optional. If you specify a parent, the values you set in the loc and rot parameters will be relative to the position of that parent. If you omit the parent, the values you set in the loc and rot parameters will be relative to the VR root.

tracker_id

The name of the input configuration that defines the VR device you want to drive the position of the scene node over time. Optional. If you omit this parameter, the scene node's position and rotation will be static in VR space with respect to its parent.

tracker_ch

When you provide a tracker_id, this parameter specifies the channel of that device that nDisplay will read tracking data from.

Input Configurations

You define an input section for each device that you need to provide input to the nDisplay system. For example, each camera and each scene_node may optionally be driven by a VR tracking device that you set up in an input section and refer to in the camera or scene_node configuration. Alternatively, you may want to set up trackers, controllers, and keyboards to send generic input events to the Unreal Engine input system, or bind their events and input values to generic nDisplay Blueprint nodes that you can respond to in your Project's gameplay scripts.

You can also use input_setup sections to control the way specific channels, buttons, or keys from these input devices are bound to specific types of input events and values within Unreal Engine.

For an overview of what you can do with nDisplay inputs, see Using VRPN Inputs.

Example configuration:

This configuration sets up nDisplay to get input from a VRPN location tracking device. Typically this kind of device is mounted to a camera or a viewer's head, or is held by a viewer. You can automatically drive the position of a camera or scene node from the tracker by referring to this input configuration in a camera or scene_node configuration. Or, you can retrieve the value of this tracker in your Project's Blueprint code.

[input] id=CaveTracking type=tracker addr=Tracker0@192.168.0.1 loc="X=-1.5,Y=0,Z=3.4" rot="P=0,Y=0,R=0" front=X right=Y up=-Z

This configuration sets up nDisplay to read keyboard input from a keyboard that is set up as a VRPN device, and to route that input through the built-in Unreal Engine keyboard inputs.

[input] id=ControlKeyboard type=keyboard addr=Keyboard0@192.168.0.1 reflect=ue4

Parameters:

Parameter

Description

id

A unique name for this input device configuration.

type

The type of this VRPN input device:

  • tracker for a tracking device.

  • analog for a device that produces axis data.

  • button for a device that produces Boolean button data.

  • keyboard for a standard computer keyboard.

addr

The address of the VRPN server that handles this particular device. The value must match the following format:
DEVICENAME@SERVER_ADDRESS:SERVER_PORT
where:

  • DEVICENAME is the VRPN name for this device.

  • SERVER_ADDRESS is the IPv4 address of the VRPN server.

  • :SERVER_PORT is the port the VRPN server listens on for incoming connections.
    This is optional. If you don't provide it, nDisplay uses port 3883 by default.

Devices where type=tracker also accept the following additional parameters:

Parameter

Description

loc rot

Similar to other configuration sections, the loc and rot parameters specify position and rotation offsets in local space for this input device. However, for an input device, you typically use these offsets to adjust the root position of a tracking device in VR space to match the location you expect it to be in your scene node hierarchy.

front right up

These parameters match each local axis of the tracker in Unreal (front, right, and up) with the corresponding axis in the tracker's coordinate system. Unreal uses a Z-up coordinate system, with X pointing forward and Y pointing to the right. If your tracker uses a different coordinate system, you can use these parameters to map the tracker's coordinate system to Unreal's.
For example, the following line maps the Y axis of the tracker to the front (X) axis in Unreal; the X axis of the tracker to the right (Y) axis in Unreal, and the negative Z axis of the tracker to the up (Z) axis in Unreal:
front=Y right=X up=-Z

Devices where type=keyboard also accept the following additional parameter:

Parameter

Description

reflect

Determines how the inputs from this keyboard are passed in to the Unreal Engine, and how you can respond to those events.
This setting accepts any one of the following values:

  • nDisplay

  • ue4

  • both

  • none

For more information, see Reflecting Keyboard Events.

Input Setup Configurations

Each input_setup configuration section provides additional configuration parameters for a specified input device, typically to bind a channel or key from that device to a generic nDisplay Blueprint input node.

Example configuration:

This configuration sets up the input device with ID controller so that when a button is pressed that generates an event on channel 0, an event is generated from the Input > N Display Events > nDisplay Button 0 node in Blueprint.

[input_setup] id=controller ch=0 bind="nDisplay Button 0"

This configuration is similar to the above, except that it binds an analog value (typically an axis from a controller) to an nDisplay analog value. You can use the Input > N Display Events > nDisplay Analog 0 node in Blueprint to detect when that controller axis is used, or Input > N Display Values > nDisplay Analog 0 to retrieve the value for the current frame.

[input_setup] id=test_axes ch=0 bind="nDisplay Analog 0"

If you're using a keyboard device, you don't have to specifically bind each of its keys. Instead, you simply use the reflect setting in the input section to determine whether the key events should be routed to built-in Unreal Engine keyboard events, or to nDisplay keyboard events. However, if you want to change a binding or add a new binding, you can. For example, this section makes the space bar trigger the Input > N Display Events > nDisplay Button 3 event.

[input_setup] id=keyboard0 key=Space bind="nDisplay Button 3"

Parameters:

Parameter

Description

id

Refers to the ID of the input configuration that this input_setup section configures.
Note that unlike most other sections in the nDisplay configuration file, this id value does not provide an ID for the input_setup section that contains it. Instead, it refers to the ID of an input section defined elsewhere in the file.

ch

Determines the channel of the specified input device that will be bound to the event that you set in the bind setting.

key

Similar to ch, but used only for input devices where type=keyboard.

bind

Determines the event in Unreal Engine that the channel or key specified above is bound to. This value can be the name of any Blueprint node that you see in the Input category, such as F1, nDisplay F1, nDisplay Button 0, Gamepad Left Thumbstick X-Axis, Gamepad Face Button Top, and so on.
If the name contains a space, you must enclose it in double-quotes.

You can also set up these channel and key bindings in your Project's Blueprint code, using the nodes in the input module API. For details, see Binding Device Channels to UE4 Inputs.

General Configuration

The general configuration line contains parameters that control the overall operation of the nDisplay cluster.

Example configuration:

[general] swap_sync_policy=1

Parameters:

Parameter

Description

swap_sync_policy

Determines how output is synchronized over the network.

  • 0: No synchronization.

  • 1: Software swap synchronization.

  • 2: NV swap lock (only for NVIDIA cards rendering with OpenGL).

Stereo Configuration

The stereo configuration line sets optional global parameters for stereoscopic rendering.

Example configuration:

[stereo] eye_dist=0.064

Parameters:

Parameter

Description

eye_dist

The inter-ocular distance to use for offsetting the images generated for the left and right eyes, in meters.

Network Configuration

The network configuration section provides settings that you can use to control timeouts and other settings related to network communication between nDisplay cluster nodes.

You can only have zero or one network section in your nDisplay configuration file.

Example configuration:

[network] cln_conn_tries_amount=10 cln_conn_retry_delay=1000 game_start_timeout=30000 barrier_wait_timeout=5000

Parameters:

Parameter

Description

cln_conn_tries_amount

When a non-master cluster node starts up, this setting determines the number of times the node will attempt to connect to the master.
Optional; the default value is 10.

cln_conn_retry_delay

When a non-master cluster node starts up, this setting determines the time interval between each successive attempt by the node to connect to its master, in milliseconds.
Optional; the default value is 1000.

game_start_timeout

Sets a time interval that each Unreal Engine application on each cluster node will wait before it starts the first frame of the game loop and begins rendering to the main window, in milliseconds. This gives all your cluster nodes a chance to connect to the master before rendering begins. During this time, the main window will be black. If, at the end of this time interval, any node has not yet successfully connected, all instances in the cluster will shut down.
Optional; the default value is 30000. You may need to raise this value if your cluster takes an unusually long time to initialize.

barrier_wait_timeout

Sets the timeout for the barrier that synchronizes the game and render threads across cluster nodes, in milliseconds. This barrier is used several times within each frame, so nDisplay also uses it at runtime to detect situations where a node becomes unreachable. If that happens, the state of the cluster is considered invalid, and all nodes shut themselves down.
Optional; the default value is 5000.

The cln_conn_tries_amount and cln_conn_retry_delay settings work together to determine the maximum length of time your cluster nodes will try to connect to the master node at startup. For example, suppose you have cln_conn_tries_amount set to 10, and cln_conn_retry_delay set to 1000 milliseconds. On startup, each node tries to connect to the master. If that connection fails, it waits 1000 milliseconds to try again. If that attempt also fails, it waits another 1000 milliseconds. After ten successive failures, the cluster node quits automatically. As soon as a cluster node makes the connection to its master, the count stops.

Info Configuration

The info configuration line contains optional information about the latest version of nDisplay and Unreal Engine that this configuration file is known to be compatible with.

Example configuration:

[info] version=22

Parameters:

Parameter

Description

version

The latest version of nDisplay and Unreal Engine that this configuration file is known to be compatible with.
The number is the minor version that follows the 4. prefix. For example, a value of 22 means that the file is compatible with version 4.22 of nDisplay and Unreal Engine.

Do not set this value by hand. The nDisplay Launcher sets it automatically. If you use a configuration file without a version, or where the version in the file is lower than the version of nDisplay and Unreal Engine you are using, the nDisplay Launcher automatically attempts to update your configuration file to work with the latest version. If it succeeds, it saves the updated configuration to a new file and updates this value to the latest version.

Configuration File Example

To take a specific example, open the wall_flat_3x2.cfg sample file. This file defines six projection screens, each of which is to be rendered by a separate physical computer.

It also defines several scene_nodes, which taken together create the following hierarchy:

nDisplay example scene hierarchy

The relative positions and rotations of the nodes in this hierarchy lay out the arrangement of the camera and the six screens in VR space so that the six projection screens are side-by-side, at a distance of 1 meter from the camera. 

Note that the configuration implies a small space in between each adjacent pair of projection screens, to account for the edges of the monitors rendering the scene.
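
If you don't have the sample file open, the following hypothetical excerpt shows the general shape of such a hierarchy: a root scene node, a child node for the wall group, a camera, and two screens positioned side by side relative to the wall group with a small gap between them. The IDs, offsets, and sizes here are illustrative placeholders, not values copied from wall_flat_3x2.cfg.

[scene_node] id=vr_root loc="X=0,Y=0,Z=0" rot="P=0,Y=0,R=0"
[scene_node] id=wall_group loc="X=1,Y=0,Z=0" rot="P=0,Y=0,R=0" parent=vr_root
[camera] id=camera_main loc="X=0,Y=0,Z=0" parent=vr_root
[screen] id=screen_left loc="X=0,Y=-0.51,Z=0" rot="P=0,Y=0,R=0" size="X=1,Y=1" parent=wall_group
[screen] id=screen_right loc="X=0,Y=0.51,Z=0" rot="P=0,Y=0,R=0" size="X=1,Y=1" parent=wall_group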

Blueprint API

You can control the behavior of the nDisplay system in your game's runtime logic using its Blueprint APIs.

To get to the functions exposed in these APIs:

  1. For most nDisplay Blueprint functions related to cluster management, querying input devices, nDisplay rendering, and more, create a new N Display > Get DisplayCluster Module API node in your Blueprint.
    For functions that set up bindings and reflection between VRPN input devices and Unreal Engine input events, create a new N Display > Get DisplayClusterInput Module API node in your Blueprint. See also Binding Device Channels to UE4 Inputs.

  2. Drag from the Out API pin of your node, and look under the Display Cluster or Display Cluster Input category:

Actor Replication

All inputs to the nDisplay system are handled only by the master node. Without any replication, only the master node would see changes in the scene. Therefore, the master node needs to be able to replicate changes to all other parts of the nDisplay network.

To accomplish this, nDisplay offers two different kinds of Components that you can attach to your Actors:

  • The DisplayClusterSceneComponentSyncParent Component tracks changes in the 3D transforms of its parent Component, and pushes those changes to the other cluster nodes in the network.
    The default DisplayClusterPawn used by the nDisplay system uses this Component.

  • The DisplayClusterSceneComponentSyncThis Component tracks changes to the 3D transforms of its child components, and pushes those changes to the other cluster nodes in the network.

For example, in the Actor shown below, the DisplayClusterSceneComponentSyncParent_DefaultSceneRoot Component tracks and replicates changes to the 3D transforms of its parent Actor as the Actor moves around the Level. The DisplayClusterSceneComponentSyncThis Component tracks and synchronizes movements of its child Cube component as it moves relative to the scene graph root.

DisplayClusterSceneComponentSyncParent

If you have other Actors in your scene that can be affected during gameplay, you must use one of these two Components to replicate those changes to all nodes. To do this:

  1. Select the Actor you need to replicate in the Level viewport or the World Outliner panel.

  2. In the Details panel, click + Add Component. Search for either DisplayClusterSceneComponentSyncParent or DisplayClusterSceneComponentSyncThis, and select it from the list.
    Add an nDisplay sync Component

These components do not carry out a full replication. Only the transforms of the parent Actor or of child Components are sent to the cluster.

Using VRPN Inputs

To use a VRPN input device with nDisplay:

  1. Install a VRPN server on your network.
    This version of nDisplay requires VRPN version 7.33.

  2. In the server’s vrpn.cfg file, which you'll find located next to the server's executable file, enable your input device and give it a name.

  3. In your nDisplay configuration file, add an input entry to set up your VRPN input device.

Depending on the kind of input device you need to set up, and how you want to apply the input from that device to your Unreal Engine Project, you may need to use different settings for your input configuration and carry out some additional steps. See the following sections for details on all the different options you have.

Mapping a VRPN Tracker to an nDisplay Camera or Scene Node

If you have a VRPN tracker device, you can map its current position directly to any nDisplay camera or scene node that you have set up in your nDisplay configuration file. As you move the tracker around in real space, the location of that camera or scene node will automatically update in virtual space to follow.

You can do this in your nDisplay configuration file, by setting up the input and camera sections.

The following example shows one way to set up the ART DTrack tracking system:

  • In vrpn.cfg, located next to the VRPN server executable, add the following line:

    vrpn_Tracker_DTrack DTrack  5000

    This makes VRPN receive the DTrack inputs from port 5000, and maps them to the VRPN device named DTrack. (Make sure DTrack is configured to output its tracking data on port 5000.)

  • In the nDisplay config file, add the following lines:

    [input] id=CaveTracking type=tracker addr=DTrack@127.0.0.1 loc="X=1.32,Y=0,Z=0.93735" rot="P=0,Y=0,R=0" front=Z right=-X up=Y
    [camera] id=camera_dynamic loc="X=0,Y=0,Z=0" tracker_id=CaveTracking tracker_ch=0

    The first line creates an nDisplay input device named CaveTracking, which fetches data from VRPN address DTrack@127.0.0.1. You have to adapt the coordinate system here to fit with your tracking system and offset. The second line, the camera configuration, tells nDisplay to fetch camera positions from the CaveTracking input on channel 0.

Reflecting Keyboard Events

If you have a keyboard device set up for your VRPN server, you can control how the events from that keyboard (key presses and releases) are reflected in your Unreal Engine Project. You can make the keyboard keys trigger the standard UE4 keyboard input system, you can direct the keys to trigger new keyboard inputs provided by nDisplay, you can do both, or you can do neither.

You can set this up in either of two equivalent ways:

  • You can do it in your nDisplay configuration file, by setting the reflect option for the input section that defines your keyboard device:

    [input] id=ControlKeyboard type=keyboard addr=Keyboard0@192.168.0.1 reflect=ue4
  • Or you can do it in your Project's Blueprint code, by calling the Set VRPN Keyboard Reflection (Interface Call) function:
    Keyboard reflection in Blueprint

The reflection setting accepts any one of the following values (the equivalent Blueprint option is shown in parentheses):

  • nDisplay (nDisplay buttons only): Keyboard events are routed to new keyboard input events created by nDisplay. You can respond to these events in your Blueprint scripts using the nodes in the Input > N Display Keyboard Events category.

  • ue4 (Native UE4 keyboard events): Keyboard events are routed through the input system built in to Unreal Engine. You can respond to these events in the InputController class you use in your application, or in your Blueprint scripts using the nodes in the Input > Keyboard Events category.

  • both (Both nDisplay and UE4 native): Keyboard events are routed through both the nDisplay keyboard handling system and the input system built in to Unreal Engine. You can respond to these events using either of the methods described above.

  • none (No reflection): Keyboard events are not routed through either the nDisplay keyboard handling system or the built-in input system. If you use this option, you'll need to either:

  • Query for specific events using Blueprint nodes from the nDisplay Blueprint API, like Display Cluster > Input > Was VRPN Button Pressed or Display Cluster > Input > Was VRPN Button Released. See also Querying for Device Inputs.

  • Bind individual keys to other input events in UE4. For details, see Binding Device Channels to UE4 Inputs.

When you set up reflection for a keyboard device, your setting applies to all keys on that device. However, you can still re-bind individual keys to other input events in UE4. For details, see Binding Device Channels to UE4 Inputs.

Binding Device Channels to UE4 Inputs

You can make your Project respond to VRPN input devices by binding specific channels from your VRPN devices to Unreal Engine events and motion sources. You can create these bindings in either of two equivalent ways:

  • By setting up an input_setup section in your nDisplay configuration file for each channel you want to bind to a motion source or event.

  • By using the functions available in the nDisplay input module API:
    Bind VRPN device channel

Example: Binding a Tracking Device to a Motion Source

You can bind a VRPN motion tracking device to any existing Motion Source in Unreal Engine. You can then use that Motion Source to drive a MotionControllerComponent that you assign to an Actor in your Level.

To set this up:

  1. You'll need your nDisplay configuration file to have an input section that defines the tracker. For example:

    [input] id=TestTrack type=tracker addr=Tracker0@127.0.0.1 loc="X=0,Y=0,Z=0" rot="P=0,Y=0,R=0" front=X right=Y up=Z
  2. You'll also need to bind the device and channel that you want to track to the Motion Source that you want to receive its input.
    You can do this by adding an input_setup section to your configuration file:

    [input_setup] id=TestTrack ch=0 bind="Special_1"

    Or you can do it in your Project's Blueprint code, by calling the Bind VRPN Tracker (Interface Call) function and setting the same values:
    Bind VRPN Tracker

Example: Binding an Analog Device

An analog VRPN device gives input values that range from 0 to 1, similar to a mouse or thumbstick input in Unreal Engine.

nDisplay includes a set of 20 generic analog inputs that you can bind your analog VRPN devices to. You'll find them under the Input > N Display Events and Input > N Display Values categories.

nDisplay generic analog events and values

You don't have to use these nDisplay analog inputs; you can also bind the VRPN device to other UE4 inputs. The following example shows both.

To set this up:

  1. You'll need your nDisplay configuration file to have an input section that defines the analog device. For example:

    [input] id=TestAxes type=analog addr=Mouse0@127.0.0.1
  2. You'll also need to bind the device and channel that you want to track to the analog input that you want to receive its input. Often analog devices have two channels — one for input on an X axis and one for the Y axis. In this case, you'll typically want to bind the two axes separately to different nDisplay analog inputs.
    You can do this by adding two input_setup sections to your configuration file:

    [input_setup] id=TestAxes ch=0 bind="nDisplay Analog 0"
    [input_setup] id=TestAxes ch=1 bind="Gamepad Left Thumbstick Y-Axis"

    Or you can do it in your Project's Blueprint code, by calling the Bind VRPN Channel (Interface Call) function and setting the same values:
    Bind VRPN Channel for analog device

  3. When you need to detect that an input event occurred, or get the actual value of the input along its axis, use the input events that you've bound your VRPN axis channels to.
    For example, in this case you'd use:

    • for the first axis, Input > N Display Events > nDisplay Analog 0 to respond to an input event, and Input > N Display Values >  nDisplay Analog 0 to retrieve the current axis value.

    • for the second axis, Input > Gamepad Events > Gamepad Left Thumbstick Y-Axis to respond to an input event, and Input > Gamepad Values > Gamepad Left Thumbstick Y-Axis to retrieve the current axis value.

Example: Binding a Button Device

A button VRPN device fires an event on a given channel each time a button is pressed or released.

nDisplay includes a set of 20 generic button event inputs that you can bind your button devices to. You'll find them in the Input > N Display Events category.

nDisplay generic button events

You don't have to use these nDisplay button inputs; you can also bind the VRPN device to other UE4 inputs. The following example shows both.

To set this up:

  1. You'll need your nDisplay configuration file to have an input section that defines the button device. For example:

    [input] id=TestBtn type=buttons addr=Mouse0@127.0.0.1
  2. You'll also need to bind the device and channel that you want to track to the button input that you want to receive its input.
    You can do this by adding input_setup sections to your configuration file:

    [input_setup] id=TestBtn ch=0 bind="nDisplay Button 0"
    [input_setup] id=TestBtn ch=2 bind="Gamepad Face Button Top"

    Or you can do it in your Project's Blueprint code, by calling the Bind VRPN Channel (Interface Call) function and setting the same values:
    Bind VRPN Channel for button device

  3. When you need to detect that an input event occurred, use the input event that you've bound your VRPN button channel to.
    For example, in this case you would use the Events > N Display Events > nDisplay Button 0 and Events > Gamepad Events > Gamepad Face Button Top nodes.

Example: Binding a Keyboard Device

As described under Reflecting Keyboard Events above, you can make VRPN keyboards map their inputs to built-in Unreal Engine keyboard inputs, to new nDisplay keyboard inputs, both, or neither. Regardless of what type of reflection you set up, you can also bind individual keys from your keyboard to other Unreal Engine or nDisplay input events.

To set this up:

  1. You'll need your nDisplay configuration file to have an input section that defines the keyboard device. For example:

    [input] id=TestKb type=keyboard addr=Keyboard0@127.0.0.1
  2. You'll also need to bind the device and the key that you want to track to the Unreal Engine input event that you want that key to trigger.
    You can do this by adding an input_setup section to your configuration file:

    [input_setup] id=TestKb key="Space Bar" bind="Gamepad Left Trigger"

    Or you can do it in your Project's Blueprint code, by calling the Bind VRPN Keyboard (Interface Call) function and setting the same values:
    Bind VRPN Keyboard

  3. When you need to detect that an input event occurred, use the input event that you've bound your VRPN button channel to.
    For example, in this case you would use the Event > Gamepad Events > Gamepad Left Trigger node.

Querying for Device Inputs

Instead of relying on bindings to input events, you can directly query your VRPN devices to find out their current state.

  • In C++, use the IDisplayClusterInputManager class.

  • In Blueprints, use the functions in the nDisplay API under DisplayCluster > Input. The values that you provide for the Device Id and Device Channel inputs on these nodes must match the values that you have set for the device in the input section of your nDisplay configuration file.
    For example:

Keyboard Buttons and Device Channels

A VRPN keyboard input is essentially a specialized type of button device. If you need to query whether a given keyboard button was pressed, use the Was VRPN Button Pressed (Interface Call) function:

Detecting whether a VRPN button was pressed

For this to work, you need to set the Device Channel input to the numeric ID assigned by VRPN to the button that you want to test.

To determine the numeric ID of the Device Channel that corresponds to the button that you want to test on your keyboard, you can run the vrpn_print_devices.exe application that you'll find provided with the VRPN distribution. While this tool is running, it prints to the console the numeric ID of any button you press on your keyboard.

For example, the spacebar is key number 57:

Find the numeric ID of a keyboard button

Using Cluster Events

Cluster Events are a way for you to make all the nodes in your nDisplay cluster respond to events simultaneously.

  1. You generate a Cluster Event either from a node in the cluster or by sending it to the master node from an external application. See Emitting Cluster Events from Blueprints or Emitting Cluster Events from External Applications.
    When you have an nDisplay cluster up and running, you can also use the nDisplay Launcher application to send new Cluster Events for your cluster nodes to respond to. See Emitting Cluster Events from the nDisplay Launcher.

  2. When the master node of your cluster receives a Cluster Event, it propagates that event to each node in the cluster so that the event happens on each node in exactly the same frame.

  3. Within the Blueprint or C++ logic of your Unreal Engine application, you set up listeners to detect these Cluster Events and respond to them with whatever gameplay logic you need for your Project. See Responding to Cluster Events in Blueprints.

Cluster Event Structure

Each nDisplay Cluster Event can contain several properties:

  • Name: string

  • Type: string

  • Category: string

  • Parameters: an optional map of key-value pairs, where the keys and values are both strings.

It's up to you to decide in your Project what data you want to send in each of these properties, and how you want your listeners to interpret that data.

When you interact with a Cluster Event in Blueprint, you'll use the Make DisplayClusterClusterEvent and Break DisplayClusterClusterEvent nodes to construct and deconstruct Cluster Events. For example:

A Cluster Event in Blueprint

In C++, or when emitting Cluster Events from your own applications, you'll use JSON to express the same structure. For example, the equivalent JSON for the Cluster Event above is:

{"Name":"activate","Type":"command","Category":"particles","Parameters":{"rate":"200","speed":"3"}}

Emitting Cluster Events from Blueprints

To emit a Cluster Event from a Blueprint class in your Project:

  1. Get the DisplayCluster Module API (see Blueprint API above), and call its Cluster > Emits cluster event (Interface Call) function. This node fires the Cluster Event out to the master node, which propagates it back to all the nodes in the cluster.
    Emits cluster event

  2. By default, every instance of your Unreal Engine application that evaluates this Blueprint node in its gameplay logic will fire this Cluster Event. If this Blueprint graph gets evaluated on many different nodes in your cluster, this can cause multiple copies of the event to happen.
    To avoid triggering multiple copies of the Cluster Event, you can set the Master Only Boolean value on the Emits cluster event node. If you check this box, only the master node will emit this Cluster Event. If any other non-master cluster node evaluates the same Blueprint graph, those nodes will not emit the event.
    Master Only

  3. Drag left from the Event port on the Emits cluster event node, and choose Make DisplayClusterClusterEvent.
    Make DisplayClusterClusterEvent

  4. Use the settings in the Make DisplayClusterClusterEvent node to set up your Cluster Event with string values for its Name, Type, and Category. If you need to pass arbitrary key-value data along with your cluster event, you can also pass a map of those keys and values to the Parameters input.
    Creating and emitting a Cluster Event

  5. Compile and Save your Blueprint.

The next time you repackage your Project and relaunch your nDisplay cluster, this Blueprint code will fire the Cluster Event you've set up. To respond to this event elsewhere in your Blueprint code, see Responding to Cluster Events in Blueprints.

Emitting Cluster Events from External Applications

When you start up your nDisplay cluster, the master node begins listening for incoming Cluster Events on a specific local port. You can emit new Cluster Events to your nDisplay system from another application running on any other computer in your network by connecting to that port and sending messages.

For each Cluster Event you want to emit, your message must follow this convention:

  • The first two bytes must give the total length of the rest of the message.

  • The rest of the message should be the content of your Cluster Event, expressed as a JSON object.

For example, to emit a Cluster Event with the name "quit" and the type "command", you would need to:

  1. Construct a JSON string that contains the values for your Cluster Event. In this case:

    {"Name":"quit","Type":"command","Category":"","Parameters":{}}

    The Name, Type, and Category fields are mandatory, but you may omit the Parameters field.

  2. Get the length of the JSON string — in this case, 62 characters — and send that length to the nDisplay master node.

  3. Send the JSON string itself to the nDisplay master node.
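
As a concrete illustration, the following minimal Python sketch sends the example "quit" event above to the master node and reads the response described next. The master address is a placeholder, the port is the default mentioned further down this page, and the assumption that the two-byte length prefix is an unsigned little-endian integer is exactly that, an assumption to verify against your own setup.

    # send_cluster_event.py -- hypothetical external sender for nDisplay Cluster Events.
    import json
    import socket
    import struct

    MASTER = ("192.168.0.1", 14003)  # placeholder address; 14003 is the default cluster-events port

    def recv_exact(sock, count):
        """Read exactly `count` bytes from the socket."""
        data = b""
        while len(data) < count:
            chunk = sock.recv(count - len(data))
            if not chunk:
                raise ConnectionError("connection closed by the master node")
            data += chunk
        return data

    event = {"Name": "quit", "Type": "command", "Category": "", "Parameters": {}}
    payload = json.dumps(event, separators=(",", ":")).encode("ascii")

    with socket.create_connection(MASTER) as sock:
        # Two-byte length prefix, then the JSON body (byte order assumed little-endian).
        sock.sendall(struct.pack("<H", len(payload)) + payload)

        # The response follows the same convention: a two-byte length, then a JSON
        # object that typically contains an Error field.
        response_length = struct.unpack("<H", recv_exact(sock, 2))[0]
        response = json.loads(recv_exact(sock, response_length).decode("ascii"))
        print("Master responded:", response)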

If the master node is able to receive and process your message, it sends a response back to your application that follows the same convention:

  • The first two bytes tell you the length in bytes of the rest of the response message.

  • The rest of the message is a JSON object that typically contains one field: Error. The value of this field is an error code that indicates whether your message was handled correctly:

    • 0: No error occurred. The message was processed successfully.

    • 2: The Cluster Event you sent is missing one or more mandatory fields. Make sure it has fields for the Name, Category, and Type, even if their values are empty.

    • 255: An unrecognized error occurred.

By default, the master node listens for Cluster Events on port 14003. You can change this default in your nDisplay configuration file. See Changing Communication Ports below.

To respond to these Cluster Events in your Project's Blueprint code, see Responding to Cluster Events in Blueprints.

Emitting Cluster Events from the nDisplay Launcher

While your nDisplay cluster is running, you can use the nDisplayLauncher application to send Cluster Events in to your master node at any time.

  1. Start up your nDisplay cluster as usual.

  2. Switch to the Cluster events tab. You'll use this tab to set up Cluster Events and send them to your cluster on demand.

  3. Click New to add a new Cluster Event to the list on this tab.
    Cluster events tab

  4. Use the Cluster event editor window to set up the Name, Type, and Category values for the Cluster Event.
    Settings in the Cluster event editor

  5. You can also add key-value pairs to the Cluster Event's list of parameters. Set the key in the Argument field, set the value in the Value field, then click the + button.
    Parameters in the Cluster event editor

    If you need to modify a parameter after you create it, select it in the list on the right and click - to delete it, then create a new parameter with the settings you need.

  6. Click Apply to save your new Cluster Event.

  7. Back in the Cluster events tab, select your Cluster Event in the list and click Send.
    Send the Cluster Event

    You can also click Modify to reopen the selected Cluster Event to edit its values.

  8. Watch the log output panel at the bottom of the nDisplayLauncher to see the response from the master node.
    Response from the master node in the log output panel

To respond to these Cluster Events in your Project's Blueprint code, see Responding to Cluster Events in Blueprints.

Responding to Cluster Events in Blueprints

Once you've set up one of the methods described above to emit Cluster Events into your nDisplay network, you'll want to set up your Blueprint (or C++) gameplay logic to detect those Cluster Events and respond to them in some way. To do this, you need to create and register a listener: a class that implements the DisplayClusterClusterEventListener interface. You register the listener by calling the Add Cluster Event Listener function from the nDisplay API, then use the Event On Cluster Event node to detect Cluster Events and respond to them.

For example, to create a new Blueprint class and register it as a listener:

  1. In the Content Browser, right-click and choose Create Basic Asset > Blueprint Class.
    Create Blueprint Class

  2. Choose Actor as the parent class.
    Actor

  3. Type a name for your new listener class in the Content Browser.
    Rename the class

  4. Drag your class into the Level Viewport and drop it into your Level.
    Drag and drop the Blueprint into the Level

  5. Double-click your new Blueprint class to edit it.

  6. In the Toolbar, click Class Settings.
    Class Settings

  7. In the Details panel, find the Interfaces > Implemented Interfaces setting and click Add.
    Add interface

  8. Find and select the DisplayClusterClusterEventListener interface in the list.
    DisplayClusterClusterEventListener

  9. Click Compile in the Toolbar to compile your class.

  10. On the Event Graph tab, set up the following graph to register your listener:

    To set this up:

    1. Drag right from the output of the Begin Play Event node and choose N Display > Get DisplayCluster Module API.

    2. Drag right from the Out API port of that node and choose Display Cluster > Cluster > Add cluster event listener (Interface Call).

    3. Finally, drag left from the Listener port of the Add cluster event listener node, and choose Variables > Get a reference to self.

  11. It's a good idea to also remove each listener you register when you know you won't need it anymore. For example, you can do this when your Blueprint Actor is destroyed:

    To set this up:

    1. Right-click in the Event Graph and choose the Add Event > Event Destroyed node.

    2. Drag right from the output of the Event Destroyed node and choose N Display > Get DisplayCluster Module API.

    3. Drag right from the Out API port of that node and choose Display Cluster > Cluster > Remove cluster event listener (Interface Call).

    4. Finally, drag left from the Listener port of the Remove cluster event listener node, and choose Variables > Get a reference to self.

  12. In another area of the Event Graph, add the Add Event > N Display > Event On Cluster Event node. Every time a Cluster Event happens in your nDisplay cluster, this event will trigger.
    You will probably want to read the settings and parameters assigned to this event, so that you can use them to determine what action your Blueprint needs to take. To do this, drag right from the Event port of the Event On Cluster Event node, and choose Break DisplayClusterClusterEvent.
    For example, you could connect the Name output of the Break DisplayClusterClusterEvent node to a Print String node to print the name of each Cluster Event to the screen.

  13. Compile and Save your Blueprint class.

The next time any Cluster Event is emitted in the cluster, from any source, the name of that Cluster Event gets printed to the screen.

Changing Communication Ports

The nDisplay system communicates between hosts over three TCP/IP ports: 14000, 14001, and 14002. In addition, the master node listens for Cluster Events on port 14003 by default, as described above. Make sure all of these ports are open on every computer in the cluster.

If you want to change the port numbers yourself, you can do so in the following places.

  • Runtime synchronization ports - The master node uses two ports to synchronize data with the other nodes in the cluster. To set these two ports, include the port_cs and port_ss configuration parameters in your configuration file, on the cluster_node line that defines your master node. For example:

    [cluster_node] id=node_front addr=192.168.0.1 screen=screen_front viewport=vp_front port_cs=42001 port_ss=42002 master=true
  • Cluster event ports - The master node always uses the same port to exchange cluster events with connected clients. This includes both other nodes in the nDisplay cluster, and any external applications that you write to send and retrieve cluster events. To set this port, include the port_ce configuration parameter in your configuration file, on the cluster_node line that defines your master node. For example:

    [cluster_node] id=node_front addr=192.168.0.1 screen=screen_front viewport=vp_front port_ce=42003 master=true
  • nDisplay Launcher and nDisplay listener ports - The nDisplay Launcher and nDisplay Listener both need to be configured to use the same communication port. You can specify this on the command line when you start up these applications.
    When you start the nDisplay Launcher, use the listener_port argument. For example:

    nDisplayLauncher.exe listener_port=15003

    In addition, you'll have to start the nDisplayListener application on each host yourself, with the port argument. For example:

    nDisplayListener.exe port=15003

Adding nDisplay to an Existing Project

You don't have to use the nDisplay Template Project in order to render through nDisplay. If you already have a different Project set up with your content, you can adjust that Project so that it can take advantage of nDisplay.

In the following steps, you'll set up your Project with a special GameMode that contains custom classes pre-built to work with nDisplay. You cannot currently use nDisplay with your own choice of GameMode, Pawn, and Controller classes.

To set up an existing Project to use nDisplay:

  1. Enable the nDisplay plugin.
    In the Unreal Editor, choose Edit > Plugins from the main menu. Search for "nDisplay", and check the Enabled checkbox.

  2. Enable nDisplay for your Project.
    Choose Edit > Project Settings from the main menu, and find the Plugins > nDisplay section. Check the Enabled checkbox.

  3. Still in the Project Settings window, go to the Project > Description section, and check the Settings > Use Borderless Window checkbox.

  4. Restart the Unreal Editor, reopen your Project, and open your Project's default Level.

  5. In the World Settings panel, set the Game Mode > GameMode Override setting to DisplayClusterGameModeDefault.

  6. Add a new DisplayClusterSettings Actor to your Level.
    You can find this Actor in the Modes panel, on the All Classes list.

  7. Continue on with the rest of the setup instructions under Getting Started above.

nDisplay Launcher UI Reference

This section describes all of the settings and options available in the user interface of the nDisplay Launcher.

Launcher Tab

  • Render API - Specifies the rendering API to use the next time you click Run.

  • Render mode - Specifies the type of output nDisplay produces on every cluster node:

    • Mono - a single monoscopic rendering of the scene from the point of view of the camera.

    • Frame sequential - active quad buffer stereo.

    • Side-by-side - passive horizontally aligned stereo.

    • Top-bottom - passive vertically aligned stereo.

    Mono does not require any specific hardware features, but frame sequential does. Make sure your display device, GPU, and driver settings are compatible with the render mode you choose.

  • Use All Available Cores - Forces each Unreal Engine instance to use all available processors on its host. When this option is selected, the nDisplay Launcher adds the USEALLAVAILABLECORES option to the command line it uses to launch each instance.

  • No Texture Streaming - Disables texture streaming for each Unreal Engine instance, so the highest-quality textures are always loaded. When this option is selected, the nDisplay Launcher adds the NOTEXTURESTREAMING option to the command line it uses to launch each instance.

  • Custom command line arguments - If you want the nDisplay Launcher to pass any additional arguments on the command line it uses to launch each Unreal Engine instance, include them here. For details, see the Command-Line Arguments reference.

  • Custom ExecCmds - If you want the nDisplay Launcher to pass any console commands for the Unreal Engine to execute at startup, enter them here. The nDisplay Launcher passes these commands to each instance of your application using the -ExecCmds command-line parameter.

  • Applications - Lists all packaged Unreal Engine applications that you can run with the nDisplay Launcher. Use the Add and Delete buttons below the panel to edit the list. For more, see Step 3. Package and Deploy above.

  • Config Files - Lists all configuration files that you've set up for the nDisplay Launcher. Use the Add and Delete buttons to the right of the drop-down to edit the list. For more, see Step 3. Package and Deploy above.

  • Run - Tries to connect to every cluster node configured in the file selected in the Config Files list, and tells the nDisplay Listener on each host to launch the packaged application selected in the Applications list.

  • Kill - Tries to connect to every cluster node configured in the file selected in the Config Files list, and tells the nDisplay Listener on each host to shut down the packaged application selected in the Applications list.

Cluster Events Tab

You can use the Cluster events tab to set up new Cluster Events and send them to your nDisplay cluster.

  • New - Click to set up a new Cluster Event in the Cluster event editor and add it to the list view. See Emitting Cluster Events from the nDisplay Launcher above.

  • Modify - If you have a Cluster Event selected in the list view, click Modify to open that Cluster Event in the Cluster event editor so you can edit its values.

  • Delete - If you have a Cluster Event selected in the list view, click Delete to remove that Cluster Event from the list.

  • Send - If you have a Cluster Event selected in the list view, and you have an nDisplay cluster currently up and running, click Send to emit the selected Cluster Event to your nDisplay cluster. Check the log output panel to see the response sent back by the master node.

Logs Tab

By default, nDisplay saves log messages from a variety of sources to the Saved/Logs folder next to your packaged application. You can use the Logs tab to configure the level of information that nDisplay logs from each different source.

  • Use custom log settings - When checked, nDisplay uses the log levels you set on this page for each source listed on the left.

  • For each - Click any of these buttons to set all sources on the left to log messages of the selected severity or higher.

Log Output Panel

The log output panel lists the results of all the commands you carry out in the nDisplay Launcher.

  • Copy - Copies all messages listed in the log window on the left to the clipboard.

  • Clean - Clears all messages from the log window on the left.
