In-Camera VFX Quick Start

A guide to your first steps working with in-camera VFX in an LED Volume


This Quick Start page shows the process of setting up a project in Unreal Engine to work with in-camera VFX. At the end of this guide, you will:

  • Have a synchronized cluster of nDisplay nodes.

  • Have an inner and outer frustum for in-camera VFX.

  • Have a real-time camera tracking system integrated via Live Link.

  • Have a green screen with chroma key markers that can be toggled on.

  • Have Color Correction Volumes that adjust the lighting and color to match the physical set.

  • Be able to launch all clustered nodes and test on set.

Step 1 - Set up Your Project for In-Camera VFX

The easiest way to set up an in-camera VFX project is to use the In-Camera VFX Template.

  1. Launch Unreal Engine and click Create Project.

  2. Select the Film, Television, and Live Events template category, then click Next.

    Film, Television and Live Events template category

  3. Click InCamera VFX, then click Next.

    InCamera VFX template selection

  4. Choose whether to include starter content and whether to enable ray tracing, and select the path and name for your project.

  5. Click Create Project.

A new project will be created from the template. Under Content > InCamVFXBP > Maps, you will find three example maps, LED_CornerStage, LED_CornerStage2, and LED_CurvedStage, each with associated tools and assets. For more information about the template, refer to the documentation linked above.

If the version of UE you are working with doesn't have the InCamera VFX template, you can create your project using the In-Camera VFX Example project.

  1. Open the Epic Games Launcher.

  2. In the Learn tab, find the In-Camera VFX Example project.

  3. On the project page, click Free.

  4. Click Create Project.

  5. Specify the location on your machine to save the project and select Create.

  6. Launch Unreal Engine and open the In-Camera VFX Example project.

In the example project, under Content > Maps, there are two levels, Main and EmptyStage. For learning about in-camera VFX with Unreal Engine, open the Main level. For starting with a clean setup, use EmptyStage as the base for your project. The example project and levels automatically enable the necessary plugins, provide helpful Blueprints, configure additional settings, and include sample configuration files.

Plugins

  • nDisplay: Unreal Engine's technology for rendering on multiple displays.

  • Live Link: Unreal Engine's API for ingesting live data, such as motion capture and camera tracking.

  • Live Link Over nDisplay: The primary node receives Live Link data and redistributes the tracking data in an efficient and synchronized manner.

  • Multi-User Editing: Multiple editors can be in a shared session.

  • Virtual Production Utilities: Utility plugins useful for Virtual Production.

  • Media Framework Utilities: Utility plugins related to live video, timecode, and genlock on SDI capture cards.

  • Aja or Blackmagic Media Player: Provides support for SDI capture cards.

Blueprints

In the In-Camera VFX Example project, you can see that the levels have been set up with a very specific hierarchy. This hierarchy is used to control the stage positioning, and ensures that both the physical stage and the virtual stage are represented.

Blueprints included in the in-camera VFX example project

  • BP_StageOrigin: A top-level control for moving all tracked cameras, characters, and props attached to the stage as child actors.

  • BP_InCameraStageSettings: Contains settings to customize the stage. A default chroma-key marker is provided in the project, but you can change the texture on this Asset in the Chromakey section of the Details panel. In the Rendering section, you can add objects to the Hidden Layers array if you want them to be in the level but not displayed on the LED wall. You can specify light card layers in the Light Cards section; these lights are only visible in the outer frustum, in order to provide light to the LED stage.

  • BP_SamplePawn: Creates a starting position for the tracked camera in the virtual world and drives the overall movement of that camera. The Pawn must include a Camera component to function properly and needs to have Auto Possess Player set to Player 0.

  • BP_IncameraSettings: Assigns the camera used for the inner frustum. The camera assignment is specified in the Reprojected Cine Camera property, located in the Default section of the Details panel. This Blueprint also determines the blending region of the inner frustum and the FOV multiplier, found in the Incamera Frame section.

  • BP_WarpMonitor: Assigns the static meshes used to describe the geometry of the screens and to warp the viewports. The meshes need to be modeled to scale in centimeters and offset relative to the tracking origin to match the LED screen.

  • InnerFrustumCamera: A Cine Camera Asset that is driven by the camera tracking in the virtual world.

  • GreenScreenPlane: The Plane, attached to the Cine Camera, that the chroma-key material is applied to. Toggle this Asset's visibility to turn the green screen on or off.

  • DisplayClusterRootActor: The location of this Actor defines the perspective from which the outer frustum is projected.

nDisplay Configuration File

The nDisplay configuration file describes the relationship and details of the computers and displays. Sample nDisplay configuration files are included in the example; you can find them in your project's folder under /Content/ExampleConfigs using your machine's file manager. For more information on the nDisplay configuration file, see the nDisplay Configuration File Reference.

Step 2 - Create LED Panel Geometry

This section provides an example of how to create a representation of a curved LED wall. Each LED Volume can be different, so modify these steps to match the dimensions and layout of your display.

These steps show how to create the geometry to represent the real-world LED panels. In this example, a curved wall is created with two meshes. There are two meshes because they will be rendered on separate machines. Each square on a mesh represents a cabinet that is 500mm x 500mm with 2.6mm pixel pitch.

Mesh representation of a curved LED wall.

The meshes should be modeled in a position and orientation to match the real-world LED panels. In this example, they are modeled upright. The geometry should be modeled to scale in cm.
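
Pixel pitch determines how many pixels each cabinet carries, and with it the viewport resolutions you will assign in Step 4. A minimal Python sketch of that arithmetic, using this example's values:

    # Pixels per cabinet = cabinet size / pixel pitch.
    cabinet_size_mm = 500      # each cabinet is 500mm x 500mm
    pixel_pitch_mm = 2.6       # distance between adjacent LED centers

    pixels_per_cabinet = round(cabinet_size_mm / pixel_pitch_mm)   # 500 / 2.6 ~= 192

    # Wall resolution = cabinets across x pixels per cabinet.
    left_wall = (4 * pixels_per_cabinet, 4 * pixels_per_cabinet)   # 4x4 grid -> (768, 768)
    right_wall = (4 * pixels_per_cabinet, 3 * pixels_per_cabinet)  # 4 wide, 3 high -> (768, 576)
    print(pixels_per_cabinet, left_wall, right_wall)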

Each mesh should have two UV sets in a particular order. The first UV set is used for calculating the projection for the PICP_Mesh projection policy for nDisplay. The second UV set is used to ensure that the chroma key tracking markers move appropriately across seams between two viewports.

Create the UV sets with the following specifications:

  • The first UV set should be scaled to cover the entire UV space in the range 0-1. This UV set should be unfolded as evenly as possible to avoid stretching; the scaling can be non-uniform. Ensure there is no padding around the edges of the UVs, and that the UVs do not go beyond the range 0-1 (see the sketch after this list).

    First UV set for the mesh

  • The second UV set should have the UVs aligned so that they match at the same seams as the actual hardware configuration. They should also have the same aspect ratio as the meshes.

    Second UV set for the mesh
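
Before exporting, it can be worth programmatically checking that the first UV set really fills the 0-1 range with no padding. This is purely illustrative Python, assuming you can read the UVs out of your modeling tool as (u, v) pairs:

    # Check that the first UV set stays within 0-1 and reaches the edges (no padding).
    def first_uv_set_is_valid(uvs, tolerance=1e-4):
        us = [u for u, v in uvs]
        vs = [v for u, v in uvs]
        in_range = all(0.0 <= c <= 1.0 for c in us + vs)
        no_padding = (min(us) < tolerance and min(vs) < tolerance and
                      max(us) > 1.0 - tolerance and max(vs) > 1.0 - tolerance)
        return in_range and no_padding

    # A quad that exactly covers the UV space passes the check.
    print(first_uv_set_is_valid([(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]))  # True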

Once the meshes are created, export them from the 3D modeling software and import them into the Unreal project.

Step 3 - Define the LED Screens in Your Project

You will need to customize the layout and geometry of the screens in the project to reflect what you have on set. These meshes should match the physical position and dimensions of your LED wall in the real world in relation to your tracking system. The tracking system used on set will have a zero point; place the meshes at the same world coordinates relative to that zero point. Work with your tracking provider to find out where the zero point is, and measure relative to it to find the offsets.

Follow these steps to modify and customize the layout and geometry of the screens in the engine:

  1. In the project's World Outliner, click Edit BP_WarpMonitor.

    Edit BP_WarpMonitor highlighted in World Outliner

  2. In the Blueprint Editor for BP_WarpMonitor, switch to the Viewport tab.

  3. By default, there are four static meshes in the Example's BP_WarpMonitor. Select and delete these four meshes.

    Default mesh quad in BP_WarpMonitor

  4. Find the two imported meshes and drop them under the root object in the Components panel.

    Imported meshes into BP_WarpMonitor

  5. Rotate and translate the meshes to match the position and orientation of the panels relative to the tracking origin in the real world. In this example, the panels are 135 cm in front, 50 cm to the right, and 13 cm above the tracking origin (see the coordinate note after these steps).

    Positioned and rotated the walls in BP_WarpMonitor to match physical layout of LED wall relative to tracking origin

  6. In the Blueprint Editor, switch to the Event Graph tab.

  7. The Assign Warp Mesh to Viewport functions assign the meshes used to warp the viewports at runtime. Delete the last two of the four Assign Warp Mesh to Viewport nodes, since there are only two meshes now.


  8. Drag the mesh for the left wall from the Components tab to the Event Graph. Connect it to the Mesh Component input for one of the Assign Warp Mesh to Viewport functions. Rename the Viewport_Id to vp_1. Repeat this for the right wall mesh and the other Assign Warp Mesh to Viewport function. Rename this second Viewport_Id to vp_2.


  9. In the Blueprint Editor, select the meshes in the Components tab. In the Rendering section of the Details panel for the meshes, set the Visible parameter to false so the geometry won't be rendered.

  10. Compile and Save the Blueprint.
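
A note on the offsets in step 5: Unreal's world units are centimeters, with +X forward, +Y to the right, and +Z up, so measurements taken from the tracking origin map directly onto the meshes' Location values. A small illustrative Python sketch of the conversion, assuming your stage survey reports meters:

    # Survey measurements from the tracking origin, in meters.
    measured_m = {"forward": 1.35, "right": 0.50, "up": 0.13}

    # Unreal uses centimeters: +X forward, +Y right, +Z up.
    location_cm = tuple(100.0 * measured_m[axis] for axis in ("forward", "right", "up"))
    print(location_cm)  # (135.0, 50.0, 13.0) -> enter as Location X, Y, Z in the Blueprint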

Step 4 - Set up the nDisplay Configuration File

While you can create these config files from scratch, it is highly recommended to start with one of the provided sample configs and modify it to describe your particular setup. These sample config files include a simple setup and further documentation on each section.

Follow the steps below to modify one of the sample configuration files to match the screen setup in the previous section:

  1. Navigate to your project's folder using your machine's file manager.

  2. Open Content/ExampleConfigs/Simple_InCameraFrustum.cfg in a text editor.

  3. In the Cluster nodes section of the file:

    1. Set one of the cluster nodes to be the primary node by adding master="true" sound="true".

    2. Change the IP addresses of the nodes to match your machines.

    3. Assign two windows to the cluster nodes.

      [cluster_node] id="node_1" addr="192.168.1.100" window="wnd_1" master="true" sound="true"
      [cluster_node] id="node_2" addr="192.1.68.1.38" window="wnd_2"
  4. In the Application windows section of the file:

    1. Add the resolution of your machine's monitor to the window node. In this example, the resolution of the monitors is 2560x1440.

    2. Any window used for in-camera VFX needs to include the rtt_inner viewport, which is responsible for the inner frustum.

    3. Assign the viewports that match the names in the BP_WarpMonitor Blueprint, vp_1 and vp_2, to the two windows.

      [window] id="wnd_1" viewports="rtt_inner, vp_1" fullscreen="true" WinX="0" WinY="0" ResX="2560" ResY="1440"
      [window] id="wnd_2" viewports="rtt_inner, vp_2" fullscreen="true" WinX="0" WinY="0" ResX="2560" ResY="1440"
  5. In the Viewports section of the file:

    1. Add the resolution of the LED panels to the corresponding viewports. In this example, since each cabinet is 500mm x 500mm with 2.6mm pixel pitch, each cabinet has 192x192 pixels (see the sketch in Step 2). Therefore, the left wall, a 4x4 grid of cabinets, has the resolution 768x768, and the right wall, a 3x4 grid of cabinets, has 768x576.

    2. Keep the rtt_inner viewport the same as in the sample config.

      [viewport] id="vp_1"  x="0" y="0" width="768" height="768" projection="proj_warp" buffer_ratio="1"
      [viewport] id="vp_2" x="0" y="0" width="768" height="576" projection="proj_warp" buffer_ratio="1"
      [viewport] id="rtt_inner"   x="0" y="1440" width="1920" height="1080" projection="proj_incamera" rtt=true
  6. In the Projection policies section, add the PICP_Mesh and camera projection policies so nDisplay knows to base the warp projection on the provided meshes and to drive the inner-frustum viewport from a camera.

    [projection] id="proj_warp"      type="picp_mesh"
    [projection] id="proj_incamera"     type="camera"
  7. Leave the rest of the file the same. Select File > Save As and rename the config file. In this example, the final config file should be similar to:

    [info] version="23"
    
    [cluster_node] id="node_1" addr="192.168.1.100" window="wnd_1" master="true" sound="true"
    [cluster_node] id="node_2" addr="192.1.68.1.38" window="wnd_2"
    
    [window] id="wnd_1" viewports="rtt_inner, vp_1" fullscreen="true" WinX="0" WinY="0" ResX="2560" ResY="1440"
    [window] id="wnd_2" viewports="rtt_inner, vp_2" fullscreen="true" WinX="0" WinY="0" ResX="2560" ResY="1440"
    
    [viewport] id="vp_1"  x="0" y="0" width="768" height="768" projection="proj_warp" buffer_ratio="1"
    [viewport] id="vp_2" x="0" y="0" width="768" height="576" projection="proj_warp" buffer_ratio="1"
    [viewport] id="rtt_inner"   x="0" y="1440" width="1920" height="1080" projection="proj_incamera" rtt=true
    
    [projection] id="proj_warp"      type="picp_mesh"
    [projection] id="proj_incamera"     type="camera"
    
    [camera] id="camera_static" loc="X=0,Y=0,Z=0"
    
    [general] swap_sync_policy="1"
    
    [network] cln_conn_tries_amount="10" cln_conn_retry_delay="1000" game_start_timeout="30000" barrier_wait_timeout="5000"

For more details on the nDisplay configuration file, see the nDisplay Configuration File Reference. You can read about the Mesh Based and PICP_Mesh projection policies in nDisplay Projection Policies.

If your file is misconfigured in any way, the render nodes won't be able to launch. Each node is looking for the others to ensure that they are all running. If one is misconfigured, they will all close. If you experience this issue, each node will add a log file to Project/Saved/Logs with more information on the issue.
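
Because one typo can keep the whole cluster from launching, it can help to sanity-check the file before deploying it. The following is a purely illustrative Python sketch that parses the simple [section] key="value" layout shown above and flags windows that reference viewports that were never defined; the file name is this example's:

    import re

    def parse_ndisplay_cfg(path):
        # Returns (section, attributes) pairs for each config entry.
        attr = re.compile(r'(\w+)="([^"]*)"')
        entries = []
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line.startswith("["):
                    section = line[1:line.index("]")]
                    entries.append((section, dict(attr.findall(line))))
        return entries

    entries = parse_ndisplay_cfg("Simple_InCameraFrustum.cfg")
    defined = {attrs["id"] for section, attrs in entries if section == "viewport"}
    for section, attrs in entries:
        if section == "window":
            for vp in (v.strip() for v in attrs["viewports"].split(",")):
                if vp not in defined:
                    print(f'window {attrs["id"]} references undefined viewport {vp}')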

Step 5 - Launching Your Project with nDisplay

Diagram of how nDisplay works with a network and display devices for in-camera VFX.

In an nDisplay setup, there is a primary computer and a cluster of additional computers. The primary computer is the centralized location for managing and dispatching input information. The primary computer also ensures all PCs in the cluster are synchronized and receive input and data at the same time. For more information on an nDisplay setup, see nDisplay Overview .

nDisplayListener

The nDisplayListener is a minimalist application that resides on the primary computer and on each PC in the cluster. The Listener can receive various remote commands; for example, to launch a project using a path and argument list, or to terminate a running project. The Listener must be running before the Launcher.

Follow the steps below to start the nDisplayListener on all of the computers:

  1. Open the folder where Unreal Engine is installed on your machine and navigate to the folder UE_4.25/Engine/Binaries/DotNET/.

    Location of nDisplayListener and nDisplayLauncher on a user's computer

  2. In that folder, run nDisplayListener.exe.

    A screenshot of nDisplayListener running

Launcher

The Launcher can simultaneously launch multiple projects on a list of available computers that are running the Listener in the background. The Launcher can be run from any computer on the local network.

Follow the steps below to start the nDisplayLauncher on one of the computers:

  1. Open the folder where Unreal Engine is installed on your machine and navigate to the folder UE_4.25/Engine/Binaries/DotNET/.

    Location of nDisplayListener and nDisplayLauncher on a user's computer

  2. Run nDisplayLauncher.exe to open the nDisplay Launcher Window.

  3. In the nDisplay Launcher Window in the Launcher Tab, select Add Project in Editor -game.

    nDisplay Launcher Window with Add Project in Editor -game highlighted

  4. Navigate to the UE4 executable located in Engine\Binaries\Win64. Select UE4Editor.exe and click Open.

  5. In the new Select UE4 Project window, navigate to the UE project file that you want to launch. Select the project and click Open. This adds the engine and project paths to the Applications list.

  6. To the right of the Config Files field, select Add. Navigate to the nDisplay config file for the project and select it.

  7. Select the project in the Applications list and click Run.

  8. nDisplay opens and runs the project.

Note: You can also deploy a packaged executable of the project on all the computers. However, with this method, you won't be able to send updates to all the computers without packaging another executable. By running a project in the Editor with -game , you can send changes in the project to other computers in a Multi-User session.

Step 6 - Camera Tracking with Live Link

Live Link is a framework in Unreal Engine for ingesting live data, including cameras, lights, transforms, and basic properties. For in-camera VFX, Live Link plays a critical role in distributing the tracked camera information, and it can be enabled to work with nDisplay to carry the tracking information to each cluster node. Unreal Engine supports many camera-tracking partners through Live Link, such as Vicon, Stype, Mo-Sys, and Ncam, as well as several other professional tracking solutions.

Note: It's important for this step that you have a Live Link source available.

To track the inner camera frustum via Live Link with nDisplay:

  1. In the Unreal Editor, go to the main menu and select Window. In the dropdown, select Live Link to open the Live Link panel.

    The editor with Live Link highlighted in the Window dropdown

  2. In the Live Link panel, select the Add Source button. In the dropdown, select the Live Link source you want to use.

    Screenshot of Mobu Live Link available when adding Live Link source

    In this example, Motion Builder is the Live Link source.

  3. Click Presets and select Save As Preset.

  4. In the main menu, select Edit > Project Settings.

  5. In Project Settings, under Plugins, select Live Link.

  6. Set Default Live Link Preset to the preset you saved so that it is automatically applied when the project runs.

    Live Link Plugins preset changed to the Mobu Live Link Preset

  7. In World Outliner, select InnerFrustumCamera.

  8. In the Components section of the Details panel, select LiveLinkComponentController.

  9. In the Live Link section, set the Subject Representation field to your Live Link source's subject.

    Screenshot of the Subject Representation field changed in the InnerFrustumCamera's LiveLinkComponentController

  10. Save the project.

  11. In the nDisplay launcher, add -messaging to the custom command line arguments.

  12. Run nDisplay.

  13. The inner camera frustum is now tracked with the Live Link source.

Step 7 - Green Screen and Chroma Key

You can change what displays in the inner frustum on the LED panels from the virtual world to a green screen with chroma key markers.

Screenshot of green screen toggled on in the engine

Follow these steps to make the green screen visible and to modify the chroma key markers:

  1. In World Outliner, select GreenScreenPlane.

  2. Under the Rendering section in the Details panel, enable the Visible field.

  3. In World Outliner, select BP_IncameraStageSettings.

  4. Under the Chromakey section in the Details panel are options to modify the edge blur and chroma key markers.

Step 8 - Color Correction Volumes

With Color Correction Volumes, you can adjust and correct the colors of the environment and objects in the scene. By adjusting the Color Correction Volumes, you can match the lighting and shadows between the real-world set and the environment displayed on the LED walls.

Screenshot of Color Correction Volumes adding pink to the environment

In the example project, Color Correction Volumes were used to add pink to adjust the environment's colors.

To add a Color Correction Volume to your scene, follow these steps:

  1. In the Content Browser in the editor, open the View Options panel and enable Show Engine Content and Show Plugin Content.

  2. Select the folder icon next to Content. In the list, expand VirtualProductionUtilities Content and select CCV.

  3. Add the Blueprint BP_ColorCorrectVolume to the scene.

  4. Select the BP_ColorCorrectVolume in the World Outliner.

  5. In the Default section of the Details panel, expand the Color Correct parameter.

  6. Expand the Global parameter and modify the Gamma color to change the color of the volume.

Step 9 - On Your Own

This guide covered setting up displays on LED screens, launching your project on multiple computers, and incorporating camera tracking into the project.

Multi-display setups require syncing capabilities at both software and hardware levels. Not only should the generated content be ready at the same time on all PCs, using the same timing information for simulation, but the display swap (the changing out of the current image for the next image in the video card buffer) needs to also happen at the correct time to prevent tearing artifacts in the display. See synchronization in nDisplay for information on setting up display sync and genlock on the machines to create a seamless view across the multiple displays.

In addition to synchronizing the displays, the engine's timecode and frame generation need to match the input from the camera. See Timecode and Genlock for steps on how to sync the timecode and genlock the engine between all the devices.

To be able to control the scene and displays during a film shoot, try the How-To guides for in-camera VFX.
