Live Link Plugin

Describes how to enable and use the Live Link Plugin and features within the animation system in Unreal Engine 4.


The purpose of Live Link is to provide a common interface for streaming and consuming animation data from external sources (for example, DDC tools or Mocap Servers) into Unreal Engine 4 (UE4). It is designed to be extensible through Unreal Plugins, allowing third parties to develop new features without needing to make and maintain Engine changes.

With DDC tools like Maya or Motionbuilder, Live Link provides a way for you to edit animations externally while seeing a preview of your work inside UE4 in real-time. Motion Capture Systems can also use Live Link to stream data into the Engine that can be previewed in real-time.

Live Link Client

The hub of Live Link is the Live Link Client that is created when the Live Link plugin is enabled inside a project. It has two main responsibilities:

  1. Manage Sources: These are the sources of data, mostly representing connections to other applications (either on the same machine or across the network) that provide data to Live Link.

  2. Manage Subjects: Subjects are individual streams of data within the client. One animating character, for instance, would be a subject.

    A subject is made up of the following data:

    1. A name.

    2. Static data that doesn't change (like the Skeleton for an animation role).

    3. One or more "frames" of data (like the bone transforms for the animation role).

The client is also responsible for building up the next frame of data to be used by the Engine. This can either be a straight copy of the latest received data or an interpolated frame created by buffering the incoming data and playing it back with a user-definable delay.

Roles

Live Link also utilizes a concept of Roles that define how incoming data should be used. This allows for the data to be more easily mapped to a target Actor within the Engine. Supported Roles include Cameras, Lights, Characters, Transforms, and a Basic role (for generic data).

Sources

Sources are how data gets into the Live Link client. Sources can be defined within plugins so that third parties can build their own without having to change Engine code. Sources are responsible for managing how they receive the animation data (over a network protocol, for example, or by reading from an API for a device connected to the machine itself). Each source is handed a reference to the client so that it can pass data back to it. Within the Live Link plugin, we have defined our own source (called the Message Bus Source) that reads data from an Unreal Message Bus connection. We have used this to build Live Link plugins for Maya and Motionbuilder.
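In code, a custom source is a class implementing the ILiveLinkSource interface. The following is a minimal sketch based on the 4.23-era interface; the exact method set and signatures vary between Engine versions, so treat it as an outline rather than a drop-in implementation.

```cpp
// Minimal sketch of a custom Live Link source (assumes the 4.23-era
// ILiveLinkSource interface; verify signatures against your Engine version).
#include "ILiveLinkSource.h"
#include "ILiveLinkClient.h"

class FMyLiveLinkSource : public ILiveLinkSource
{
public:
	// Live Link hands the source a reference to the client so the
	// source can push data back to it.
	virtual void ReceiveClient(ILiveLinkClient* InClient, FGuid InSourceGuid) override
	{
		Client = InClient;
		SourceGuid = InSourceGuid;
		// Typically you would start your network listener or device poll here,
		// then push static data once per subject and frame data for each new
		// frame through the client (for example, via the client's
		// PushSubjectStaticData_AnyThread / PushSubjectFrameData_AnyThread
		// methods in 4.23-era Engine versions).
	}

	virtual bool IsSourceStillValid() const override { return Client != nullptr; }
	virtual bool RequestSourceShutdown() override { Client = nullptr; return true; }

	virtual FText GetSourceType() const override { return NSLOCTEXT("MySource", "Type", "My Device"); }
	virtual FText GetSourceMachineName() const override { return NSLOCTEXT("MySource", "Machine", "localhost"); }
	virtual FText GetSourceStatus() const override { return NSLOCTEXT("MySource", "Status", "Active"); }

private:
	ILiveLinkClient* Client = nullptr;
	FGuid SourceGuid;
};
```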

UDP Messaging

The Live Link Message Bus Source uses UDP Messaging under the hood and (by default) will use the first Network Adapter it finds. If you have more than one Network Adapter in your machine, this may cause issues if you expect to receive data on a particular adapter. If you want to receive UDP data on a specific Network Adapter, you will need to modify your Project Settings by going to Edit > Project Settings > UDP Messaging and changing the Unicast Endpoint.

UDPMessaging.png

For example, if you have two adapters:

  • Adapter A: XX.X.XXX.123

  • Adapter B: XXX.XXX.X.53

If you want to receive data from the Live Link Message Bus Source on Network Adapter B, you would need to set your Unicast Endpoint to: XXX.XXX.X.53:0

The ":0" specifies that all ports should be listed to.

UDP Messaging is not enabled by default when running with -game. You can enable it by adding the -messaging command-line argument; the same applies to a packaged game (the Shipping target is not supported).
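For example, launching with messaging enabled might look like the following (the project and executable names are placeholders):

```
UE4Editor.exe MyProject.uproject -game -messaging
MyPackagedGame.exe -messaging
```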

The Live Link plugin can be enabled by opening the Plugins window (Edit > Plugins), selecting the Animation category, and selecting Enabled on the Live Link Plugin.

Step_02-1.png

Live Link Connection Window

Once enabled, the Live Link client can be accessed from the Window menu.

EnableLink_1-1.png

Upon selecting the Live Link option, the Live Link Connection window opens, where you can add Source Types and Subjects.

SourceSettings.png

Above we can see the Live Link client with an open connection to an instance of Maya running our plugin (top left section). That instance is sending three subjects to the Editor: two Camera subjects (one named "EditorActiveCamera" and another named "camera1"), as well as a subject containing Transform data called "TestCube" (bottom left section).

Adding a Source

You can add sources by clicking the + Source button and selecting the type of source you want to connect with Live Link.

AddSourceType.png

Your external application will need to be set up to push data to Unreal Engine 4 through Live Link for it to be displayed as a source to connect to. See the Live Link Plugin Development page for more information.

In addition to receiving data through a Message Bus Source, Live Link supports Hand Tracking sources for devices like Magic Leap, as well as the ability to create Virtual Subjects, which combine multiple Subjects into one "Virtual Subject". For example, you could take the lower body from Character A and the upper body from Character B, then combine them into a new Subject. Or, you could take the camera tracking data from one source, combine it with just the translation of another tracked object, and drive the rest manually.

In the Sources panel, you can manage all your connected Sources. You can also delete a Source by clicking the Trashcan icon next to a Source.

You can also click the Presets button to save the current configuration as a preset, or to load any previously saved presets.

SaveAsPreset.png

Presets are saved as assets inside the Content Browser and enable you to quickly load up any previous configurations.

Subject Panel

The Subjects panel of the Live Link Connection window indicates a connected source and the subjects that are being streamed in.

SourceSettings.png

This panel indicates the name of each Subject, its associated Role, and a status indicator: a green light is displayed if you are receiving data, while a yellow light is displayed if you are connected to the Source but have not received data within the specified amount of time (configurable in the Project Settings; the default is 0.5ms).

Connection Settings

Once you establish an active connection, the following settings can be used to define the parameters of the connection:

| Property | Description |
| --- | --- |
| Evaluation Mode | Determines how to create the frame snapshot. See the Evaluation Mode table below. |
| Valid Engine Time | If the frame is older than ValidTime, remove it from the buffer list (in seconds). |
| Engine Time Offset | When evaluating with time, how far back from the current time the buffer should be read (in seconds). |
| Timecode Frame Rate | When evaluating with timecode, the expected frame rate of the timecode. |
| Valid Timecode Frame | If the frame timecode is older than ValidTimecodeFrame, remove it from the buffer list (in TimecodeFrameRate). |
| Timecode Frame Offset | When evaluating with timecode, how far back from the current timecode the buffer should be read (in TimecodeFrameRate). |
| Max Number of Frames to Buffer | Defines the maximum number of frames to keep in memory. |
| Source Debug Infos | A collection of debugging information passed through from the source. |

The Evaluation Mode property supports the following modes:

| Evaluation Mode | Description |
| --- | --- |
| Engine Time | The source uses the Engine's time to evaluate its subjects. This mode is most useful when smooth animation is desired. |
| Latest | The source uses the latest frame available to evaluate its subjects. This mode does not attempt any type of interpolation or time synchronization. |
| Timecode | The source uses the Engine's timecode to evaluate its subjects. This mode is most useful when sources need to be synchronized with multiple other external inputs (such as video or other time-synchronized sources). It should not be used when the Engine is not set up with a Timecode Provider. |

Interpolation Settings

After a Subject has been added, you can assign any Pre Processors, Interpolation methods, or Translators to the selected Subject.

LiveLinkConnected-1.png

| Property | Description |
| --- | --- |
| Pre Processors | Process the incoming data before it is pushed to the Subjects for a given frame. One possible use case is the Axis Switch, which swaps any axis of an incoming transform with another axis. |
| Interpolation | Defines the type of interpolation method used for blending frames. |
| Translators | Enable you to transform data from one Role to another, for example, from a Skeleton to a Transform. This can be useful if you just want the location of a character's hips; the Translator transforms the data into the correct Role. |

Editor Integration

Currently, the Animation Editors have a built-in integration with Live Link. This is accessed in the Preview Scene Settings tab under the Preview Controller property.

PreviewSceneSettings-3.png

When the Live Link plugin is enabled, the Preview Controller option can be changed to Live Link Preview Controller. Once selected, you can set the following options:

  • Subject Name: The name of the subject in Live Link that you would like to apply to the Preview Mesh.

  • Enable Camera Sync: Enables syncing of the Unreal Editor camera with an external editor. Internally, this looks at Live Link for a subject called EditorActiveCamera. Both our internally developed Maya and Motionbuilder plugins support this.

  • Retarget Asset: Specifies the Retarget Asset to apply to the Live Link data that will be applied to the Preview Mesh.

Below we have an established connection between UE4 and Maya, using the Maya Live Link Plugin.

Live Link Components

The Live Link Controller and Live Link Skeletal Animation components can be added to an Actor in order to drive its parameters with Live Link from a connected external source.

To use these components, click the Add Component button and use the Live Link Controller (or Live Link Skeletal Animation) component.

AddLiveLinkComponent.png

For the Live Link Controller, in the Details panel, you can use the Subject Representation property to select from your connected Subjects. Based on the Subject, a Role will automatically be assigned (which you can change if needed). The Component to Control is what will actually be driven through Live Link. In the example below, we have a Cine Camera Actor with a Live Link Controller component that allows us to move the camera and change the Focal Length from Maya. We also use the Live Link Skeletal Animation component on a Skeletal Mesh and stream in animation data. To achieve this, the Live Link Pose node has been added to our Animation Blueprint and our Subject has been selected.

The Live Link Controller can control animation; however, for best results, it is recommended that you use the Live Link Skeletal Animation component for animation.

Live Link in Blueprint

You can also use Blueprint function calls to access Live Link data. Below, the Evaluate Live Link Frame function attempts to get a Live Link Frame from a specified Subject using a given Role (in the case below, the Subject "camera1" is evaluated with the Camera role).

LiveLinkInBlueprints.png

We can then get frame data from the data result, and in this case, the transform information from our Subject. That information is then used to update the relative transform of a Cine Camera within our Blueprint.
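The equivalent evaluation can also be performed in C++ through the Live Link client's modular feature interface. The sketch below assumes the 4.23-era ILiveLinkClient API (EvaluateFrame_AnyThread and the role data structs); verify the exact names against your Engine version.

```cpp
// Sketch: evaluate the "camera1" subject with the Camera role in C++
// (assumes the 4.23-era ILiveLinkClient API).
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "Roles/LiveLinkCameraRole.h"
#include "Roles/LiveLinkCameraTypes.h"

void EvaluateCameraSubject()
{
	IModularFeatures& ModularFeatures = IModularFeatures::Get();
	if (ModularFeatures.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
	{
		ILiveLinkClient& Client = ModularFeatures.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

		FLiveLinkSubjectFrameData FrameData;
		if (Client.EvaluateFrame_AnyThread(TEXT("camera1"), ULiveLinkCameraRole::StaticClass(), FrameData))
		{
			// Camera frame data derives from transform frame data, so the
			// subject's transform can be read here and applied to an actor.
			if (const FLiveLinkCameraFrameData* Camera = FrameData.FrameData.Cast<FLiveLinkCameraFrameData>())
			{
				const FTransform SubjectTransform = Camera->Transform;
				// Apply SubjectTransform to your Cine Camera here.
			}
		}
	}
}
```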

If you are using a version of the Maya Live Link Plugin prior to 4.23, a camera driven through Blueprint may not have the correct transform. You can fix this by adding an Add Relative Rotation node set to 0, 180, 90, as indicated below.

422CameraFix.png

Live Link Retargeting

Live Link retargeting is currently performed by retargeting assets (base class ULiveLinkRetargetAsset). The Live Link Pose node has a pin for specifying the retarget asset to use. We provide a very simple remap asset (ULiveLinkRemapAsset) to allow mapping of transforms from Live Link onto a USkeleton. Below is an example of how to implement a retargeting asset.

A Live Link Retarget Asset has one function it needs to override:

BuildPoseForSubject(const FLiveLinkSubjectFrame& InFrame, TSharedPtr<FLiveLinkRetargetContext> InOutContext, FCompactPose& OutPose, FBlendedCurve& OutCurve)

This function takes:

  • A LiveLinkSubjectFrame: The transform/skeleton as Live Link sees it, optionally buffered and interpolated.

  • A FLiveLinkRetargetContext: Allows for retargeting instance data, as the ULiveLinkRetargetAsset will not be instanced for each use (this can be customized by overriding CreateRetargetContext).

  • An FCompactPose: This is the output pose from the retargeting. FCompactPose is the format used during animation evaluation.

  • An FBlendedCurve: This is the output curve data. ULiveLinkRetargetAsset has a helper function, BuildCurveData, that will populate this from the supplied Live Link data.
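Putting the pieces together, a custom retarget asset subclasses ULiveLinkRetargetAsset and overrides the function above. Below is a minimal declaration sketch that mirrors the signature shown in this section; the class name and file are placeholders, and the exact types should be verified against your Engine version.

```cpp
// Sketch of a custom retarget asset, mirroring the BuildPoseForSubject
// signature shown above (class and file names are placeholders).
#include "LiveLinkRetargetAsset.h"
#include "MyLiveLinkRetargetAsset.generated.h"

UCLASS(Blueprintable)
class UMyLiveLinkRetargetAsset : public ULiveLinkRetargetAsset
{
	GENERATED_BODY()

public:
	virtual void BuildPoseForSubject(const FLiveLinkSubjectFrame& InFrame,
		TSharedPtr<FLiveLinkRetargetContext> InOutContext,
		FCompactPose& OutPose,
		FBlendedCurve& OutCurve) override
	{
		// Map InFrame's transforms onto OutPose's bones here, then use the
		// BuildCurveData helper to populate OutCurve from the Live Link data.
	}
};
```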

Retarget Assets can be Blueprintable as well, allowing users to specify logic in Editor (where appropriate). This functionality is used by the Remap Asset to allow bone names to be transformed in the Editor.

Live Link Remap Asset

An example of what is required to create a new retarget asset can be found in the editor by looking at the following classes in /Engine/Plugins/Animation/LiveLink/Source/LiveLink:

  • ULiveLinkRetargetAsset

  • ULiveLinkRemapAsset

In Editor, Remap Assets can be created by using the LiveLinkRemapAsset class type.

RemapBlueprint-3.png

Inside the remap asset, you can override the function Get Remapped Bone Name.

RemapBlueprint2.png

This will enable you to transform the Bone Name in Editor.

GetRemappedBone.png

In the example above, we are taking in Bone Name information from a Skeletal Mesh in Maya. That data is then being modified and mapped to the Bone Name naming convention in UE4.

For example, in Maya, the naming convention refers to our Bone Names as CharacterName:BoneName, while in UE4 the Skeleton Asset uses just the BoneName portion. In order for Maya to stream data to the UE4 Skeleton, we use the Remap Asset and override the function Get Remapped Bone Name, splitting the string at the colon and taking only the portion after it: just BoneName instead of CharacterName:BoneName.
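The same colon-splitting logic in C++ form might look like the following sketch (assuming Get Remapped Bone Name is exposed as a Blueprint-overridable native event on the remap asset, as described above; the class name is a placeholder):

```cpp
// Sketch: strip the "CharacterName:" prefix from incoming Maya bone names.
// Assumes GetRemappedBoneName is overridable on a ULiveLinkRemapAsset
// subclass, matching the Blueprint override described above.
FName UMyRemapAsset::GetRemappedBoneName_Implementation(FName BoneName) const
{
	FString Left, Right;
	if (BoneName.ToString().Split(TEXT(":"), &Left, &Right))
	{
		// "CharacterName:BoneName" -> "BoneName"
		return FName(*Right);
	}
	return BoneName; // No namespace prefix; pass through unchanged.
}
```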

In the Preview Scene Settings window, we then specify to use the Retarget Asset.

RemapBlueprint3.png

Animation Blueprints

Data from Live Link can be applied directly inside Animation Blueprints:

AnimBPLiveLink.png

This is achieved by creating a Live Link Pose node inside the AnimGraph of an Animation Blueprint. The node has two properties:

  • Subject Name: The name of the subject in Live Link to stream data from.

  • Retarget Asset: The Retarget Asset used to apply the data from Live Link to the Skeleton that the Animation Blueprint uses.

The output of the Live Link Pose node is a normal pose like any other pose node in an Anim Blueprint, and therefore can be manipulated like other poses (fed into Modifier or Blend nodes for instance).

While you can drive animation without the Live Link Pose node in the Editor, if you want to drive animation at runtime, you will need the Live Link Pose node set up in your Animation Blueprint.

Motion Controllers

Live Link can be used with Motion Controllers. The motion source property of a Motion Controller can be set to a subject within Live Link.

MotionController.png

When set in this way, the position of the Motion Controller component is governed by the first transform of the subject.

The Motion Controller integration can also access custom parameters on the Live Link subject. These are passed by the curve support built into Live Link subjects. To access the values, it is necessary to derive a new Blueprint from MotionControllerComponent and override the OnMotionControllerUpdated function. During OnMotionControllerUpdated it is valid to call GetParameterValue on the Motion Controller.
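As a rough sketch, such a subclass might look like the following (assuming the OnMotionControllerUpdated override point and GetParameterValue accessor described above; the component class and curve name are placeholders):

```cpp
// Sketch: a MotionControllerComponent subclass that reads a custom
// Live Link curve during the controller update (names are placeholders).
#include "MotionControllerComponent.h"
#include "MyMotionControllerComponent.generated.h"

UCLASS()
class UMyMotionControllerComponent : public UMotionControllerComponent
{
	GENERATED_BODY()

protected:
	virtual void OnMotionControllerUpdated() override
	{
		Super::OnMotionControllerUpdated();

		bool bValueFound = false;
		// "trigger" is a hypothetical curve streamed on the Live Link subject.
		const float TriggerValue = GetParameterValue(TEXT("trigger"), bValueFound);
		if (bValueFound)
		{
			// React to the custom parameter here.
		}
	}
};
```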

Plugin Development

There are two paths for integrating with Live Link:

  • Building an Unreal Engine plugin that exposes a new Source to Live Link.

This is the recommended approach for anyone that already has their own streaming protocol.

  • Integrating a Message Bus endpoint in third-party software to allow it to act as a data transmitter for the built-in Message Bus Source.

This is the approach we have taken for our Maya and Motionbuilder plugins.

For more information, please see the Live Link Plugin Development page.

Maya Live Link Plugin

Prebuilt binaries of the Maya Live Link plugin are available from the Maya Live Link Plugin UE4 GitHub repository. Once you download and extract the zip files, you will see binaries for Maya 2016-2019 for Windows.

The Maya plugin is supplied in the Maya Module format. To install the plugin, copy the contents of the correct versioned folder within the zip file to your Maya Module folder. Once the Live Link plugin is installed in Maya and enabled, it will automatically show as a connection in the Unreal Editor. A window can be opened within Maya to access the streaming settings. This is spawned by the MEL command MayaLiveLinkUI and can be bound to a toolbar button.

MayaLiveLinkUI.png

At the top right is a display that shows whether this instance of Maya is connected to an Unreal client. Below Unreal Engine Live Link is a list of all the Subjects currently being streamed. You can click the minus (-) button to remove a Subject, or add any currently selected item (or items) from your scene with the Add Selection button. You can also assign the type of data being streamed with the Stream Type option.

For a step-by-step guide on how to set up the Maya Live Link plugin, please see the Connecting Unreal Engine 4 to Maya with Live Link How-to guide.

Motionbuilder Live Link Plugin

The Motionbuilder Plugin offers the same functionality as the Maya plugin and shows up in Editor as a connection in a similar way. It also has a custom UI for managing streaming:

MotionBuilderLiveLinkPlugin.png

Objects can be selected from the current scene and added to the streamed list (as shown above). From there, their names can be set in the Subject Name column and their Stream Type (Camera, Skeleton, for example) can be set. Streaming on the subject can also be enabled and disabled from here.

You can download the Motionbuilder Live Link Plugin binaries from the UE4 GitHub repository.

For a step-by-step guide on how to set up the Motionbuilder Live Link plugin, please see the Connecting UE4 to Motionbuilder with Live Link How-to guide.

ART Tracking Live Link Plugin

ART Tracking (Advanced Real-time Tracking) is supported through Live Link for the following use-cases:

  • Body trackers

  • Flystick trackers

  • Hand trackers

  • Human model

ART Tracking through Live Link provides the ability to leverage ART technology for various tracking purposes in applications such as VR, Augmented Reality and Motion Capture.

ART will be distributing a Live Link plugin on the Marketplace and also hosting the latest development on GitHub.
