Goals
In this Quick Start guide, you will go through the steps of connecting a live camera feed to Unreal Engine and calibrating the camera using the Camera Calibration plugin.
Objectives
Connect your camera to Unreal Engine so it provides a live feed.
Calibrate the camera by using the Camera Calibration plugin.
1 - Required Setup
For this guide we will use a Blackmagic Ultra HD camera, a Panasonic Lumix lens, and an HTC Vive Tracker 3 to control the CineCamera Actor in the scene.
Unreal Engine supports only a limited range of ultra-wide camera lenses at this time.
Create a new Unreal Engine project. Select the Film, Television, and Live Events category and click the Next button.
Select the Virtual Production template and click the Next button.
Enter the file location and project name and click the Create Project button.
Once the editor is loaded, click Settings > Plugins to open the Plugins Menu.
Select the Virtual Production category and Enable the Camera Calibration and LiveLinkXR plugins. Select Yes on the popup box and click the Restart Now button to restart the editor.
Section Results
You enabled the Camera Calibration and LiveLinkXR plugins and restarted the editor. You are now ready to calibrate your camera.
2 - Setting your Scene
Go to the Place Actors panel and search for CineCamera Actor. Drag the Actor into your Level.
In the Place Actors panel search for Camera Calibration Checkerboard. Drag the Actor into your Level.
In the Content Browser, click the View Options button and select the Show Engine Content and Show Plugin Content options.
In the Content Browser, navigate to CameraCalibration Content > Devices > Tracker. Drag the BP_UE_Tracker3 Blueprint into your Level.
Select the CineCamera Actor in your Level and navigate to the Details panel. Scroll down to the Filmback section and enter the matching Sensor Width and Sensor Height values for your physical camera.
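If you prefer to script this step, the Filmback can also be set through the editor's Python API. The sketch below is a minimal example, assuming the CineCamera Actor is currently selected in the Level; the sensor dimensions are placeholders, so substitute your physical camera's values.

    import unreal

    # Minimal sketch: match the selected CineCamera Actor's Filmback to the physical sensor.
    actor = unreal.EditorLevelLibrary.get_selected_level_actors()[0]
    cine_component = actor.get_component_by_class(unreal.CineCameraComponent)

    filmback = unreal.CameraFilmbackSettings()
    filmback.sensor_width = 23.76   # millimeters -- replace with your camera's sensor width
    filmback.sensor_height = 13.37  # millimeters -- replace with your camera's sensor height
    cine_component.set_editor_property("filmback", filmback)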
With the CineCamera Actor selected, go to the Details panel and click the Add Component button. Search for and select the Live Link Controller.
Go to Window > Live Link to open the Live Link window.
Go to Source > LiveLinkXR Source and click the Add button to add your connected Vive trackers. You should now see the trackers connected using Live Link.
You can learn about developing for HTC Vive and SteamVR devices by reading the Developing for SteamVR documentation.
Since we aren't using an actual FIZ (focus, iris, zoom) source, we need to create a virtual source using LiveLink Blueprint Virtual Subjects. Right-click in the Content Browser and select LiveLink > Blueprint Virtual Subject. Click the dropdown, select LiveLinkCameraRole, and click the OK button. Name the Blueprint VirtualPrestonFIZ.
Double-click the VirtualPrestonFIZ Blueprint to open it. Click the + Variable button to add a new variable. Name the variable Focus. Go to the Details panel and set the Variable Type to Float. Enable the Instance Editable checkbox.
Repeat the above step and create two additional Float variables named Zoom and Iris.
Right-click in the Event Graph then search for and select Update Virtual Subject Static Data. Connect the Update Virtual Subject Static Data node to the Event On Initialize node. Right-click the Static Data pin in the Update Virtual Subject Static Data node and select Split Struct Pin.
Enable the Focal Length, Aperture and Focus Distance checkboxes, as seen below.
Right-click in the Event Graph then search for and select Update Virtual Subject Frame Data. Connect the Update Virtual Subject Frame Data node to the Event On Update node. Right-click the Frame Data pin in the Update Virtual Subject Frame Data node and select Split Struct Pin. This event is triggered on every tick and will be used to update the FIZ data available for each frame.
Connect the Zoom variable to the Focal Length pin. Connect the Iris variable to the Aperture pin. Connect the Focus variable to the Focus Distance pin. Compile and Save the Blueprint.
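Conceptually, the Blueprint you just built declares which camera channels the subject supports once on initialize, then writes the three variables into the frame data on every tick. The plain-Python sketch below (conceptual only, not the Unreal API) illustrates that mapping.

    from dataclasses import dataclass

    # Conceptual illustration of the VirtualPrestonFIZ Blueprint, not engine code.
    @dataclass
    class CameraStaticData:        # what Update Virtual Subject Static Data declares once
        focal_length_supported: bool = True
        aperture_supported: bool = True
        focus_distance_supported: bool = True

    @dataclass
    class CameraFrameData:         # what Update Virtual Subject Frame Data pushes each tick
        focal_length: float = 0.0
        aperture: float = 0.0
        focus_distance: float = 0.0

    def on_update(zoom: float, iris: float, focus: float) -> CameraFrameData:
        # Zoom drives Focal Length, Iris drives Aperture, Focus drives Focus Distance.
        return CameraFrameData(focal_length=zoom, aperture=iris, focus_distance=focus)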
Go to Source > Add Virtual Subject to open the Virtual Subject window. Select the VirtualPrestonFIZ subject and click the Add button.
Right-click within the Content Browser and select Miscellaneous > Lens File to create a Lens File asset. Name the asset LumixLens.
Select your CineCamera Actor and go to the Details panel. Scroll down to the Live Link section and click on the dropdown menu next to Subject Representation. Select your tracker from the list.
With the CineCamera Actor selected, go to the Details panel and click the Add Component button. Search for and select the Live Link Controller to add another component. Scroll down to the Live Link section and click on the dropdown next to Subject Representation. Select your virtual subject from the list.
In the Details panel, scroll down to the Camera Role section and click the dropdown menu next to Lens File. Search for and select LumixLens.
Select the BP_UE_Tracker3 Blueprint and go to the Details panel. Select the LiveLink Component Controller and scroll down to the Live Link section. Click the dropdown menu next to Subject Representation and select your tracker.
Select the CameraCalibrationCheckerboard Actor and go to the Details panel. Scroll down to the Calibration section and enter the number of rows and columns, as well as the Square Side Length. For this example, we are using the image below for the measurements.
For better visibility, add materials to the Odd Cube Material and Even Cube Material slots.
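If you are scripting your scene setup, the same checkerboard settings can be applied with the Python API. The sketch below assumes the Checkerboard Actor is selected in the Level and uses example board measurements; the property names follow the plugin's source and may differ between engine versions.

    import unreal

    # Minimal sketch: configure the selected Camera Calibration Checkerboard Actor.
    # The row/column counts and square size are examples -- measure your printed board.
    checkerboard = unreal.EditorLevelLibrary.get_selected_level_actors()[0]
    checkerboard.set_editor_property("num_corner_rows", 7)       # inner corners per column
    checkerboard.set_editor_property("num_corner_cols", 10)      # inner corners per row
    checkerboard.set_editor_property("square_side_length", 3.0)  # square size in centimeters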
Section Results
In this section you added a CineCamera Actor, a Camera Calibration Checkerboard Actor, and a BP_UE_Tracker3 Blueprint to your Level. You connected your tracker using Live Link and configured your actors correctly. You are now ready to calibrate your lens.
3 - Calibrating your Lens
In the Content Browser, double-click the LumixLens asset to open it. Click the Calibration Steps panel and select the Lens Information tab. Go to the Lens Info section, enter your lens information and click the Save Lens Information button.
Click the Lens File Panel and select Focus.
Click the + button and enter a value of 0 for Input Focus. Enter a value of 100 for Encoder Mapping. Click the Add button to add this data point to the graph. Repeat this step and enter a value of 1000 for Encoder Mapping and a value of 1 for Input Focus.
Select Iris and enter a value of 1.8 for Encoder Mapping and a value of 0 for Input Iris. Click the Add button to add the data point to the graph. Repeat this step and enter a value of 4.5 for Encoder Mapping and a value of 1 for Input Iris.
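To make the mapping concrete: each curve interpolates between the data points you entered, so an intermediate input value resolves to an intermediate lens value. The standalone Python sketch below uses simple linear interpolation for illustration (the Lens File stores these points on an editable curve, so the actual evaluation may differ).

    def map_encoder(points, raw):
        # Interpolate a normalized input (0-1) to a mapped lens value.
        # points: list of (input, mapped_value) pairs.
        points = sorted(points)
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            if x0 <= raw <= x1:
                t = (raw - x0) / (x1 - x0)
                return y0 + t * (y1 - y0)
        return points[0][1] if raw < points[0][0] else points[-1][1]  # clamp outside the range

    focus_curve = [(0.0, 100.0), (1.0, 1000.0)]  # the two focus points entered above
    iris_curve = [(0.0, 1.8), (1.0, 4.5)]        # the two iris points entered above
    print(map_encoder(focus_curve, 0.5))   # 550.0
    print(map_encoder(iris_curve, 0.25))   # 2.475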
Go back to the Calibration Steps panel and select the Lens Distortion tab. Open the checkerboard image in full screen and point your camera towards it. With the checkerboard in full view, click the viewport to capture an image. Repeat this step multiple times to capture several images.
Repeat this process at multiple zoom levels and focus values for the best possible calibration.
Once you have taken several images from different angles, click the Add To Lens Distortion Calibration button. Click OK on the popup window to accept the calibration data.
Select the Nodal Offset tab and under the Nodal Offset section click the Nodal Offset Algo dropdown menu. Select Nodal Offset Checkerboard.
Point the camera toward the checkerboard on the screen again and click the viewport. This will detect the corners of the image. Click the Apply to Calibrator button to align the Camera Calibration Checkerboard Actor with the onscreen checkerboard.
Now you can calibrate the Nodal Offset of the virtual camera. Set the Nodal Offset Algo to Nodal Offset Points Method, and the Calibrator to your tracker. Set the Calibration Point to PointLed.
Hold the tracker's light towards the camera and click the light in the viewport. Repeat this step multiple times to create several data points. Click on the Add To Nodal Offset Calibration button.
If the tracking is not as accurate as expected, you can repeat this process using additional points.
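The nodal offset being solved here is a fixed transform between the pose the tracker reports and the optical center (nodal point) of the lens. Once calibrated, the engine composes that offset with the live tracker pose every frame. The numpy sketch below is a conceptual illustration with placeholder values, not the engine's implementation.

    import numpy as np

    # Conceptual sketch: compose a tracked pose with a fixed nodal offset (4x4 matrices).
    def make_transform(rotation=np.eye(3), translation=np.zeros(3)):
        t = np.eye(4)
        t[:3, :3] = rotation
        t[:3, 3] = translation
        return t

    tracker_pose = make_transform(translation=np.array([120.0, 35.0, 160.0]))  # from Live Link
    nodal_offset = make_transform(translation=np.array([0.0, -2.5, 8.0]))      # solved by calibration

    camera_pose = tracker_pose @ nodal_offset  # where the virtual camera ends up each frame
    print(camera_pose[:3, 3])                  # translation: 120, 32.5, 168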
Close the Lens File window and verify that the CineCamera Actor is now moving correctly as you move your tracker.
Section Results
In this section you used the Lens File to enter your lens information, calibrate lens distortion, and add the correct nodal offset. Your CineCamera Actor is now simulating your physical camera's position, rotation, and lens distortion.