Google has informed developers of a vulnerability in versions of WebRTC earlier than M102. Impacts, workarounds, and updates can be found here.
The features below are exciting new tools we've added to Pixel Streaming. Though they provide new possibilities, it's important to note that they are unstable and should be used with caution. We recommend you do not build critical components of your product on them, as they may change or be removed in subsequent releases of Unreal Engine.
VCam
VCam is a new feature that lets you use the VCam actor to stream the video content of the Level Viewport to an output provider.
At this stage, VCam is mostly intended for virtual production use cases. It can be paired with the Live Link VCam iOS application and used for ARKit tracking. This is useful for piloting virtual cameras in Unreal Engine, with Pixel Streaming handling touch events and streaming the Level Viewport as real-time video feedback to the iOS device. For more information, see iOS Live Link VCam.
How to use VCam
Ensure you have the Virtual Camera plugin enabled.
Add the VCam actor, found under Virtual Production.
As soon as you add the actor, you'll be presented with the view of the VCam.
The actor also starts streaming immediately. You can start and stop the stream via the Pixel Streaming Toolbar.
Once started, open a local browser and navigate to 127.0.0.1 to see the streamed display, or open the Live Link VCam iOS application, enter your computer's IP address, and hit Connect.
If you want to interact with the stream through the browser, open the control panel in-browser and change the Control Scheme to Hovering.
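Alternatively, the default frontend accepts its settings as URL parameters, so you can set the control scheme directly in the address bar; treat the exact parameter name below as an assumption that may vary between frontend versions:
127.0.0.1?HoveringMouse=true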
Use Microphone
With Pixel Streaming, you can now enable in-engine playback of a particular peer's/player's microphone, transmitted as WebRTC audio from the web browser.
Setting up Use Microphone in Project
Making your project microphone compatible is extremely simple and only requires a single addition:
Enable the Pixel Streaming Plugin.
On any Actor in the scene, add the PixelStreamingAudio component. You can leave its settings at their defaults.
Each audio component associates itself with a particular Pixel Streaming player/peer (using the Pixel Streaming Player ID).
Using Microphone in Stream
Once your project is set up with the PixelStreamingAudio component, run your application as usual for Pixel Streaming (packaged or standalone with the Pixel Streaming launch arguments) and launch your signalling server (see the example commands after these steps).
Connect to your signalling server via a web browser.
Open the frontend settings panel and set Use Mic to true. Click Restart at the bottom to reconnect.
Your browser may ask for permission to use your microphone; make sure you allow access.
Speak into your microphone, and you should hear your voice played back through the stream!
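For reference, launching the signalling server and the application (the first step above) might look like the following two commands. MyProject.exe is a placeholder for your own packaged executable, and ws://127.0.0.1:8888 assumes the signalling server's default streamer port:
.\SignallingWebServer\platform_scripts\cmd\Start_Signalling.ps1
MyProject.exe -PixelStreamingURL=ws://127.0.0.1:8888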
Pixel Streaming in Virtual Reality
Virtual Reality (VR) Pixel Streaming is a new feature that lets users connect to a VR-compatible application through Pixel Streaming. This allows them to enjoy a VR experience with their own headsets, without running a local application.
Setting Up the Project
For this example, we'll use the Virtual Reality template project.
Create a new project using the Virtual Reality template.
Enable the Pixel Streaming plugin and disable the OpenXR plugin. Restart the editor.
In the Content Browser, search for "Asset_Guideline" and delete "B_AssetGuideline_VRTemplate". When prompted, click Force Delete.
Now search for "VRPawn" in the Content Browser. Double-click the VRPawn to open it, then compile the Blueprint; it should now compile successfully. Save and close the Blueprint.
Open Editor Preferences > Level Editor > Play and add the following launch parameters:
-PixelStreamingURL=ws://127.0.0.1:8888 -PixelStreamingEnableHMD
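If you later run a packaged build outside the editor, you would pass the same arguments on the command line, for example (MyProject.exe is a placeholder for your packaged executable):
MyProject.exe -PixelStreamingURL=ws://127.0.0.1:8888 -PixelStreamingEnableHMD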
Creating the Required Certificates
You need an HTTPS certificate to use VR with Pixel Streaming, because the WebXR standard makes the API available only to sites loaded over a secure connection (HTTPS). For production use, you will need a secure origin to support WebXR. You can find more information on these requirements here: https://developer.oculus.com/documentation/web/port-vr-xr/#https-is-required.
For this example, we'll set up a basic certificate via Git Bash. If you don't have Git Bash installed, head to this page for steps on how to install it: https://www.atlassian.com/git/tutorials/git-bash.
Create a certificates folder inside the SignallingWebServer directory.
Right-click inside the certificates directory and open Git Bash, then type in:
openssl req -x509 -newkey rsa:4096 -keyout client-key.pem -out client-cert.pem -sha256 -nodes
Press Enter multiple times until the command completes. You'll know it's done when it has created two .pem files in the certificates folder.
Open the config.json file found in the SignallingWebServer folder and set the UseHTTPS value to true.
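After that change, the UseHTTPS entry in config.json should read as follows (your file will contain many other settings, and whether a trailing comma is needed depends on the entry's position):
"UseHTTPS": true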
You should now be ready to run and test your VR application!
The certificate created above is only for testing purposes. For a full cloud deployment, you will need to obtain a proper certificate.
Joining the VR Stream
For this example, we'll be using the Meta Quest 2.
Start the Start_Signalling.ps1 script found in \SignallingWebServer\platform_scripts\cmd.
Going back to the editor, run the application standalone. Since you added the launch arguments in the previous steps, it should connect to the signalling server once it has fully started up.
Now, using your VR headset, open the web browser and enter your computer's IP address. You'll be presented with a "Connection not secure" page; open the "Advanced" tab and click "Proceed to IP".
You should see the application streamed to two views in the browser window. Click the XR button on the left to switch to VR.
Done! You should now be in your Pixel Streamed VR project!