Getting Started with Pixel Streaming in Unreal Engine

Get up and running with streaming an Unreal Engine application from one computer to other computers and mobile devices on the same network.


Follow the steps below to stream the rendered output from your Unreal Engine Project over your local network to browsers and mobile devices.

The images for the steps on this page illustrate the procedure using a Project built from the Third-Person Blueprint template. However, the same steps should work for any Unreal Engine Project.

Prerequisites

  • Check your OS and hardware - The Pixel Streaming Plugin can only encode video on computers running Windows and Linux operating systems, with certain specific types of GPU hardware. For details, see the Pixel Streaming Reference.

  • Install Node.js - If you don't already have Node.js installed on your computer, you'll need to download and install it.

  • Open network ports - Make sure you have the following network ports open for communication on your local network: 80, 8888. If you need to change these defaults, see the Pixel Streaming Reference.

  • Stop other web servers - If your computer is running any other Web servers, stop them for now.

  • IP addresses - You'll need to know the IP address of your computer if you intend to test Pixel Streaming over the internet.
    However, it's a good idea to get started with Pixel Streaming within a LAN or VPN first, which means you can use localhost or 127.0.0.1 as your Pixel Streaming IP address. If you are trying to connect from a machine on a different network, you'll likely need to configure your signalling server to use a STUN/TURN server. See the Pixel Streaming Reference page for details on how to configure your signalling server with peerConnectionOptions that specify a STUN/TURN server.
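    As a rough sketch of what that looks like (key names follow recent versions of the Pixel Streaming Infrastructure; the STUN server shown is Google's public one, used purely as an example), the Signalling Server's config.json carries peerConnectionOptions as a stringified JSON object:

    ```json
    {
      "peerConnectionOptions": "{\"iceServers\":[{\"urls\":[\"stun:stun.l.google.com:19302\"]}]}"
    }
    ```

    Note that the value is a string containing escaped JSON, not a nested object; check the config.json shipped with your infrastructure version for the exact shape.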

1 - Prepare Your Unreal Engine Application

In this step, you will create a standalone executable file for your Project.

  • The Pixel Streaming Plugin only works when you run your Project as a packaged application, or when you launch it from the Unreal Editor using the Standalone Game option.

  • In order for the Pixel Streaming Plugin to extract and stream audio from your application, you need to start the Unreal Engine with a special command-line flag: -AudioMixer. The procedure below shows how to set this up for both scenarios.

  1. Open your Project in the Unreal Editor.

  2. From the main menu in the Unreal Editor, select Edit > Plugins.

  3. Under the Graphics category, find the Pixel Streaming Plugin and check its Enabled box.
    Enable the Pixel Streaming plugin

  4. Click Restart Now to restart your Project and apply the change.
    Restart now

  5. Back in the Unreal Editor, choose Edit > Project Settings from the main menu. 

  6. If your Project involves a character, and you want to enable input from touch devices such as phones and tablets to move that character around the Level, you may want to show the on-screen touch controllers.
    Under the Engine > Input category, find and enable the Always Show Touch Interface setting.
    Always Show Touch Interface
    This is optional, and not required for all Projects. However, for Projects like the Third-Person Template, this makes sure that users with touch devices can control the streamed application (as long as the Project's Player Controller supports touch input).

  7. From the main menu, choose Edit > Editor Preferences...

  8. Under the Level Editor > Play category, find the Additional Launch Parameters setting, and set its value to -AudioMixer -PixelStreamingIP=localhost -PixelStreamingPort=8888.
    Additional Launch Parameters

  9. Package your Project for Windows. From the main menu in the Unreal Editor, choose File > Package Project > Windows (64-bit).
    Package for Windows 64-bit

  10. Browse to the folder on your computer where you want the Unreal Editor to place the packaged version of your Project, and click Select Folder.
    Select a folder

  11. The Unreal Editor begins the packaging process.
    Packaging progress indicator

  12. When the packaging process is finished, go to the folder that you selected in step 10 above. You'll find a folder called Windows with contents similar to the following:
    Packaged output

  13. Every time you start your packaged application, you need to pass it the same command-line flags set in step 8 above. One way to do this is to set up a shortcut:

    1. Press Alt and drag your .exe file to create a new shortcut in the same folder (or anywhere else you like on your computer).
      Create a shortcut

    2. Right-click the shortcut and choose Properties from the context menu.
      Shortcut properties

    3. On the Shortcut tab of the Shortcut Properties window, append the text -AudioMixer -PixelStreamingIP=localhost -PixelStreamingPort=8888 at the end of the Target field, and click OK.
      Command line parameters

Once you've gotten the Pixel Streaming system up and running, you may also want to add the -RenderOffScreen command-line parameter. If your Unreal Engine application window ever gets accidentally minimized, the Pixel Streaming video and input streams will stop working. -RenderOffScreen avoids this possibility by running the application in a headless mode without any visible window.
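Putting the pieces together, the launch flags from the steps above can be collected as follows (MyProject.exe is a hypothetical name; substitute your own packaged executable):

```shell
# Sketch: the full set of launch flags from the steps above, plus -RenderOffScreen.
# "MyProject.exe" is a hypothetical name -- substitute your packaged executable.
LAUNCH_FLAGS="-AudioMixer -PixelStreamingIP=localhost -PixelStreamingPort=8888 -RenderOffScreen"
echo "MyProject.exe $LAUNCH_FLAGS"
```

This is the same flag string you would append to the shortcut's Target field.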

End Result

You now have a packaged, standalone Unreal Engine application that has the Pixel Streaming Plugin enabled, ready to stream its rendered frames and audio.

2 - Get the Pixel Streaming Servers

Recent changes to Pixel Streaming have moved the front end and web server elements of Pixel Streaming to an external repository. We refer to this as the Pixel Streaming Infrastructure.

There are a few ways to access the Pixel Streaming infrastructure.

  1. Directly access the GitHub repository, found here: https://github.com/EpicGames/PixelStreamingInfrastructure

  2. Use git clone --branch UE5.1 https://github.com/EpicGames/PixelStreamingInfrastructure.git in your preferred terminal (make sure you have git installed).

  3. Navigate to \Engine\Plugins\Media\PixelStreaming\Resources\WebServers and run the get_ps_servers command (use the .bat script on Windows and the .sh script on Linux). This will automatically pull the relevant branch of the Pixel Streaming Infrastructure into that folder.

The git command mentioned above will pull the 5.1 branch of the infrastructure. If you need a different branch, please modify the git command accordingly.

For more information about the Pixel Streaming front end and web server changes, see Pixel Streaming Infrastructure.

3 - Start the Servers

In this step, you will start the web services that establish peer-to-peer connections between your Unreal Engine application and clients' browsers. If you skipped the previous step, you will not have these servers available.

The following steps assume you are using Windows. The process on Linux is the same, except that you run the scripts in the SignallingWebServer/platform_scripts/bash folder instead.

  1. In the location where you pulled the Pixel Streaming Infrastructure, find the Signalling Server in the SignallingWebServer folder.


  2. Prepare the Signalling Server by opening PowerShell as Administrator and running SignallingWebServer\platform_scripts\cmd\setup.ps1. This installs all the required dependencies.

  3. Start the Signalling Server by running SignallingWebServer\platform_scripts\cmd\Start_SignallingServer.ps1. When the server has started and is ready to accept connections, you'll see the following lines in the console window:

    WebSocket listening to Streamer connections on :8888
    WebSocket listening to Players connections on :80
    Http listening on *: 80
  4. Now, start the Unreal Engine application from the shortcut that you created in the previous section. Or, if you prefer to launch your application from the command line, execute the following command:

    MyPixelStreamingApplication.exe -PixelStreamingIP=127.0.0.1 -PixelStreamingPort=8888

For convenience, when you package your Unreal Engine application, these servers are also copied to the folder that contains your packaged executable. You'll find them under the Engine sub-folder, at the same paths indicated above. You can launch the servers from there instead of launching them from your Unreal Engine installation folder.
However, remember that if you need to modify any files in these folders, particularly the player page or configuration file for the Signalling and Web Server, you should modify them in the original location. If you modify them in your package folder, your changes may be overwritten the next time you package your application.

End Result

When the Unreal Engine application connects to the Signalling and Web Server, you should see the following line of output in the console window opened by the Signalling and Web Server:

Streamer connected: ::1

This means that you now have the Unreal Engine application running with the Pixel Streaming Plugin enabled, and the front-end Signalling and Web Server is ready to route connecting clients to the Unreal Engine application.

If necessary, you can stop and restart the Unreal Engine application and the Signalling and Web Server independently. As long as they're both running at the same time, they should be able to reconnect automatically.

At this point, you have everything you need set up and working on your computer. All that's left is to connect a browser.

4 - Connect!

In this step, you'll connect Web browsers running on multiple different devices to your Pixel Streaming broadcast.

  1. On the same computer that is running your Unreal Engine application, Alt-Tab to switch the focus away from the Unreal Engine application, and start a supported Web browser (Google Chrome and Mozilla Firefox are safe choices).

  2. In the address bar, navigate to http://127.0.0.1. This is the IP address of the local machine, so the request will be served by the Signalling Server:
    Connect to the localhost

  3. Click the page to connect, then click again on the Play button to start the stream. 

  4. You'll now be connected to your application, and you should see the rendered output streaming into the middle of the player Web page:
    Media streaming to localhost
    The default player page is already set up to forward keyboard, mouse, and touchscreen input to the Unreal Engine, so you can control the application and navigate around exactly the way you would if you were controlling the app directly. 

  5. Click the Add (+) button at the left of the window to expand some built-in options for controlling the stream:

    Enlarge Display to Fill Window - Determines whether the media player resizes to fit the current size of the browser window, or remains at a fixed size and position.

    Is Quality Controller? - Makes the Pixel Streaming Plugin's encoder use the current browser connection to determine the available bandwidth, and therefore the quality of the stream encoding.
    Although Pixel Streaming adapts the quality of the stream to match the available bandwidth, the video frames are only encoded once by the Pixel Streaming Plugin. That one encoding is used for all clients. Therefore, only one client connection can "own" the quality used for adaptive streaming. If the other clients have a much better connection to the server, they may end up seeing a lower quality stream than necessary. On the other hand, if other clients have a much worse connection to the server, they may end up with lag or jitter.
    By default, each time a new browser connects, it takes ownership of the stream. Use this checkbox from any other connected browser to retake ownership.

    Match Viewport Resolution - Resizes the Unreal Engine application's rendering resolution to match the current size of the browser viewport.

    Offer to Receive - Makes the browser start the WebRTC handshake instead of the Unreal Engine application. This is an advanced setting for people customizing the front end.

    Prefer SFU - Attempts to connect through the Selective Forwarding Unit (SFU); this only works if you have one running.

    Use Microphone - Starts receiving audio input from your microphone and playing it back through the stream. This currently only works locally.

    Force Mono Audio - Sends all audio as mono instead of stereo.

    Force TURN - Attempts to connect exclusively through the TURN server. This will not work without an active TURN server.

    Control Scheme - Dictates whether the stream captures your mouse or leaves it free.

    Hide Browser Cursor - Toggles the visibility of your cursor while it is over the stream. Useful if you are using a custom cursor in your project.

    Show FPS - Displays the current FPS.

    Request Keyframe - Requests a keyframe from the stream. This is helpful if your stream is choppy and needs to catch up.

    Encoder Settings - Lets you specify a minimum and maximum QP (quantization parameter). Lower values mean higher quality, but higher bitrate. The maximum value is 51; -1 disables any hard limit.

    WebRTC Settings - Lets you specify the FPS at which WebRTC streams, as well as a minimum and maximum bitrate. Try not to set these too high, as WebRTC may start dropping frames.

    Stream Settings - Currently experimental; will be used to support multiple video tracks/streams in the future.

    Restart Stream - Restarts the peer connection to the stream. Use this after changing the settings above to ensure they are applied.

    See the contents of the player.html and app.js files under the Signalling Web Server folder to find out how these controls are implemented.
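    For illustration, the encoder QP limits can also be pinned when you launch the application, so every session starts with the same quality bounds. The flag names below follow the Pixel Streaming Reference for recent engine versions; treat them as an assumption and verify them against your engine version.

    ```shell
    # Hypothetical sketch: pin the encoder's QP range at launch time.
    # Flag names are taken from the Pixel Streaming Reference and may vary by
    # engine version; "MyProject.exe" is a placeholder for your packaged executable.
    QP_FLAGS="-PixelStreamingEncoderMinQP=20 -PixelStreamingEncoderMaxQP=40"
    echo "MyProject.exe -AudioMixer -PixelStreamingIP=localhost -PixelStreamingPort=8888 $QP_FLAGS"
    ```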

  6. Now, on other computers and/or mobile devices on your network, repeat the same steps. Instead of using 127.0.0.1, direct the browser to the IP address of the computer running the Unreal Engine application and the Signalling Server.
    Media streaming to remote host

End Result

You now have one instance of the Unreal Engine running on your computer, broadcasting a media stream to multiple devices over your local network. Each connected device sees the same view of the same Level, all rendered on the same original desktop PC.

By default, all connected devices share control over the Unreal Engine application, forwarding all keyboard, mouse, and touchscreen inputs.

End-result screenshots: Desktop (Chrome), iPhone (Safari), Android (Google Pixel).

5 - On Your Own

The steps above walk you through a relatively simple setup that uses a single server host and a default player page. With a little more effort, you can take the Pixel Streaming system much farther. For example:

  • You can completely redesign the player HTML page to meet the needs of your Project. Control who can send input to the Unreal Engine application, and even create HTML5 UI elements on the page that emit custom gameplay events to the Unreal Engine.
    For details, see Customizing the Player Web Page. For a working example, see the Pixel Streaming Demo available in the Learn tab of the Epic Games Launcher.

  • If you need to provide Pixel Streaming services over the open Internet, or across subnets, you will likely need to do some more advanced network configuration. Alternatively, you may prefer to have each connecting client stream content from a separate instance of the Unreal Engine, or through a separate player page that offers different controls.
    For details on topics like these, see the Hosting and Networking Guide.

  • Each component of the Pixel Streaming system has a number of configuration properties that you can use to control encoding resolution, screen size, IP addresses and communication ports, and more.
    For information on all these properties and how to set them, see the Pixel Streaming Reference.

  • To check out new experimental features in Pixel Streaming, see the Experimental Pixel Streaming Features page.

  • The Stream Tuning Guide page can help you gain extra control over your stream's quality and settings.