Building Texture Streaming Data

For a given primitive component, the data required to accurately compute the required resolution of each texture it uses comes from:

  • The bounds of the primitive

  • The texture coordinate sizes of the mesh

  • The material texture coordinate scale for sampling each texture

When this information is missing, the streamer falls back on conservative heuristics.
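As a rough mental model, these three inputs combine into a wanted mip count for each texture. The sketch below is a simplified, hypothetical model: the function, parameter names, and screen-size factor are illustrative and not engine code.

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical simplified model of the streamer's resolution estimate.
// Inputs mirror the three data sources above: the primitive bounds (via
// the view distance to them), the mesh texcoord size (UV variation per
// world unit), and the material's sampling scale for this texture.
int RequiredMipCount(float DistanceToBounds, // from the primitive bounds
                     float MeshUVDensity,    // world-space texcoord size
                     float MaterialUVScale,  // per-texture sampling scale
                     int   TextureSizeTexels,
                     float ScreenSizeFactor = 1024.f)
{
    // Texels needed per world unit of surface for this texture.
    const float TexelsPerWorldUnit =
        MeshUVDensity * MaterialUVScale * static_cast<float>(TextureSizeTexels);

    // Crude screen-space projection: the needed resolution shrinks
    // with distance to the bounds.
    const float WantedResolution =
        TexelsPerWorldUnit * ScreenSizeFactor / std::max(DistanceToBounds, 1.f);

    // Mips to stream in: log2 of the wanted resolution, clamped to the
    // texture's full mip chain.
    const int FullMipCount =
        1 + static_cast<int>(std::floor(std::log2(TextureSizeTexels)));
    const int WantedMips =
        1 + static_cast<int>(std::ceil(std::log2(std::max(WantedResolution, 1.f))));
    return std::clamp(WantedMips, 1, FullMipCount);
}
```

Note how the estimate degrades to the full mip chain when the viewpoint is very close, and to a couple of mips when far away; the real streamer works with this kind of distance-driven wanted-mip value.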

To build Texture Streaming data:

  1. Click on the dropdown arrow next to the Build button in the Toolbar.

  2. Click on Build Texture Streaming.


    This will generate component and level data to be used at runtime.


Note that this generates the data to be used for all material quality levels and platform feature levels, using the highest quality level and supported feature level.

At runtime, if the component uses fewer textures than were seen during the build, the unused entries are ignored. If your project uses textures for which there is no built data, those textures fall back on the conservative heuristics.

Build Accuracy

The accuracy of the data computed during the texture streaming build can be inspected by looking at the texture streaming accuracy view modes:


These view modes show the accuracy of the different built data. When a value is completely off (red or green), it is sometimes possible to correct the data with manual configuration changes. The texture streaming build aims to generate the best data possible without requiring manual tweaks.


The above image is the scene used to build the texture data shown below. For all the view modes, use the following legend:

  • Red: 2X+ mips missing

  • Yellow: 1 mip missing

  • White: Good data

  • Cyan: 1 extra mip

  • Green: 2X+ extra mips
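Read as a mip-count delta (streamed mips minus wanted mips), the legend above can be sketched as a small hypothetical mapping; the function name is illustrative, not engine code.

```cpp
// Hypothetical mapping from (streamed mips - wanted mips) to the
// accuracy view mode legend colors described above.
const char* AccuracyColor(int ExtraMips)
{
    if (ExtraMips <= -2) return "Red";    // 2X+ mips missing
    if (ExtraMips == -1) return "Yellow"; // 1 mip missing
    if (ExtraMips ==  0) return "White";  // good data
    if (ExtraMips ==  1) return "Cyan";   // 1 extra mip
    return "Green";                       // 2X+ extra mips
}
```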

Primitive Distance Accuracy

This shows the accuracy of the view-mesh distance computed by the streamer compared to the real GPU distance. The streamer computes the distance from the viewpoint to the texture instance axis-aligned bounding box (AABB). This bounding box is computed in the build process by summing up the component LOD-section AABBs that are using the texture.

When the viewpoint gets very close to or enters this AABB, this mode will always show as 2X+ over unless the geometry tightly fits the AABB. This is because the streamer's computed distance becomes 0 while the GPU distance does not, so the ratio between the two becomes very large. This is not an error and there is no need to try to fix it.
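The streamer-side distance described above is a standard point-to-AABB query. A minimal sketch (plain C++, not engine code) also shows why the value collapses to 0 inside the box:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float X, Y, Z; };

// Distance from a viewpoint to an axis-aligned bounding box: the quantity
// the streamer uses in place of the true per-pixel GPU distance. Inside
// the box every per-axis term is 0, so the distance is 0, which is why
// this view mode reads 2X+ over when the camera enters a box that the
// geometry does not fill.
float DistanceToAABB(const Vec3& Point, const Vec3& BoxMin, const Vec3& BoxMax)
{
    const float Dx = std::max({BoxMin.X - Point.X, 0.f, Point.X - BoxMax.X});
    const float Dy = std::max({BoxMin.Y - Point.Y, 0.f, Point.Y - BoxMax.Y});
    const float Dz = std::max({BoxMin.Z - Point.Z, 0.f, Point.Z - BoxMax.Z});
    return std::sqrt(Dx * Dx + Dy * Dy + Dz * Dz);
}
```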


The value displayed here can be adjusted by changing the mesh component's Streaming Distance Multiplier.


Mesh UV Density Accuracy

This view mode shows the accuracy of the mesh world texcoord size used by the streamer compared to the real GPU value. This size expresses how much the texcoords vary per unit of world space. The streamer uses it to evaluate the impact of the texcoords on the sampling of the texture.


This view mode is relatively viewpoint-agnostic, and when a mesh has bad values, they are usually mesh-related rather than scene-related. That means that if a mesh has wrong data, any component using that mesh will probably have bad data. This value can be changed within the Static Mesh or Skeletal Mesh Editors by tweaking the StreamingDistanceMultiplier until the mesh is in the Good range.
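One way to picture this world texcoord size is as UV distance over world distance along a mesh edge. The sketch below is a hypothetical per-edge estimate under that assumption; the real build aggregates such values over the whole mesh, and the struct and function are illustrative, not engine API.

```cpp
#include <cmath>

// Minimal vertex: world position plus one texcoord channel.
struct Vtx { float Px, Py, Pz; float U, V; };

// Hypothetical per-edge estimate of the world texcoord size: how much
// the UVs vary per world-space unit along the edge from A to B.
float EdgeUVDensity(const Vtx& A, const Vtx& B)
{
    const float WorldLen = std::sqrt((B.Px - A.Px) * (B.Px - A.Px) +
                                     (B.Py - A.Py) * (B.Py - A.Py) +
                                     (B.Pz - A.Pz) * (B.Pz - A.Pz));
    const float UVLen = std::sqrt((B.U - A.U) * (B.U - A.U) +
                                  (B.V - A.V) * (B.V - A.V));
    // Degenerate edges contribute no density.
    return WorldLen > 0.f ? UVLen / WorldLen : 0.f;
}
```

For example, an edge 100 world units long that spans the full 0-to-1 U range yields a density of 0.01, meaning the texture wraps once per 100 world units.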

Material Texture Scale Accuracy

Most of the textures used in a material are sampled using a scaled value of one of the Mesh UV Densities. The texture streaming build tries to compute which texture coordinate is used and what scale is applied to each sampled texture. This could fail for many reasons, in which case the streamer falls back on assuming the texture was sampled with texcoord 0 and a scale of 1.
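The fallback behavior can be sketched as follows; the struct and function names are illustrative and not engine API.

```cpp
#include <optional>

// Result of the build's per-texture analysis: which texcoord channel the
// material samples this texture with, and the scale applied to it.
struct TextureScaleInfo { int TexCoordIndex; float Scale; };

// When the analysis could not determine the sampling (std::nullopt), the
// streamer falls back on texcoord 0 with a scale of 1, as described above.
TextureScaleInfo ResolveScale(const std::optional<TextureScaleInfo>& Analyzed)
{
    return Analyzed.value_or(TextureScaleInfo{0, 1.f});
}
```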


In addition to the standard legend, a black-and-white checkerboard indicates that the build could not generate the required shader for some reason. Also, because the material samples many textures, this view mode shows the worst error (in terms of under-streaming and over-streaming) across all textures sampled. These two extreme values are displayed through the checkerboard look: one color shows the worst oversampling while the other shows the worst undersampling.

Individual errors per texture can be investigated by using the console command r.Streaming.AnalysisIndex X, where X is a value between 0 and 31.