Project Anywhere XR

With Unreal Engine and the HoloLens 2, learn how to use geospatial technology and data streaming with large datasets in augmented reality through the Project Anywhere XR sample.

Project Anywhere XR is a proof of concept to demonstrate geospatial technology, global 3D terrain, and data streaming to visualize the world as 3D tiles in Unreal Engine. This version of Project Anywhere demonstrates that it's possible to use large-scale datasets in a HoloLens 2 application.

This project leverages third-party plugins, including Cesium for Unreal and Microsoft's Mixed Reality UXTools, to bring the world to your tabletop in augmented reality.

Prerequisites

Before getting started, make sure you have the required third-party plugins installed and a Cesium Ion account set up.

You can find the required third-party plugins on the Unreal Engine Marketplace. If you launch the project without them, you will be asked to download and install them.

If you don't have a HoloLens 2 device, you can still run the project with in-editor emulation using Microsoft's Input Simulation included in the Mixed Reality UXTools plugin.

Project Setup

Follow these steps to set up a project with the Project Anywhere XR sample content and connect the assets to your Cesium Ion account.

  1. Download the Project Anywhere XR sample from the Epic Games Launcher and create a new project in the Unreal Editor.

  2. Log in to your Cesium Ion account.

  3. Go to the Asset Depot tab, and add the following 3D Tiles assets:

    • Aerometrex Denver Photogrammetry (#354307)

    • Bing Maps Aerial (#2)

    • Cesium OSM Buildings (#96188)

    • Cesium World Terrain (#1)

    • Melbourne Photogrammetry (#69380)

    Read the licensing conditions for each asset carefully. Some of these assets are for non-commercial use only.


  4. Go to the Access Tokens tab.


  5. Click Create Token to open the Create token panel.


  6. In the Create token panel:

    • Set the Name field to Project Anywhere XR

    • Under Scope, enable the following options:

      • assets:list

      • assets:read

      • profile:read

      • assets:write

      • geocode

    • Under Resources, select the radio button for Selected assets to show the list of Available Assets. It is best practice to scope the token to only the assets you're using in the application.

    • In the Available Assets list, select the following assets:

      • Aerometrex Denver Photogrammetry (#354307)

      • Bing Maps Aerial (#2)

      • Cesium OSM Buildings (#96188)

      • Cesium World Terrain (#1)

      • Melbourne Photogrammetry (#69380)

    • Click Create to create the token.


  7. Select Project Anywhere XR and copy the text under Token.


  8. In the Unreal Editor's World Outliner, select the following Cesium3DTileset Actors:

    • Cesium OSM Buildings

    • Cesium World Terrain

    • Denver

    • Melbourne Photogrammetry

  9. The Details panel will show common properties for each of these Actors. Find the Ion Access Token property, then paste the token you generated from your Cesium Ion account into it.


  10. Select the Cesium World Terrain Actor and select its Bing Maps Aerial Component.

  11. In the Details panel, paste the token you generated from your Cesium Ion account in the Ion Access Token property.


  12. After you set the token for the four Cesium3DTileset Actors and the Bing Maps Aerial Component, terrain data appears in the viewport.


  13. In the Toolbar, click Save Current to save your changes in the Level.
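
Steps 9 through 11 assign the token through the Details panel. If you ever need to make the same assignment from code, the hedged sketch below shows one possible approach in Unreal C++. It assumes your version of Cesium for Unreal exposes a SetIonAccessToken setter on ACesium3DTileset (some versions expose IonAccessToken only as an editable property), and the function name is illustrative.

    // Hedged sketch: assign one Cesium Ion access token to every Cesium3DTileset
    // in the level. Assumes the plugin provides SetIonAccessToken(); adjust for your
    // plugin version if the token is exposed only as the IonAccessToken property.
    #include "EngineUtils.h"          // TActorIterator
    #include "Cesium3DTileset.h"      // ACesium3DTileset (Cesium for Unreal plugin)

    void ApplyIonTokenToAllTilesets(UWorld* World, const FString& Token)
    {
        for (TActorIterator<ACesium3DTileset> It(World); It; ++It)
        {
            It->SetIonAccessToken(Token);   // should refresh the tileset with the new token
        }
    }

Note that the Bing Maps Aerial raster overlay on Cesium World Terrain keeps its own Ion Access Token property, so you still need to set it as described in steps 10 and 11.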

Viewing the Project

This application was designed for streaming content from a computer to the HoloLens 2 device through Microsoft's Holographic Remoting Player. By streaming content, you can load and render large datasets on your computer, then send the rendered frames to the device over the network. If you instead package the project as an application and deploy it to the device, your datasets must fit within the memory capacity of the device.

For instructions on how to stream to the device from the Unreal Editor with the Holographic Remoting Player, refer to Streaming to device with the Holographic Remoting Player.

When streaming to the device, increase the network transfer rate to 15000 kb/s to improve the experience with higher quality images.


If you don't have a HoloLens 2 device connected to your computer, you can view the project with in-editor emulation and keyboard and mouse input. To enable the in-editor emulation:

  1. In the main menu, choose Edit > Project Settings to open the Project Settings window.

  2. In the Project Settings window, under Platforms, select Windows Mixed Reality.

  3. In the Holographic Remoting section, set Enable Input Simulation to True.


Refer to Microsoft's documentation on Input Simulation for more details on how to use the in-editor emulation.

In-editor emulation of this project has the following limitations:

  • In-editor emulation does not support calibration. To automatically place the tabletop, calibration requires a spatial mesh, which the application creates when it scans your environment.

  • Applications do not receive pinch interaction events used for selecting Bookmarks when playing in-editor, but you can emulate these interactions for both hands using keys 1 and 2 on your keyboard.

When the project launches, a virtual tabletop appears. A section of the world displays on the table as a map with the following information:

  • The geographical coordinates for the center of the map, presented in the format: latitude / longitude / altitude

  • The real-world length of the area displayed in the map


Using the Hand Menu

The project includes a menu that you can access by opening one of your hands in front of you. The menu contains a model of the Earth with push pins marking specific locations that you can select to update the map, a slider to rotate the Earth model, and buttons to move the table or learn about the data displayed. The menu is defined in the Blueprint BP_HandMenu, using the UXTool Hand Menu for HoloLens 2.


The sections below describe how to interact with the menu and how it was created.

Go to a Specific Location with Bookmarks

The Earth model has push pins marking specific locations in the world that you can select and view on the tabletop.

To select a location, move your hand close to a pushpin until it turns red and a tooltip with the location name is displayed. Pinch with your fingers to make the tabletop update its map with this location.


To rotate the Earth model locally around its axis, pinch and drag the slider below the model, and release the slider to stop the rotation.


When the app starts, the pushpins are dynamically spawned based on the entries in the bookmarks data table DT_Bookmarks_PAXR in the BP_HandMenu Actor. The pushpin Blueprint BP_Bookmark_Button, which is a custom UxtPressableButton, is attached to the sphere component so the pushpins rotate with the Earth automatically.
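
The spawning itself is done in Blueprint, but the hedged C++ sketch below outlines the same idea: read every row of a bookmarks data table, spawn a pin for each entry, and attach it to the Earth sphere so it rotates with the model. FBookmarkRow and its columns are assumptions (the real DT_Bookmarks_PAXR layout may differ), and BookmarkButtonClass stands in for BP_Bookmark_Button.

    // Hedged sketch of the pushpin spawning, not the project's actual Blueprint graph.
    #include "Engine/DataTable.h"
    #include "GameFramework/Actor.h"
    #include "Components/SceneComponent.h"
    #include "BookmarkRow.generated.h"   // illustrative; required by UHT for the USTRUCT below

    USTRUCT(BlueprintType)
    struct FBookmarkRow : public FTableRowBase   // hypothetical row layout
    {
        GENERATED_BODY()

        UPROPERTY(EditAnywhere)
        FText DisplayName;               // tooltip text shown when a pin is highlighted

        UPROPERTY(EditAnywhere)
        FVector PinRelativeLocation;     // pin position relative to the Earth sphere
    };

    void SpawnBookmarkPins(UDataTable* BookmarksTable,
                           TSubclassOf<AActor> BookmarkButtonClass,
                           USceneComponent* EarthSphere)
    {
        TArray<FBookmarkRow*> Rows;
        BookmarksTable->GetAllRows(TEXT("SpawnBookmarkPins"), Rows);

        for (const FBookmarkRow* Row : Rows)
        {
            AActor* Pin = EarthSphere->GetWorld()->SpawnActor<AActor>(BookmarkButtonClass);
            if (!Pin)
            {
                continue;
            }

            // Attaching to the sphere component makes the pins rotate with the Earth.
            Pin->AttachToComponent(EarthSphere, FAttachmentTransformRules::KeepRelativeTransform);
            Pin->SetActorRelativeLocation(Row->PinRelativeLocation);
        }
    }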

The hand menu also contains the following buttons, which provide information about the displayed data and let you move the tabletop around in your environment:

  • Data Attribution: Read the data attribution for the map that is currently displayed.

  • Calibrate: When you first launch the app, the holographic table displays at a default location in your room. Press this button to automatically reposition the table directly in front of your view. Refer to the section Moving the Virtual Tabletop in Your Room for more details.

  • Move Up: Incrementally move the position of the table up along the Z axis.

  • Move Down: Incrementally move the position of the table down along the Z axis.

  • Rotate Left: Incrementally rotate the table left around the Z axis.

  • Rotate Right: Incrementally rotate the table right around the Z axis.

Explore the BP_HandMenu Blueprint

The menu is defined in the Blueprint BP_HandMenu, based on the UXTool Hand Menu for HoloLens 2. The menu shows and hides when UxtPalmUpConstraint is activated or deactivated. Button interactions are redirected to the appropriate actions in BP_AnywhereXRPlayerController. On BeginPlay, BP_HandMenu initializes the following:

  • Stores a reference to the BP_AnywhereXRPlayerController in the variable MRAnywhere Player Controller.

  • Creates and hides the Data Attribution Widget.

  • Binds all menu events to their corresponding event handlers.

  • Binds the slider event to the Earth model's rotation.

  • Reads all entries in the DT_Bookmarks_PAXR data table and uses them to spawn corresponding bookmarks.
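
The slider binding in the list above maps a slider value to the Earth model's rotation. The small sketch below shows one way to express that mapping in Unreal C++; the 0-to-1 value range and the function name are assumptions, since the actual binding is a Blueprint event on the UXTools slider.

    // Hedged sketch: map a normalized slider value (assumed to run from 0 to 1) to a
    // yaw rotation on the Earth model. In the project this binding lives in BP_HandMenu.
    #include "Components/SceneComponent.h"

    void ApplySliderToEarthRotation(USceneComponent* EarthSphere, float NormalizedValue)
    {
        const float YawDegrees = NormalizedValue * 360.0f;   // one full revolution across the slider
        EarthSphere->SetRelativeRotation(FRotator(0.0f, YawDegrees, 0.0f));
    }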

Moving the Virtual Tabletop in Your Room

When you launch the application for the first time, the virtual tabletop is displayed at a default location in your room. You can move it to a better location, preferably on a dark table, either with the Calibrate button or with the Move and Rotate buttons in the hand menu. When you place the table, ARPins anchor it to the world.

In the Hand Menu, press the Calibrate button to begin calibration. This will hide the current tabletop, then replace it with a new virtual tabletop for the calibration process. The new tabletop has the same shape and size as the previous one and is oriented in your viewing direction. The vector between your head location and the anchor point determines the direction of North on the tabletop's map.
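
As a rough illustration of how the head-to-anchor vector can define North, here is a minimal Unreal C++ sketch; the function name is illustrative and the project's actual Blueprint logic may differ in the details.

    // Hedged sketch: derive the tabletop's North (as a yaw rotation) from the vector
    // between the user's head and the placed anchor, ignoring any height difference.
    #include "Kismet/GameplayStatics.h"
    #include "Camera/PlayerCameraManager.h"

    FRotator ComputeTableNorth(const UObject* WorldContext, const FVector& AnchorLocation)
    {
        APlayerCameraManager* Camera = UGameplayStatics::GetPlayerCameraManager(WorldContext, 0);
        FVector ToAnchor = AnchorLocation - Camera->GetCameraLocation();
        ToAnchor.Z = 0.0f;   // ignore the height difference so the map stays level
        return FRotationMatrix::MakeFromX(ToAnchor.GetSafeNormal()).Rotator();
    }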

To confirm the location of the tabletop and the direction of North:

  • Move the calibration tabletop to the location of your choice by moving your head. The spatial mesh of your room is used during this operation.

  • Tap (pinch with your fingers) to release the calibration tabletop and set the table location.


If you want to fine-tune the location of the tabletop, press the Move and Rotate buttons in the menu to make incremental changes.


The Blueprint BP_AnywhereXRPlayerController contains the logic for calibration and moving the table. The following section explains some of the functions used for calibration.

Calibration Functions

The StartTable Calibration function is called from the Hand Menu's Calibrate button. When called, it performs the following actions:

  • Spawns BP_AnchorGizmo, which automatically goes into calibration mode and follows the device's view direction.

  • Binds itself to the Anchor Placed event triggered by BP_AnchorGizmo's Tap To Place.

  • Disables the virtual tabletop so it's hidden and has no collision.

Blueprint graph of the StartTable Calibration function.

The OnAnchorPlaced function is called when the user places a BP_AnchorGizmo. When called, it performs the following actions:

  • Takes the transform of the confirmed location and calls the function ChangeTableOrigin.

  • Destroys BP_AnchorGizmo.

  • Enables the tabletop at its new location so it's no longer hidden and has collision again.

Blueprint graph of the OnAnchorPlaced function.

The ChangeTableOrigin function is called by OnAnchorPlaced. When called, it performs the following actions:

  • Calls a specific Teleport function on the table, to ensure every dependent object is relocated appropriately.

  • Stores the location of the ARPin to the local store.

Blueprint graph of the ChangeTableOrigin function.
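
The OnAnchorPlaced and ChangeTableOrigin steps above can be pictured with the following hedged C++ sketch, written as free functions rather than the project's Blueprint functions. HolographicTable stands in for the tabletop Actor, the pin name is a placeholder, and the ARPin calls assume the engine's UARBlueprintLibrary anchor-store API is available on your target platform.

    // Hedged sketch of the OnAnchorPlaced / ChangeTableOrigin flow, not the actual Blueprints.
    #include "ARBlueprintLibrary.h"     // UARBlueprintLibrary, UARPin (AugmentedReality module)
    #include "GameFramework/Actor.h"

    void ChangeTableOrigin(AActor* HolographicTable, const FTransform& NewOrigin)
    {
        // Teleport the table so every dependent object is relocated consistently.
        HolographicTable->TeleportTo(NewOrigin.GetLocation(), NewOrigin.Rotator());

        // Pin the table's root component to the real world and persist the pin,
        // assuming the ARPin local store is supported (as it is on HoloLens 2).
        if (UARPin* Pin = UARBlueprintLibrary::PinComponent(
                HolographicTable->GetRootComponent(), NewOrigin))
        {
            UARBlueprintLibrary::SaveARPinToLocalStore(TEXT("TableOrigin"), Pin);
        }
    }

    void HandleAnchorPlaced(AActor* HolographicTable, const FTransform& AnchorTransform)
    {
        // Move the table to the confirmed location, then show it and restore collision.
        ChangeTableOrigin(HolographicTable, AnchorTransform);
        HolographicTable->SetActorHiddenInGame(false);
        HolographicTable->SetActorEnableCollision(true);
    }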

Interacting with the Map on the Virtual Table

The map on the virtual table supports hand interactions for panning and zooming. Rotation is not implemented, so the direction of North does not change for the map. Visual hints appear on the map to let you know you can interact with it. The interactions trace against the flat table geometry instead of the terrain.


To pan the map, pinch with one hand and move your hand in the direction you want to move the map. Release your pinch to release the map.


To zoom in and out on the map, pinch with both hands and move your hands away from each other to zoom out, and together to zoom in. Release your pinch to release the map.


Interaction logic for the map is defined in the BP_HolographicTable. The following considerations informed the design choices when the team implemented the map interactions:

  • HoloLens manipulations work only for a single object, but Cesium 3D Tilesets are made of several actors. The application intercepts the HoloLens manipulation interactions while keeping the table fixed, then applies the input translations and scale actions to the tilesets accordingly.

  • Since the Earth is round, panning far enough in one direction eventually takes you away from the surface of the Earth. At small scales this is not a problem, but when zoomed out to the maximum scale it becomes noticeable. The pan motion needs to account for the curvature of the Earth, but applying that curvature at every step can be expensive.

    To solve this, as soon as the motion exceeds a set threshold, the application relocates the origin of the tileset actors to the center of the table. The relocation is not noticeable and keeps the computations correct.

  • Since the map can show mountains, the height between the ellipsoid and the ground surface varies. If you zoom in and keep the ellipsoid origin at the center of the table, the ground surface floats too far above it, and this offset is different at each location on Earth. The tilesets could be moved vertically so they always fit on the table, but there is no way to know the minimum altitude of an arbitrary patch of land in advance.

    To solve this, the application determines the ground height through line traces, then applies a control algorithm to keep the ground stuck to the table, regardless of the current location or scale.

  • The HoloLens 2 interactions collide against any visible geometry. By default, the terrain displayed in the map would block raycasts from the HoloLens 2 and prevent them from hitting the table.

    To solve this, a custom collision channel called TileSets is set up in Project Settings. The collision settings for each Cesium3DTileset block collisions only on the TileSets channel and ignore the Visibility channel.

Because of these design choices, when you interact with the virtual tabletop, the application gathers the list of 3D tilesets, moves them horizontally and vertically, scales them to apply the zoom effect, and, once a displacement threshold is exceeded, relocates everything to make sure the map stays horizontal. The sketch below illustrates the ground-height part of this logic.
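
As a concrete illustration of the ground-height approach, here is a minimal, hypothetical sketch of a downward line trace on the custom TileSets channel. The ECC_GameTraceChannel1 mapping and all names are assumptions; check which game trace channel your TileSets channel maps to, and note that the actual control algorithm in BP_HolographicTable is more involved.

    // Hedged sketch: trace straight down over the table center on the custom TileSets
    // channel to find the ground height, then compute the vertical offset needed to
    // keep the terrain sitting on the table surface.
    #include "Engine/World.h"
    #include "Engine/EngineTypes.h"   // FHitResult

    bool ComputeGroundOffset(UWorld* World, const FVector& TableCenter,
                             float TableSurfaceZ, float& OutZOffset)
    {
        const FVector Start = TableCenter + FVector(0.0f, 0.0f, 10000.0f);
        const FVector End   = TableCenter - FVector(0.0f, 0.0f, 10000.0f);

        FHitResult Hit;
        // ECC_GameTraceChannel1 is assumed to be the custom TileSets channel.
        if (!World->LineTraceSingleByChannel(Hit, Start, End, ECC_GameTraceChannel1))
        {
            return false;   // no terrain under the table center yet (tiles may still be streaming)
        }

        // Move the tilesets down by this amount so the hit point lands on the table surface.
        OutZOffset = Hit.ImpactPoint.Z - TableSurfaceZ;
        return true;
    }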

Known Issues

The following are known issues you might encounter and workarounds for them:

  • If you package the application for Shipping and receive a Fatal Error message at startup, there are two ways to work around this issue:

    • Close all Unreal Editor instances that are using the Cesium For Unreal plugin, and remove the cesium-request cache files located in %LOCALAPPDATA%\UnrealEngine\4.27

    • Package the app for Development.

  • If you preview the application using the editor with VR Preview and the hand tracking works but interactions are not registered, restart both the HoloLens 2 device and the Unreal application.
