unreal.ARSessionConfig

class unreal.ARSessionConfig(outer: Object | None = None, name: Name | str = 'None')

Bases: DataAsset

An Unreal Data Asset that defines what features are used in the AR session.

C++ Source:

  • Module: AugmentedReality

  • File: ARSessionConfig.h

Editor Properties: (see get_editor_property/set_editor_property)

  • candidate_images (Array[ARCandidateImage]): [Read-Write] The list of candidate images to detect within the AR camera view. This feature is used by ARKit.

  • candidate_objects (Array[ARCandidateObject]): [Read-Write] The list of candidate objects to search for in the scene. This feature is used by ARKit.

  • default_mesh_material (MaterialInterface): [Read-Write] The default mesh material used by the generated mesh component.

  • default_wireframe_mesh_material (MaterialInterface): [Read-Write] The default mesh material used by the wireframe setting of the generated mesh component. Note: It is recommended to ignore this wireframe feature and use a wireframe material for the DefaultMeshMaterial instead.

  • desired_video_format (ARVideoFormat): [Read-Write] The desired video format (or the default, if not supported) that this session should use if the camera is enabled. Use GetSupportedVideoFormats to get a list of device-supported formats.

  • enable_auto_focus (bool): [Read-Write] Boolean to determine whether the camera should autofocus. Autofocus can cause subtle shifts in position for small objects farther from the camera. This feature is used by ARCore and ARKit.

  • enable_automatic_camera_overlay (bool): [Read-Write] Boolean to determine whether the AR camera feed should be drawn as an overlay. Defaults to true. This feature is used by ARCore and ARKit.

  • enable_automatic_camera_tracking (bool): [Read-Write] Boolean to determine whether the virtual camera should track the device movement. Defaults to true. This feature is used by ARCore and ARKit.

  • enabled_session_tracking_feature (ARSessionTrackingFeature): [Read-Write] see: EARSessionTrackingFeature

  • environment_capture_probe_type (AREnvironmentCaptureProbeType): [Read-Write] see: EAREnvironmentCaptureProbeType

  • environment_probe_component_class (type(Class)): [Read-Write] see: UAREnvironmentProbeComponent

  • face_component_class (type(Class)): [Read-Write] see: UARFaceComponent

  • face_tracking_direction (ARFaceTrackingDirection): [Read-Write] see: EARFaceTrackingDirection

  • face_tracking_update (ARFaceTrackingUpdate): [Read-Write] see: EARFaceTrackingUpdate

  • frame_sync_mode (ARFrameSyncMode): [Read-Write] see: EARFrameSyncMode

  • generate_collision_for_mesh_data (bool): [Read-Write] Boolean to determine whether the AR system should generate collision data from the mesh data.

  • generate_mesh_data_from_tracked_geometry (bool): [Read-Write] Boolean to determine whether the AR system should generate mesh data that can be used for rendering, collision, NavMesh, and more. This feature is used by OpenXR, Windows Mixed Reality.

  • generate_nav_mesh_for_mesh_data (bool): [Read-Write] Boolean to determine whether the AR system should generate navigation mesh data from the mesh data.

  • geo_anchor_component_class (type(Class)): [Read-Write] see: UARGeoAnchorComponent

  • horizontal_plane_detection (bool): [Read-Write] Boolean to determine whether flat, horizontal surfaces are detected. This feature is used by ARCore and ARKit.

  • image_component_class (type(Class)): [Read-Write] see: UARImageComponent

  • light_estimation_mode (ARLightEstimationMode): [Read-Write] see: EARLightEstimationMode

  • max_num_simultaneous_images_tracked (int32): [Read-Write] The maximum number of images to track at the same time. Defaults to 1. This feature is used by ARKit.

  • max_number_of_tracked_faces (int32): [Read-Write] The maximum number of faces to track simultaneously. This feature is used by ARKit.

  • mesh_component_class (type(Class)): [Read-Write] see: UARMeshComponent

  • object_component_class (type(Class)): [Read-Write] see: UARObjectComponent

  • plane_component_class (type(Class)): [Read-Write] see: UARPlaneComponent

  • point_component_class (type(Class)): [Read-Write] see: UARPointComponent

  • pose_component_class (type(Class)): [Read-Write] see: UARPoseComponent

  • qr_code_component_class (type(Class)): [Read-Write] see: UARQRCodeComponent

  • render_mesh_data_in_wireframe (bool): [Read-Write] Boolean to determine whether the AR system should render the mesh data as wireframe. It is recommended to ignore this setting and instead set the DefaultMeshMaterial to the desired material, including a wireframe material.

  • reset_camera_tracking (bool): [Read-Write] Boolean to determine whether the AR system should reset camera tracking, such as its origin and transforms, when a new AR session starts. Defaults to true. This feature is used by ARKit.

  • reset_tracked_objects (bool): [Read-Write] Boolean to determine whether the AR system should remove any tracked objects when a new AR session starts. Defaults to true. This feature is used by ARKit.

  • scene_reconstruction_method (ARSceneReconstruction): [Read-Write] see: EARSceneReconstruction

  • session_type (ARSessionType): [Read-Write] see: EARSessionType

  • track_scene_objects (bool): [Read-Write] Boolean to determine whether the AR system should track scene objects. see: EARObjectClassification::SceneObject

  • use_automatic_image_scale_estimation (bool): [Read-Write] Boolean to determine whether to automatically estimate and set the scale of a detected, or tracked, image. This feature is used by ARKit.

  • use_mesh_data_for_occlusion (bool): [Read-Write] Boolean to determine whether the AR system should render the mesh data as occlusion meshes.

  • use_optimal_video_format (bool): [Read-Write] Boolean to determine whether to automatically pick the video format that best matches the device screen size.

  • use_person_segmentation_for_occlusion (bool): [Read-Write] Boolean to determine whether to use the person segmentation results for occluding virtual content. This feature is used by ARKit.

  • use_scene_depth_for_occlusion (bool): [Read-Write] Boolean to determine whether to use the scene depth information for occluding virtual content. This feature is used by ARCore and ARKit.

  • use_standard_onboarding_ux (bool): [Read-Write] Boolean to determine whether to use the standard onboarding UX, if the system supports it. This feature is used by ARKit.

  • vertical_plane_detection (bool): [Read-Write] Boolean to determine whether flat, vertical surfaces are detected. This feature is used by ARCore and ARKit.

  • world_alignment (ARWorldAlignment): [Read-Write] see: EARWorldAlignment

  • world_map_data (Array[uint8]): [Read-Only] A previously saved world that will be loaded when the session starts. This feature is used by ARKit.
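
The properties above are regular editor properties, so they can be read and written from editor Python through get_editor_property/set_editor_property before a session is started. A minimal sketch, assuming an ARSessionConfig asset already exists at the hypothetical path /Game/AR/DefaultARConfig and that the session is launched through unreal.ARBlueprintLibrary.start_ar_session:

    import unreal

    # Load an existing ARSessionConfig data asset (hypothetical path).
    config = unreal.load_asset('/Game/AR/DefaultARConfig', unreal.ARSessionConfig)

    # Toggle a few of the boolean properties listed above.
    config.set_editor_property('horizontal_plane_detection', True)
    config.set_editor_property('vertical_plane_detection', False)
    config.set_editor_property('enable_auto_focus', True)
    config.set_editor_property('generate_mesh_data_from_tracked_geometry', True)

    # Read a property back to confirm the change.
    print(config.get_editor_property('horizontal_plane_detection'))

    # Start an AR session that uses this configuration (requires an AR-capable device).
    unreal.ARBlueprintLibrary.start_ar_session(config)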

add_candidate_image(new_candidate_image) None

Add a new CandidateImage to the ARSessionConfig.

Parameters:

new_candidate_image (ARCandidateImage) –
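
A sketch of managing the candidate image list with this and the related methods below (clear_candidate_images, get_candidate_image_list, remove_candidate_image, remove_candidate_image_at_index), assuming ARCandidateImage assets already exist at the hypothetical paths shown:

    import unreal

    config = unreal.load_asset('/Game/AR/DefaultARConfig', unreal.ARSessionConfig)

    # Load previously authored ARCandidateImage assets (hypothetical paths).
    poster = unreal.load_asset('/Game/AR/Images/Poster', unreal.ARCandidateImage)
    logo = unreal.load_asset('/Game/AR/Images/Logo', unreal.ARCandidateImage)

    config.clear_candidate_images()            # start from an empty list
    config.add_candidate_image(poster)
    config.add_candidate_image(logo)
    print(config.get_candidate_image_list())   # -> [poster, logo]

    # Removal matches the image object itself, not the image content.
    config.remove_candidate_image(poster)
    config.remove_candidate_image_at_index(0)  # removes 'logo', the only remaining entry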

add_candidate_object(candidate_object) None

see: CandidateObjects

Parameters:

candidate_object (ARCandidateObject) –

clear_candidate_images() None

Remove all candidate images from the ARSessionConfig.

get_candidate_image_list() Array[ARCandidateImage]

see: CandidateImages

Return type:

Array[ARCandidateImage]

get_candidate_object_list() Array[ARCandidateObject]

see: CandidateObjects

Return type:

Array[ARCandidateObject]
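
Candidate objects follow the same pattern, using add_candidate_object and set_candidate_object_list (documented further below). A short sketch, again assuming the referenced ARCandidateObject asset exists at a hypothetical path:

    import unreal

    config = unreal.load_asset('/Game/AR/DefaultARConfig', unreal.ARSessionConfig)
    statue = unreal.load_asset('/Game/AR/Objects/Statue', unreal.ARCandidateObject)

    config.add_candidate_object(statue)          # append a single object
    config.set_candidate_object_list([statue])   # or replace the whole list at once
    print(config.get_candidate_object_list())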

get_desired_video_format() ARVideoFormat

see: DesiredVideoFormat

Return type:

ARVideoFormat

get_enabled_session_tracking_feature() ARSessionTrackingFeature

see: EnabledSessionTrackingFeatures

Return type:

ARSessionTrackingFeature

get_environment_capture_probe_type() AREnvironmentCaptureProbeType

see: EnvironmentCaptureProbeType

Return type:

AREnvironmentCaptureProbeType

get_face_tracking_direction() ARFaceTrackingDirection

see: FaceTrackingDirection

Return type:

ARFaceTrackingDirection

get_face_tracking_update() ARFaceTrackingUpdate

see: FaceTrackingUpdate

Return type:

ARFaceTrackingUpdate

get_frame_sync_mode() ARFrameSyncMode

see: FrameSyncMode

Return type:

ARFrameSyncMode

get_light_estimation_mode() ARLightEstimationMode

see: LightEstimationMode

Return type:

ARLightEstimationMode

get_max_num_simultaneous_images_tracked() int32

see: MaxNumSimultaneousImagesTracked

Return type:

int32

get_plane_detection_mode() ARPlaneDetectionMode

see: PlaneDetectionMode

Return type:

ARPlaneDetectionMode

get_scene_reconstruction_method() ARSceneReconstruction

see: SceneReconstructionMethod

Return type:

ARSceneReconstruction

get_session_type() ARSessionType

see: SessionType

Return type:

ARSessionType

get_world_alignment() ARWorldAlignment

see: WorldAlignment

Return type:

ARWorldAlignment

get_world_map_data() Array[uint8]

see: WorldMapData

Return type:

Array[uint8]

remove_candidate_image(candidate_image) None

Remove a candidate image from the ARSessionConfig by object reference. Note that the image object itself must match; matching image content alone is not sufficient.

Parameters:

candidate_image (ARCandidateImage) –

remove_candidate_image_at_index(index) None

Remove a candidate image from the ARSessionConfig, by index.

Parameters:

index (int32) –

set_candidate_object_list(candidate_objects) None

see: CandidateObjects

Parameters:

candidate_objects (Array[ARCandidateObject]) –

set_desired_video_format(new_format) None

see: DesiredVideoFormat

Parameters:

new_format (ARVideoFormat) –
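
The desired video format only takes effect if the device supports it; the desired_video_format property description above points at GetSupportedVideoFormats for the device-supported list. A sketch, assuming that function is exposed to Python as unreal.ARBlueprintLibrary.get_supported_video_formats and that ARVideoFormat carries width/height/fps properties:

    import unreal

    config = unreal.load_asset('/Game/AR/DefaultARConfig', unreal.ARSessionConfig)

    # Assumption: GetSupportedVideoFormats is exposed on ARBlueprintLibrary.
    formats = unreal.ARBlueprintLibrary.get_supported_video_formats(config.get_session_type())

    if len(formats) > 0:
        # Pick the highest-resolution supported format ('width'/'height' are assumed property names).
        best = max(formats, key=lambda f: f.get_editor_property('width') * f.get_editor_property('height'))
        config.set_desired_video_format(best)
        print(config.get_desired_video_format())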

set_enable_auto_focus(new_value) None

see: bEnableAutoFocus

Parameters:

new_value (bool) –

set_face_tracking_direction(direction) None

see: FaceTrackingDirection

Parameters:

direction (ARFaceTrackingDirection) –

set_face_tracking_update(update) None

see: FaceTrackingUpdate

Parameters:

update (ARFaceTrackingUpdate) –
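
For a face-tracking session, the tracking direction and update mode are usually set together. A sketch; the enum member spellings (FACE_MIRRORED, CURVES_AND_GEO) are assumptions about the Python names of EARFaceTrackingDirection and EARFaceTrackingUpdate:

    import unreal

    # Hypothetical config asset used for a Face session.
    config = unreal.load_asset('/Game/AR/FaceARConfig', unreal.ARSessionConfig)

    # Mirror the face transform, as for a front-camera preview (enum member name assumed).
    config.set_face_tracking_direction(unreal.ARFaceTrackingDirection.FACE_MIRRORED)

    # Track both blend-shape curves and face geometry (enum member name assumed).
    config.set_face_tracking_update(unreal.ARFaceTrackingUpdate.CURVES_AND_GEO)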

set_reset_camera_tracking(new_value) None

see: bResetCameraTracking

Parameters:

new_value (bool) –

set_reset_tracked_objects(new_value) None

see: bResetTrackedObjects

Parameters:

new_value (bool) –

set_scene_reconstruction_method(scene_reconstruction_method) None

see: SceneReconstructionMethod

Parameters:

scene_reconstruction_method (ARSceneReconstruction) –

set_session_tracking_feature_to_enable(session_tracking_feature) None

see: EnabledSessionTrackingFeatures

Parameters:

session_tracking_feature (ARSessionTrackingFeature) –
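
A sketch of enabling mesh reconstruction together with the scene-depth tracking feature and depth-based occlusion; the enum member spellings (MESH_WITH_CLASSIFICATION, SCENE_DEPTH) are assumptions about the Python names of EARSceneReconstruction and EARSessionTrackingFeature:

    import unreal

    config = unreal.load_asset('/Game/AR/DefaultARConfig', unreal.ARSessionConfig)

    # Request classified mesh reconstruction on devices that support it (enum member name assumed).
    config.set_scene_reconstruction_method(unreal.ARSceneReconstruction.MESH_WITH_CLASSIFICATION)

    # Enable the scene-depth tracking feature (enum member name assumed).
    config.set_session_tracking_feature_to_enable(unreal.ARSessionTrackingFeature.SCENE_DEPTH)

    # Pair it with depth-based occlusion from the property list above.
    config.set_editor_property('use_scene_depth_for_occlusion', True)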

set_world_map_data(world_map_data) None

see: WorldMapData

Parameters:

world_map_data (Array[uint8]) –
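
world_map_data is read-only as an editor property, but get_world_map_data/set_world_map_data allow a previously captured ARKit world map to be copied from one config into another before the next session starts. A minimal sketch, assuming /Game/AR/SavedWorldConfig is a hypothetical asset that already holds captured world-map bytes:

    import unreal

    # Config that already contains captured world-map bytes (hypothetical asset).
    saved = unreal.load_asset('/Game/AR/SavedWorldConfig', unreal.ARSessionConfig)

    # Config used for the next session.
    config = unreal.load_asset('/Game/AR/DefaultARConfig', unreal.ARSessionConfig)

    world_map = saved.get_world_map_data()    # Array[uint8]
    if len(world_map) > 0:
        config.set_world_map_data(world_map)  # relocalize against the saved world on session start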

should_enable_auto_focus() bool

see: bEnableAutoFocus

Return type:

bool

should_enable_camera_tracking() bool

see: bEnableAutomaticCameraTracking

Return type:

bool

should_render_camera_overlay() bool

see: bEnableAutomaticCameraOverlay

Return type:

bool

should_reset_camera_tracking() bool

see: bResetCameraTracking

Return type:

bool

should_reset_tracked_objects() bool

see: bResetTrackedObjects

Return type:

bool
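
The should_* accessors mirror the boolean properties above and are convenient for inspecting whichever config the running session is using. A short sketch, assuming the active config is exposed to Python as unreal.ARBlueprintLibrary.get_session_config:

    import unreal

    # Assumption: the active session's config is available via ARBlueprintLibrary.get_session_config().
    config = unreal.ARBlueprintLibrary.get_session_config()

    print('auto focus:         ', config.should_enable_auto_focus())
    print('camera overlay:     ', config.should_render_camera_overlay())
    print('camera tracking:    ', config.should_enable_camera_tracking())
    print('reset camera track: ', config.should_reset_camera_tracking())
    print('reset objects:      ', config.should_reset_tracked_objects())
    print('session type:       ', config.get_session_type())
    print('max tracked images: ', config.get_max_num_simultaneous_images_tracked())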