unreal.ARSessionConfig

class unreal.ARSessionConfig(outer=None, name='None')

Bases: unreal.DataAsset

AR Session Config

C++ Source:

- Module: AugmentedReality
- File: ARSessionConfig.h
Editor Properties: (see get_editor_property/set_editor_property)

- candidate_images (Array(ARCandidateImage)): [Read-Write] The list of candidate images to detect within the AR camera view
- candidate_objects (Array(ARCandidateObject)): [Read-Write] A list of candidate objects to search for in the scene
- default_mesh_material (MaterialInterface): [Read-Write] The default mesh material used by the generated mesh component
- default_wireframe_mesh_material (MaterialInterface): [Read-Write] The default mesh material used by the wireframe setting of the generated mesh component. Note: it is recommended to ignore this 'wireframe' feature and use a wireframe material in the DefaultMeshMaterial if you want wireframe.
- desired_video_format (ARVideoFormat): [Read-Write] The desired video format (or the default if not supported) that this session should use if the camera is enabled. Note: call GetSupportedVideoFormats to get a list of device-supported formats
- enable_auto_focus (bool): [Read-Write] Whether the camera should use autofocus or not (can cause subtle shifts in position for small objects at macro camera distance)
- enable_automatic_camera_overlay (bool): [Read-Write] Whether the AR camera feed should be drawn as an overlay or not. Defaults to true.
- enable_automatic_camera_tracking (bool): [Read-Write] Whether the game camera should track the device movement or not. Defaults to true.
- enabled_session_tracking_feature (ARSessionTrackingFeature): [Read-Write] A list of session features to enable
- environment_capture_probe_type (AREnvironmentCaptureProbeType): [Read-Write] How the AR system should handle texture probe capturing
- environment_probe_component_class (type(Class)): [Read-Write] Environment Probe Component Class
- face_component_class (type(Class)): [Read-Write] Face Component Class
- face_tracking_direction (ARFaceTrackingDirection): [Read-Write] Whether to track the face as if you are looking out of the device or as a mirror
- face_tracking_update (ARFaceTrackingUpdate): [Read-Write] How face tracking data should be updated (see ARFaceTrackingUpdate)
- frame_sync_mode (ARFrameSyncMode): [Read-Write] How game updates are synchronized with camera frames (see ARFrameSyncMode)
- generate_collision_for_mesh_data (bool): [Read-Write] Whether the AR system should generate collision data from the mesh data or not
- generate_mesh_data_from_tracked_geometry (bool): [Read-Write] Whether the AR system should generate mesh data that can be rendered, collided against, nav mesh generated on, etc.
- generate_nav_mesh_for_mesh_data (bool): [Read-Write] Whether the AR system should generate navigation mesh data from the mesh data or not
- geo_anchor_component_class (type(Class)): [Read-Write] Geo Anchor Component Class
- horizontal_plane_detection (bool): [Read-Write] Should we detect flat horizontal surfaces: e.g. table tops, window sills
- image_component_class (type(Class)): [Read-Write] Image Component Class
- light_estimation_mode (ARLightEstimationMode): [Read-Write] The light estimation mode to use (see ARLightEstimationMode)
- max_num_simultaneous_images_tracked (int32): [Read-Write] The maximum number of images to track at the same time. Defaults to 1
- max_number_of_tracked_faces (int32): [Read-Write] The maximum number of faces to track simultaneously.
- mesh_component_class (type(Class)): [Read-Write] Mesh Component Class
- object_component_class (type(Class)): [Read-Write] Object Component Class
- plane_component_class (type(Class)): [Read-Write] Class binding to facilitate networking
- point_component_class (type(Class)): [Read-Write] Point Component Class
- pose_component_class (type(Class)): [Read-Write] Pose Component Class
- qr_code_component_class (type(Class)): [Read-Write] QRCode Component Class
- render_mesh_data_in_wireframe (bool): [Read-Write] Whether the AR system should render the mesh data in wireframe or not
- reset_camera_tracking (bool): [Read-Write] Whether the AR system should reset camera tracking (origin, transform) or not. Defaults to true.
- reset_tracked_objects (bool): [Read-Write] Whether the AR system should remove any tracked objects or not. Defaults to true.
- scene_reconstruction_method (ARSceneReconstruction): [Read-Write] Which scene reconstruction method to use
- session_type (ARSessionType): [Read-Write] The type of AR session to run (see ARSessionType)
- track_scene_objects (bool): [Read-Write] Whether the AR system should report scene objects (EARObjectClassification::SceneObject)
- use_automatic_image_scale_estimation (bool): [Read-Write] Whether to automatically estimate and set the scale of a detected or tracked image.
- use_mesh_data_for_occlusion (bool): [Read-Write] Whether the AR system should render the mesh data as occlusion meshes or not
- use_optimal_video_format (bool): [Read-Write] Whether to automatically pick the video format that best matches the device screen size
- use_person_segmentation_for_occlusion (bool): [Read-Write] Whether to occlude the virtual content with the result from person segmentation
- use_scene_depth_for_occlusion (bool): [Read-Write] Whether to occlude the virtual content with the scene depth information
- use_standard_onboarding_ux (bool): [Read-Write] Whether to use the standard onboarding UX, if the system supports it.
- vertical_plane_detection (bool): [Read-Write] Should we detect flat vertical surfaces: e.g. paintings, monitors, book cases
- world_alignment (ARWorldAlignment): [Read-Write] How the AR world is aligned with the real world (see ARWorldAlignment)
- world_map_data (Array(uint8)): [Read-Only] A previously saved world that is to be loaded when the session starts
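Since these are editor properties, they can be read and written from editor Python via get_editor_property/set_editor_property. A minimal sketch, assuming it runs inside the Unreal Editor's Python environment; the asset path used here is hypothetical:

```python
import unreal

# Hypothetical asset path; adjust to where your ARSessionConfig asset lives.
config = unreal.load_asset('/Game/AR/MyARSessionConfig')

# Enable horizontal plane detection and allow two images to be tracked at once.
config.set_editor_property('horizontal_plane_detection', True)
config.set_editor_property('max_num_simultaneous_images_tracked', 2)

# Read a property back.
print(config.get_editor_property('enable_auto_focus'))
```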
add_candidate_image(new_candidate_image) → None

Add a new CandidateImage to the ARSessionConfig.

- Parameters: new_candidate_image (ARCandidateImage)
add_candidate_object(candidate_object) → None

Add a new candidate object to the CandidateObjects list.

- Parameters: candidate_object (ARCandidateObject)
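A short sketch of registering detection targets with the two add methods above (assumes editor Python; the asset paths are hypothetical placeholders):

```python
import unreal

# Hypothetical asset paths for the session config and candidate assets.
config = unreal.load_asset('/Game/AR/MyARSessionConfig')
poster = unreal.load_asset('/Game/AR/PosterCandidateImage')   # ARCandidateImage
toy = unreal.load_asset('/Game/AR/ToyCandidateObject')        # ARCandidateObject

# Register the targets so the AR session can detect them.
config.add_candidate_image(poster)
config.add_candidate_object(toy)
```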
get_candidate_image_list() → Array(ARCandidateImage)

Get the list of candidate images (CandidateImages).

- Return type: Array(ARCandidateImage)
get_candidate_object_list() → Array(ARCandidateObject)

Get the list of candidate objects (CandidateObjects).

- Return type: Array(ARCandidateObject)
get_desired_video_format() → ARVideoFormat

Get the desired video format (DesiredVideoFormat).

- Return type: ARVideoFormat

get_enabled_session_tracking_feature() → ARSessionTrackingFeature

Get the enabled session tracking features (EnabledSessionTrackingFeatures).

- Return type: ARSessionTrackingFeature

get_environment_capture_probe_type() → AREnvironmentCaptureProbeType

Get the environment capture probe type (EnvironmentCaptureProbeType).

- Return type: AREnvironmentCaptureProbeType

get_face_tracking_direction() → ARFaceTrackingDirection

Get the face tracking direction (FaceTrackingDirection).

- Return type: ARFaceTrackingDirection

get_face_tracking_update() → ARFaceTrackingUpdate

Get the face tracking update setting (FaceTrackingUpdate).

- Return type: ARFaceTrackingUpdate

get_frame_sync_mode() → ARFrameSyncMode

Get the frame sync mode (FrameSyncMode).

- Return type: ARFrameSyncMode

get_light_estimation_mode() → ARLightEstimationMode

Get the light estimation mode (LightEstimationMode).

- Return type: ARLightEstimationMode

get_max_num_simultaneous_images_tracked() → int32

Get the maximum number of images tracked at the same time (MaxNumSimultaneousImagesTracked).

- Return type: int32

get_plane_detection_mode() → ARPlaneDetectionMode

Get the plane detection mode (PlaneDetectionMode).

- Return type: ARPlaneDetectionMode

get_scene_reconstruction_method() → ARSceneReconstruction

Get the scene reconstruction method (SceneReconstructionMethod).

- Return type: ARSceneReconstruction

get_session_type() → ARSessionType

Get the session type (SessionType).

- Return type: ARSessionType

get_world_alignment() → ARWorldAlignment

Get the world alignment (WorldAlignment).

- Return type: ARWorldAlignment
set_candidate_object_list(candidate_objects) → None

Set the list of candidate objects (CandidateObjects).

- Parameters: candidate_objects (Array(ARCandidateObject))

set_desired_video_format(new_format) → None

Set the desired video format (DesiredVideoFormat).

- Parameters: new_format (ARVideoFormat)

set_face_tracking_direction(direction) → None

Set the face tracking direction (FaceTrackingDirection).

- Parameters: direction (ARFaceTrackingDirection)

set_face_tracking_update(update) → None

Set the face tracking update setting (FaceTrackingUpdate).

- Parameters: update (ARFaceTrackingUpdate)

set_scene_reconstruction_method(scene_reconstruction_method) → None

Set the scene reconstruction method (SceneReconstructionMethod).

- Parameters: scene_reconstruction_method (ARSceneReconstruction)

set_session_tracking_feature_to_enable(session_tracking_feature) → None

Set a session tracking feature to enable (EnabledSessionTrackingFeatures).

- Parameters: session_tracking_feature (ARSessionTrackingFeature)
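A sketch combining the typed getters and setters (editor Python assumed; the asset path and the enum value names used here are assumptions, so check them against the enums in your engine version):

```python
import unreal

# Hypothetical asset path.
config = unreal.load_asset('/Game/AR/MyARSessionConfig')

# Inspect current settings via the getters.
print(config.get_session_type())
print(config.get_light_estimation_mode())

# Adjust face tracking to behave like a mirror; the enum value name
# FACE_MIRRORED is an assumption based on EARFaceTrackingDirection.
config.set_face_tracking_direction(unreal.ARFaceTrackingDirection.FACE_MIRRORED)
```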