#include "UObject/ObjectMacros.h"
#include "UObject/Object.h"
#include "ARTrackable.h"
#include "ARComponent.h"
#include "Engine/DataAsset.h"
#include "ARSessionConfig.generated.h"
| Type | Enumeration |
|---|---|
| enum class | `EARWorldAlignment : uint8 { Gravity, GravityAndHeading, Camera }` |
| enum class | `EARSessionType : uint8 { None, Orientation, World, Face, Image, ObjectScanning, PoseTracking, GeoTracking }` |
| enum class | `EARPlaneDetectionMode : uint8 { None = 0, HorizontalPlaneDetection = 1, VerticalPlaneDetection = 2 }` |
| enum class | `EARLightEstimationMode : uint8 { None = 0, AmbientLightEstimate = 1, DirectionalLightEstimate = 2 }` |
| enum class | `EARFrameSyncMode : uint8 { SyncTickWithCameraImage = 0, SyncTickWithoutCameraImage = 1 }` |
| enum class | `EAREnvironmentCaptureProbeType : uint8 { None, Manual, Automatic }` |
| enum class | `EARFaceTrackingUpdate : uint8 { CurvesAndGeo, CurvesOnly }` |
| enum class | `EARSessionTrackingFeature : uint8 { None, PoseDetection2D, PersonSegmentation, PersonSegmentationWithDepth, SceneDepth, SmoothedSceneDepth }` |
| enum class | `EARSceneReconstruction : uint8 { None, MeshOnly, MeshWithClassification }` |
◆ UE_API
Module export/import macro decorating the types declared in this header (standard Unreal Engine module API convention).
◆ EAREnvironmentCaptureProbeType
Options for how environment textures are generated. This feature is used by ARKit.
| Enumerator | Description |
|---|---|
| None | Disables environment texture generation. |
| Manual | The app specifies where the environment capture probes are located and their size. |
| Automatic | The AR system automatically creates and places the environment capture probes. |
◆ EARFaceTrackingUpdate
Options for the kind of data to update during Face Tracking. This feature is used by ARKit.
| Enumerator | Description |
|---|---|
| CurvesAndGeo | Both curves and geometry are updated. This is useful for mesh visualization. |
| CurvesOnly | Only the curve data is updated. |
◆ EARFrameSyncMode
Options for how the Unreal frame synchronizes with camera image updates. This feature is used by ARCore.
| Enumerator | Description |
|---|---|
| SyncTickWithCameraImage | The Unreal tick is synced with the camera image update rate. |
| SyncTickWithoutCameraImage | The Unreal tick is not tied to the camera image update rate. |
◆ EARLightEstimationMode
Options for how light is estimated based on the camera capture. This feature is used by ARCore and ARKit.
| Enumerator | Description |
|---|---|
| None | Disables light estimation. |
| AmbientLightEstimate | Estimates an ambient light. |
| DirectionalLightEstimate | Estimates a directional light of the environment with an additional key light. Currently not supported. |
◆ EARPlaneDetectionMode
Options for how flat surfaces are detected. This feature is used by ARCore and ARKit.
| Enumerator | Description |
|---|---|
| None | Disables plane detection. |
| HorizontalPlaneDetection | Detects horizontal, flat surfaces. |
| VerticalPlaneDetection | Detects vertical surfaces. |
◆ EARSceneReconstruction
Options for how the scene should be reconstructed. This feature is used by ARKit.
| Enumerator | Description |
|---|---|
| None | Disables scene reconstruction. |
| MeshOnly | Creates a mesh approximation of the environment. |
| MeshWithClassification | Creates a mesh approximation of the environment and classifies the constructed objects. |
◆ EARSessionTrackingFeature
Options for additional tracking features to enable for the session, beyond what is already implied by the session's EARSessionType.
| Enumerator | Description |
|---|---|
| None | No additional features are enabled. |
| PoseDetection2D | Adds tracking for 2D human poses to the session. This feature is used by ARKit. |
| PersonSegmentation | Uses person segmentation for occlusion in the session. This feature is used by ARKit. |
| PersonSegmentationWithDepth | Uses person segmentation with depth information for occlusion in the session. This feature is used by ARKit. |
| SceneDepth | Uses scene depth for occlusion while tracking in the session. This feature is used by ARCore and ARKit. |
| SmoothedSceneDepth | Uses smoothed scene depth for occlusion while tracking in the session. This feature is used by ARKit. |
◆ EARSessionType
Options for the tracking type of the session. All AR platforms use this enumeration, but each platform supports only some of the session types. The options are mutually exclusive.
| Enumerator | Description |
|---|---|
| None | No tracking in the session. |
| Orientation | A session where only the orientation of the device is tracked. ARKit supports this type of tracking. |
| World | A session where the position and orientation of the device is tracked relative to objects in the environment. All platforms support this type of tracking. |
| Face | A session where only faces are tracked. ARKit and ARCore support this type of tracking using the front-facing camera. |
| Image | A session where only images supplied by the app are tracked. There is no world tracking. ARKit supports this type of tracking. |
| ObjectScanning | A session where objects are scanned for object detection in a later World Tracking session. ARKit supports this type of tracking. |
| PoseTracking | A session where human poses in 3D are tracked. ARKit supports this type of tracking using the rear-facing camera. |
| GeoTracking | A session where geographic locations are tracked. ARKit supports this type of tracking. |
◆ EARWorldAlignment
Options for how the scene’s coordinate system is constructed. This feature is used by ARKit.
| Enumerator | Description |
|---|---|
| Gravity | The coordinate system is aligned with gravity, defined by the vector (0, -1, 0). Origin is the initial position of the device. |
| GravityAndHeading | The coordinate system is aligned with gravity, defined by the vector (0, -1, 0), and compass heading based on True North, defined by the vector (0, 0, -1). Origin is the initial position of the device. |
| Camera | The coordinate system matches the camera's orientation. This option is recommended for Face AR. |
◆ ENUM_CLASS_FLAGS()
Unreal Engine macro that declares the bitwise operators (`|`, `&`, `^`, `~` and their compound-assignment forms) for an enum class so its values can be combined and tested as flags; in this header it is applied to EARPlaneDetectionMode, whose enumerators are assigned power-of-two values.