
Interact with the Real World: OpenXR Scene Understanding

OpenXR Scene Understanding Plugin Setup

Supported Unreal Engine version: 4.27+

  • Enable Plugins:
    • Enable the plugin under Edit > Plugins > Virtual Reality:
image1.png
  • Disable Plugins:
    • The "Steam VR" plugin must be disabled for OpenXR to work.
    • Disable the plugin under Edit > Plugins > Virtual Reality:
image2.png
  • Project Settings:
    • Make sure the “OpenXR Scene Understanding extension” is enabled. The setting is under Edit > Project Settings > Plugins > Vive OpenXR > Scene Understanding:
image3.png

Introduction to Blueprint Nodes for OpenXR Scene Understanding

  • Limit the range to be scanned
    • These bounding volumes determine which scene components are included in the resulting scene.
      • Set Scene Compute Oriented Box Bound: Sets a box-shaped bound.
      • Set Scene Compute Sphere Bound: Sets a sphere-shaped bound.
      • Set Scene Compute Frustum Bound: Sets a frustum-shaped bound.
      • Clear Scene Compute Bounds: Before the game ends, clear all bounds by calling Clear Scene Compute Bounds.

image4.png

  • Control the update speed of the scene
    • This setting trades off update speed against the quality of the resulting scene.
    • Set Scene Compute Consistency
image5.png
  • Set the level of detail of meshes
    • This setting determines the level of detail of the computed visual mesh.
    • Set Mesh Compute Lod
image6.png

Show the Scene Understanding scanned mesh in the game

  • Step 1. Essential Setup
    • Create an AR Session:
      • The AR-related setup is required because OpenXR Scene Understanding uses OpenXRARTrackedGeometry to make the scanned meshes appear in the level. See Unreal's "Setting up an AR project" tutorial.
    • Create a Data Asset: Content Browser > All > Content > right-click > Miscellaneous > Data Asset

image7.png

    • After choosing Data Asset, the “Pick Class For Data Asset Instance” window pops up:
      • Choose “ARSessionConfig” and press “Select”.
image8.png
    • Open the ARSessionConfig and enable the following settings under Details Panel > AR Settings > World Mapping:
      • Enable Generate Mesh Data from Tracked Geometry
      • Enable Generate Collision for Mesh Data
image9.png

  • Open the Level Blueprint and add Start AR Session and Stop AR Session.
    • Remember to pass the “ARSessionConfig” as input to Start AR Session.
image11.png
  • Create a Blueprint for Scene Understanding and drag into the level.
image12.png

image13.png

Step 2. BP_SceneUnderstanding: Display the scanned mesh in the level
  • In this tutorial we choose Actor as the parent class.
  • In the Components panel:
    • Add a Cube Static Mesh Component for Set Scene Compute Oriented Box Bound.
image14.png

  • Add an ARTrackableNotify Component in the Components panel.


image16.png

    • Add a new variable named “spatialMeshes” with type “ARTrackedGeometry Array” in My Blueprint.
image17.png image18.png

  • Add a function named “StartSpatialMapping” in My Blueprint. In this function, call “Toggle ARCapture” with the “On Off” bool set to true and the “Capture Type” set to Spatial Mapping.
image18.png
image20.png
  • Add a function named “StopSpatialMapping” in My Blueprint. In this function, call “Toggle ARCapture” with the “On Off” bool set to false and the “Capture Type” set to Spatial Mapping.
image20.png image21.png

    • Event BeginPlay: Add the functions that control the quality and area of the scanned mesh.
image22.png
    • On Add/Remove Tracked Geometry (ARTrackableNotify):
      • These two events maintain the set of meshes currently scanned.

image23.png

  • Event Tick: You can get the current Spatial Meshes and Set Material on them.
image24.png
    • Event EndPlay: Call the Stop Spatial Mapping function and clear the bounds you set at the beginning.
image25.png


Result

1.png
2.png

Note:

  • White Area: meshes representing real-environment objects.
  • Black Area: no objects in that space.

Spawn balls and interact with generated meshes

This sample is currently supported only on Unreal Engine 5.0+, because the physics system used by the generated meshes is Chaos Physics.

  • Create a Blueprint
    • Create a Blueprint for the “Spawn ball” and drag it into the level.
    • You can place as many as you want in the level.
image28.png image29.png

  • BP_SpawnBall: Use a sphere static mesh to simulate a physical fall.
    • In the Components panel: add a Sphere Static Mesh Component to represent the ball.
image30.png

  • In the Details panel: enable Simulate Physics and make sure Collision Presets is set to “PhysicsActor”.
image31.png
  • In My Blueprint panel:
    • Add a variable to hold the origin location of the ball.
image32.png
    • In Event Graph panel:
      • Set the origin location of the ball when the game starts.
image33.png
      • Track the height of the ball each frame, and reset its position if it falls below -600.
image34.png

Result

3.png

Note:

  • Red Balls: virtual objects.
  • White Area: scanned meshes are present.
  • Black Area: no objects in the space.