
Integrate Facial Tracking with Your Avatar

Overview

1. OpenXR Facial Tracking Plugin Setup
   1.1 Enable OpenXR Plugins
   1.2 Enable Facial Tracking extensions
2. Game Objects Settings
   2.1 Avatar Object Settings
3. Scripts for Your Avatar
   3.1 Add enum types for OpenXR blend shapes and your avatar blend shapes
   3.2 Add scripts for eye tracking
       3.2.1 Add scripts to Start function
       3.2.2 Add scripts to Update function
       3.2.3 Add scripts to OnDestroy function
       3.2.4 Eye tracking result
   3.3 Add scripts for lip tracking
       3.3.1 Add scripts to Start function
       3.3.2 Add scripts to Update function
       3.3.3 Add scripts to OnDestroy function
       3.3.4 Lip tracking result
4. Tips

1. OpenXR Facial Tracking Plugin Setup

Supported Unity Engine version: 2020.2+

※ Note: Before you start, please install or update to the latest VIVE Software from SteamVR or OOBE, and check your SR_Runtime.

1.1 Enable OpenXR Plugins

Enable the OpenXR plugin under Edit > Project Settings > XR Plug-in Management:
image1.png

Click the exclamation mark next to “OpenXR”, then choose “Fix All”.
image2.png
image3.png

Add interaction profiles for your device (the following takes the Vive Controller as an example).

image4.png

1.2 Enable Facial Tracking extensions

image5.png

2. Game Objects Settings

2.1 Avatar Object Settings

Import your avatar (here we use the avatar provided by our FacialTracking sample as an example).

Assets > Samples > VIVE Wave OpenXR Plugin - Windows > {version} > FacialTracking Example > ViveSR > Models > version2 > Avatar_Shieh_V2.fbx

image6.png

Note: It is recommended to negate the avatar's z scale value so that the avatar's left and right are consistent with the user's.

image7.png image8.png
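If you prefer to set this from a script rather than the Inspector, the equivalent one-liner is sketched below (avatarRoot is an illustrative reference to the avatar's root transform, assuming its scale is otherwise (1, 1, 1)):

// Mirror the avatar on z so its left/right match the user's ('avatarRoot' is assumed).
avatarRoot.localScale = new Vector3(1f, 1f, -1f);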

3. Scripts for Your Avatar

3.1 Add enum types for OpenXR blend shapes and your avatar blend shapes

Create a new script that declares the corresponding enum types as follows.

Your avatar blend shapes.


image9.png

image10.png image11.png
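The screenshots above come from the FacialTracking sample. For readability, here is a minimal sketch of the avatar-side enum, abbreviated to the eye shapes (the sample's full enum also covers the lip shapes). The member order is an assumption based on the sample avatar; what matters is that each member's numeric value equals the index of the corresponding blend shape on your avatar's SkinnedMeshRenderer, because the scripts below cast the enum to int when calling SetBlendShapeWeight.

public enum SkinnedMeshRendererShape
{
    // Each value must match the blend shape index on your avatar's mesh.
    Eye_Left_Blink = 0,
    Eye_Left_Wide,
    Eye_Left_Right,
    Eye_Left_Left,
    Eye_Left_Up,
    Eye_Left_Down,
    Eye_Right_Blink,
    Eye_Right_Wide,
    Eye_Right_Right,
    Eye_Right_Left,
    Eye_Right_Up,
    Eye_Right_Down,
    Eye_Left_Squeeze,
    Eye_Right_Squeeze,
    // ... lip shapes (Jaw_Right, Jaw_Left, ..., Tongue_DownLeft_Morph) follow.
}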

3.2 Add scripts for eye tracking

Create a new script for eye tracking and add the following namespaces to your script.

using VIVE.FacialTracking;
using System;
using System.Runtime.InteropServices;
using System.Collections.Generic; // needed for Dictionary (part of Unity's default script template)

Add the following properties:

//Map OpenXR eye shapes to avatar blend shapes
private static Dictionary<XrEyeShapeHTC, SkinnedMeshRendererShape> ShapeMap;
public SkinnedMeshRenderer HeadskinnedMeshRenderer;
private FacialManager facialmanager = new FacialManager(); // instantiate, as in the lip script below
private Dictionary<XrEyeShapeHTC, float> EyeWeightings = new Dictionary<XrEyeShapeHTC, float>();

Attach the script to your avatar and assign its public fields (e.g. the head's SkinnedMeshRenderer) in the Inspector.

image11.png image12.png

3.2.1 Add scripts to Start function

Step 1: Set up the mapping between OpenXR eye shapes and avatar blend shapes.

ShapeMap = new Dictionary<XrEyeShapeHTC, SkinnedMeshRendererShape>();
ShapeMap.Add(XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_BLINK_HTC, SkinnedMeshRendererShape.Eye_Left_Blink);
ShapeMap.Add(XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_WIDE_HTC, SkinnedMeshRendererShape.Eye_Left_Wide);
ShapeMap.Add(XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_BLINK_HTC, SkinnedMeshRendererShape.Eye_Right_Blink);
ShapeMap.Add(XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_WIDE_HTC, SkinnedMeshRendererShape.Eye_Right_Wide);
ShapeMap.Add(XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_SQUEEZE_HTC, SkinnedMeshRendererShape.Eye_Left_Squeeze);
ShapeMap.Add(XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_SQUEEZE_HTC, SkinnedMeshRendererShape.Eye_Right_Squeeze);
ShapeMap.Add(XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_DOWN_HTC, SkinnedMeshRendererShape.Eye_Left_Down);
ShapeMap.Add(XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_DOWN_HTC, SkinnedMeshRendererShape.Eye_Right_Down);
// "Out" for the left eye means gazing left and "in" means gazing right; mirrored for the right eye.
ShapeMap.Add(XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_OUT_HTC, SkinnedMeshRendererShape.Eye_Left_Left);
ShapeMap.Add(XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_IN_HTC, SkinnedMeshRendererShape.Eye_Right_Left);
ShapeMap.Add(XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_IN_HTC, SkinnedMeshRendererShape.Eye_Left_Right);
ShapeMap.Add(XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_OUT_HTC, SkinnedMeshRendererShape.Eye_Right_Right);
ShapeMap.Add(XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_UP_HTC, SkinnedMeshRendererShape.Eye_Left_Up);
ShapeMap.Add(XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_UP_HTC, SkinnedMeshRendererShape.Eye_Right_Up);

Step 2: Start eye tracking detection:

facialmanager.StartFramework(XrFacialTrackingTypeHTC.XR_FACIAL_TRACKING_TYPE_EYE_DEFAULT_HTC);


3.2.2 Add scripts to Update function

Step 1: Get the eye tracking detection results:

facialmanager.GetWeightings(out EyeWeightings);

Step 2: Update the avatar's blend shapes. The weightings are normalized to [0, 1], so they are multiplied by 100 to match Unity's 0–100 blend shape weight range:

for (XrEyeShapeHTC i = XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_BLINK_HTC; i < XrEyeShapeHTC.XR_EYE_EXPRESSION_MAX_ENUM_HTC; i++)
{
    HeadskinnedMeshRenderer.SetBlendShapeWeight((int)ShapeMap[i], EyeWeightings[i] * 100f);
}
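The loop above assumes both dictionaries contain an entry for every eye shape. As a defensive variant (not part of the sample), TryGetValue skips any shape that has no weighting or no mapping yet:

for (XrEyeShapeHTC i = XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_BLINK_HTC; i < XrEyeShapeHTC.XR_EYE_EXPRESSION_MAX_ENUM_HTC; i++)
{
    // Skip shapes that have no weighting or no avatar mapping.
    if (EyeWeightings.TryGetValue(i, out float weight) && ShapeMap.TryGetValue(i, out SkinnedMeshRendererShape shape))
    {
        HeadskinnedMeshRenderer.SetBlendShapeWeight((int)shape, weight * 100f);
    }
}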


3.2.3 Add scripts to OnDestroy function

Stop eye tracking detection:

facialmanager.StopFramework(XrFacialTrackingTypeHTC.XR_FACIAL_TRACKING_TYPE_EYE_DEFAULT_HTC);

3.2.4 Eye tracking result:

image13.png
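For reference, here is a minimal sketch assembling the snippets from 3.2 through 3.2.3 into a single MonoBehaviour. The class name EyeTrackingSample is illustrative; everything else is the code shown above:

using System.Collections.Generic;
using UnityEngine;
using VIVE.FacialTracking;

public class EyeTrackingSample : MonoBehaviour
{
    // Map OpenXR eye shapes to avatar blend shapes (see 3.2.1).
    private static Dictionary<XrEyeShapeHTC, SkinnedMeshRendererShape> ShapeMap;
    public SkinnedMeshRenderer HeadskinnedMeshRenderer;
    private FacialManager facialmanager = new FacialManager();
    private Dictionary<XrEyeShapeHTC, float> EyeWeightings = new Dictionary<XrEyeShapeHTC, float>();

    void Start()
    {
        ShapeMap = new Dictionary<XrEyeShapeHTC, SkinnedMeshRendererShape>
        {
            { XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_BLINK_HTC, SkinnedMeshRendererShape.Eye_Left_Blink },
            { XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_WIDE_HTC, SkinnedMeshRendererShape.Eye_Left_Wide },
            { XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_BLINK_HTC, SkinnedMeshRendererShape.Eye_Right_Blink },
            { XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_WIDE_HTC, SkinnedMeshRendererShape.Eye_Right_Wide },
            { XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_SQUEEZE_HTC, SkinnedMeshRendererShape.Eye_Left_Squeeze },
            { XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_SQUEEZE_HTC, SkinnedMeshRendererShape.Eye_Right_Squeeze },
            { XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_DOWN_HTC, SkinnedMeshRendererShape.Eye_Left_Down },
            { XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_DOWN_HTC, SkinnedMeshRendererShape.Eye_Right_Down },
            { XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_OUT_HTC, SkinnedMeshRendererShape.Eye_Left_Left },
            { XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_IN_HTC, SkinnedMeshRendererShape.Eye_Right_Left },
            { XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_IN_HTC, SkinnedMeshRendererShape.Eye_Left_Right },
            { XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_OUT_HTC, SkinnedMeshRendererShape.Eye_Right_Right },
            { XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_UP_HTC, SkinnedMeshRendererShape.Eye_Left_Up },
            { XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_UP_HTC, SkinnedMeshRendererShape.Eye_Right_Up },
        };
        facialmanager.StartFramework(XrFacialTrackingTypeHTC.XR_FACIAL_TRACKING_TYPE_EYE_DEFAULT_HTC);
    }

    void Update()
    {
        // Fetch the latest weightings and apply them as 0-100 blend shape weights.
        facialmanager.GetWeightings(out EyeWeightings);
        for (XrEyeShapeHTC i = XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_BLINK_HTC; i < XrEyeShapeHTC.XR_EYE_EXPRESSION_MAX_ENUM_HTC; i++)
        {
            HeadskinnedMeshRenderer.SetBlendShapeWeight((int)ShapeMap[i], EyeWeightings[i] * 100f);
        }
    }

    void OnDestroy()
    {
        facialmanager.StopFramework(XrFacialTrackingTypeHTC.XR_FACIAL_TRACKING_TYPE_EYE_DEFAULT_HTC);
    }
}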


3.3 Add scripts for lip tracking

Create a new script for lip tracking and add the following namespaces to your script.

using VIVE.FacialTracking;
using System;
using System.Runtime.InteropServices;
using System.Collections.Generic; // needed for Dictionary (part of Unity's default script template)

Add the following properties:

private FacialManager facialmanager = new FacialManager();
private Dictionary<XrLipShapeHTC, float> LipWeightings = new Dictionary<XrLipShapeHTC, float>();
public SkinnedMeshRenderer HeadskinnedMeshRenderer;
//Map OpenXR lip shapes to avatar lip blend shapes
private static Dictionary<XrLipShapeHTC, SkinnedMeshRendererShape> ShapeMap;

Attach the script to your avatar and assign its public fields in the Inspector.

image11.png image14.png


3.3.1 Add scripts to Start function

Step 1: Set up the mapping between OpenXR lip shapes and avatar blend shapes.

ShapeMap = new Dictionary<XrLipShapeHTC, SkinnedMeshRendererShape>();
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_JAW_RIGHT_HTC, SkinnedMeshRendererShape.Jaw_Right);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_JAW_LEFT_HTC, SkinnedMeshRendererShape.Jaw_Left);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_JAW_FORWARD_HTC, SkinnedMeshRendererShape.Jaw_Forward);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_JAW_OPEN_HTC, SkinnedMeshRendererShape.Jaw_Open);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_APE_SHAPE_HTC, SkinnedMeshRendererShape.Mouth_Ape_Shape);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_UPPER_RIGHT_HTC, SkinnedMeshRendererShape.Mouth_Upper_Right);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_UPPER_LEFT_HTC, SkinnedMeshRendererShape.Mouth_Upper_Left);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_LOWER_RIGHT_HTC, SkinnedMeshRendererShape.Mouth_Lower_Right);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_LOWER_LEFT_HTC, SkinnedMeshRendererShape.Mouth_Lower_Left);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_UPPER_OVERTURN_HTC, SkinnedMeshRendererShape.Mouth_Upper_Overturn);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_LOWER_OVERTURN_HTC, SkinnedMeshRendererShape.Mouth_Lower_Overturn);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_POUT_HTC, SkinnedMeshRendererShape.Mouth_Pout);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_SMILE_RIGHT_HTC, SkinnedMeshRendererShape.Mouth_Smile_Right);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_SMILE_LEFT_HTC, SkinnedMeshRendererShape.Mouth_Smile_Left);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_SAD_RIGHT_HTC, SkinnedMeshRendererShape.Mouth_Sad_Right);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_SAD_LEFT_HTC, SkinnedMeshRendererShape.Mouth_Sad_Left);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_CHEEK_PUFF_RIGHT_HTC, SkinnedMeshRendererShape.Cheek_Puff_Right);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_CHEEK_PUFF_LEFT_HTC, SkinnedMeshRendererShape.Cheek_Puff_Left);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_CHEEK_SUCK_HTC, SkinnedMeshRendererShape.Cheek_Suck);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_UPPER_UPRIGHT_HTC, SkinnedMeshRendererShape.Mouth_Upper_UpRight);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_UPPER_UPLEFT_HTC, SkinnedMeshRendererShape.Mouth_Upper_UpLeft);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_LOWER_DOWNRIGHT_HTC, SkinnedMeshRendererShape.Mouth_Lower_DownRight);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_LOWER_DOWNLEFT_HTC, SkinnedMeshRendererShape.Mouth_Lower_DownLeft);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_UPPER_INSIDE_HTC, SkinnedMeshRendererShape.Mouth_Upper_Inside);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_LOWER_INSIDE_HTC, SkinnedMeshRendererShape.Mouth_Lower_Inside);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_MOUTH_LOWER_OVERLAY_HTC, SkinnedMeshRendererShape.Mouth_Lower_Overlay);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_TONGUE_LONGSTEP1_HTC, SkinnedMeshRendererShape.Tongue_LongStep1);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_TONGUE_LEFT_HTC, SkinnedMeshRendererShape.Tongue_Left);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_TONGUE_RIGHT_HTC, SkinnedMeshRendererShape.Tongue_Right);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_TONGUE_UP_HTC, SkinnedMeshRendererShape.Tongue_Up);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_TONGUE_DOWN_HTC, SkinnedMeshRendererShape.Tongue_Down);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_TONGUE_ROLL_HTC, SkinnedMeshRendererShape.Tongue_Roll);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_TONGUE_LONGSTEP2_HTC, SkinnedMeshRendererShape.Tongue_LongStep2);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_TONGUE_UPRIGHT_MORPH_HTC, SkinnedMeshRendererShape.Tongue_UpRight_Morph);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_TONGUE_UPLEFT_MORPH_HTC, SkinnedMeshRendererShape.Tongue_UpLeft_Morph);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_TONGUE_DOWNRIGHT_MORPH_HTC, SkinnedMeshRendererShape.Tongue_DownRight_Morph);
ShapeMap.Add(XrLipShapeHTC.XR_LIP_SHAPE_TONGUE_DOWNLEFT_MORPH_HTC, SkinnedMeshRendererShape.Tongue_DownLeft_Morph);

Step 2: Start lip tracking detection:

facialmanager.StartFramework(XrFacialTrackingTypeHTC.XR_FACIAL_TRACKING_TYPE_LIP_DEFAULT_HTC);


3.3.2 Add scripts to Update function

Step 1: Get the lip tracking detection results:

facialmanager.GetWeightings(out LipWeightings);

Step 2: Update the avatar's blend shapes, scaling the [0, 1] weightings to Unity's 0–100 range as before:

for (XrLipShapeHTC i = XrLipShapeHTC.XR_LIP_SHAPE_JAW_RIGHT_HTC; i < XrLipShapeHTC.XR_LIP_SHAPE_MAX_ENUM_HTC; i++)
{
    HeadskinnedMeshRenderer.SetBlendShapeWeight((int)ShapeMap[i], LipWeightings[i] * 100f);
}



3.3.3 Add scripts to OnDestroy function

Stop lip tracking detection:

facialmanager.StopFramework(XrFacialTrackingTypeHTC.XR_FACIAL_TRACKING_TYPE_LIP_DEFAULT_HTC);



3.3.4 Lip tracking result:

image16.png
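The complete lip script assembles in exactly the same way as the eye script sketch in 3.2.4; only the types and the shape map change. A condensed skeleton follows (only three of the 37 mappings are shown, so the update loop uses a TryGetValue guard; fill in the full list from 3.3.1; the class name is illustrative):

using System.Collections.Generic;
using UnityEngine;
using VIVE.FacialTracking;

public class LipTrackingSample : MonoBehaviour
{
    private static Dictionary<XrLipShapeHTC, SkinnedMeshRendererShape> ShapeMap;
    public SkinnedMeshRenderer HeadskinnedMeshRenderer;
    private FacialManager facialmanager = new FacialManager();
    private Dictionary<XrLipShapeHTC, float> LipWeightings = new Dictionary<XrLipShapeHTC, float>();

    void Start()
    {
        ShapeMap = new Dictionary<XrLipShapeHTC, SkinnedMeshRendererShape>
        {
            { XrLipShapeHTC.XR_LIP_SHAPE_JAW_RIGHT_HTC, SkinnedMeshRendererShape.Jaw_Right },
            { XrLipShapeHTC.XR_LIP_SHAPE_JAW_LEFT_HTC, SkinnedMeshRendererShape.Jaw_Left },
            { XrLipShapeHTC.XR_LIP_SHAPE_JAW_OPEN_HTC, SkinnedMeshRendererShape.Jaw_Open },
            // ... add the remaining mappings from 3.3.1 here ...
        };
        facialmanager.StartFramework(XrFacialTrackingTypeHTC.XR_FACIAL_TRACKING_TYPE_LIP_DEFAULT_HTC);
    }

    void Update()
    {
        facialmanager.GetWeightings(out LipWeightings);
        for (XrLipShapeHTC i = XrLipShapeHTC.XR_LIP_SHAPE_JAW_RIGHT_HTC; i < XrLipShapeHTC.XR_LIP_SHAPE_MAX_ENUM_HTC; i++)
        {
            // The guard lets the partial map above still run; with the full map it always succeeds.
            if (LipWeightings.TryGetValue(i, out float weight) && ShapeMap.TryGetValue(i, out SkinnedMeshRendererShape shape))
                HeadskinnedMeshRenderer.SetBlendShapeWeight((int)shape, weight * 100f);
        }
    }

    void OnDestroy()
    {
        facialmanager.StopFramework(XrFacialTrackingTypeHTC.XR_FACIAL_TRACKING_TYPE_LIP_DEFAULT_HTC);
    }
}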

4. Tips

Tip 1: You can run eye tracking and lip tracking on the avatar at the same time to achieve full facial tracking, as shown below.
image17.png

Tip 2: To make the avatar more vivid, you can roughly infer the position of the left pupil from the following blend shapes and rotate the eye accordingly. The same approach applies to the right pupil.

XR_EYE_EXPRESSION_LEFT_UP_HTC

XR_EYE_EXPRESSION_LEFT_DOWN_HTC

XR_EYE_EXPRESSION_LEFT_IN_HTC

XR_EYE_EXPRESSION_LEFT_OUT_HTC

For example, add the code below to your eye tracking script.

1. Add the following properties:
    public GameObject leftEye;
    public GameObject rightEye;
    private GameObject[] EyeAnchors;
    
2. Set the options in the Inspector.
image18.png image19.png
3. Add the following lines to the Start function to create anchors for the left and right eyes.
// Cache each eye's initial local pose; gaze targets will be computed relative to these anchors.
EyeAnchors = new GameObject[2];
EyeAnchors[0] = new GameObject();
EyeAnchors[0].name = "EyeAnchor_" + 0;
EyeAnchors[0].transform.SetParent(gameObject.transform);
EyeAnchors[0].transform.localPosition = leftEye.transform.localPosition;
EyeAnchors[0].transform.localRotation = leftEye.transform.localRotation;
EyeAnchors[0].transform.localScale = leftEye.transform.localScale;
EyeAnchors[1] = new GameObject();
EyeAnchors[1].name = "EyeAnchor_" + 1;
EyeAnchors[1].transform.SetParent(gameObject.transform);
EyeAnchors[1].transform.localPosition = rightEye.transform.localPosition;
EyeAnchors[1].transform.localRotation = rightEye.transform.localRotation;
EyeAnchors[1].transform.localScale = rightEye.transform.localScale;

4. Add the following lines to the Update function to calculate the gaze direction and update each eye's rotation.
Vector3 GazeDirectionCombinedLocal = Vector3.zero;
// Left eye: x from the in/out weightings (in = toward the nose, i.e. gazing right), y from up/down.
if (EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_IN_HTC] > EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_OUT_HTC])
{
    GazeDirectionCombinedLocal.x = -EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_IN_HTC];
}
else
{
    GazeDirectionCombinedLocal.x = EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_OUT_HTC];
}
if (EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_UP_HTC] > EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_DOWN_HTC])
{
    GazeDirectionCombinedLocal.y = EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_UP_HTC];
}
else
{
    GazeDirectionCombinedLocal.y = -EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_DOWN_HTC];
}
GazeDirectionCombinedLocal.z = -1.0f;
Vector3 target = EyeAnchors[0].transform.TransformPoint(GazeDirectionCombinedLocal);
leftEye.transform.LookAt(target);

// Right eye: mirror of the left eye on x (in = toward the nose, i.e. gazing left).
if (EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_IN_HTC] > EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_OUT_HTC])
{
    GazeDirectionCombinedLocal.x = EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_IN_HTC];
}
else
{
    GazeDirectionCombinedLocal.x = -EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_OUT_HTC];
}
if (EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_UP_HTC] > EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_DOWN_HTC])
{
    GazeDirectionCombinedLocal.y = EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_UP_HTC];
}
else
{
    GazeDirectionCombinedLocal.y = -EyeWeightings[XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_DOWN_HTC];
}
GazeDirectionCombinedLocal.z = -1.0f;
target = EyeAnchors[1].transform.TransformPoint(GazeDirectionCombinedLocal);
rightEye.transform.LookAt(target);
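The two blocks above differ only in the shape keys and the sign of the x component, so they can be folded into a small helper. This refactor is illustrative, not part of the sample:

// Drives one eye from four directional weightings; xSign flips the in/out axis
// between the eyes (-1 for the left eye, +1 for the right eye).
private void UpdateEyeGaze(GameObject eye, GameObject anchor,
                           XrEyeShapeHTC inShape, XrEyeShapeHTC outShape,
                           XrEyeShapeHTC upShape, XrEyeShapeHTC downShape,
                           float xSign)
{
    Vector3 gaze = Vector3.zero;
    gaze.x = EyeWeightings[inShape] > EyeWeightings[outShape]
        ? xSign * EyeWeightings[inShape]
        : -xSign * EyeWeightings[outShape];
    gaze.y = EyeWeightings[upShape] > EyeWeightings[downShape]
        ? EyeWeightings[upShape]
        : -EyeWeightings[downShape];
    gaze.z = -1.0f; // forward in the anchor's local space for the sample avatar
    eye.transform.LookAt(anchor.transform.TransformPoint(gaze));
}

// In Update, the two blocks then reduce to:
// UpdateEyeGaze(leftEye, EyeAnchors[0],
//     XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_IN_HTC, XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_OUT_HTC,
//     XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_UP_HTC, XrEyeShapeHTC.XR_EYE_EXPRESSION_LEFT_DOWN_HTC, -1f);
// UpdateEyeGaze(rightEye, EyeAnchors[1],
//     XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_IN_HTC, XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_OUT_HTC,
//     XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_UP_HTC, XrEyeShapeHTC.XR_EYE_EXPRESSION_RIGHT_DOWN_HTC, 1f);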
5. Results

image20.png