Stream face animation from your iPhone / iPad into Unreal Engine and bring your 3D models to life.
Apple's ARKit provides face tracking on iOS devices with Face ID support. 51 blendshape values (relative movement values between 0 and 1) can be read and used to animate your own 3D faces.
Reference: https://developer.apple.com/documentation/arkit/tracking_and_visualizing_faces
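The blendshape values are exposed by ARKit as a dictionary on ARFaceAnchor. The following Swift sketch is illustrative only (it is not this app's actual source) and shows how a couple of the 51 values can be read in an ARSessionDelegate:

```swift
import ARKit

// Minimal sketch of reading ARKit blendshape values from a face anchor.
// (Illustrative only; not the app's implementation.)
class FaceTracker: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let faceAnchor = anchor as? ARFaceAnchor else { continue }
            // blendShapes maps locations like .jawOpen to values in 0...1
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            let smileLeft = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            print("jawOpen: \(jawOpen), mouthSmileLeft: \(smileLeft)")
        }
    }
}
```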
Unreal Engine, in turn, provides a LiveLink interface that is already prepared to receive those ARKit blendshape values over the UDP network protocol, and it includes ready-to-use controls and Blueprint elements for integrating the values into your animation.
Reference: https://docs.unrealengine.com/en-US/Engine/Animation/LiveLinkPlugin/index.html
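As a rough illustration of the transport side, the blendshape values are pushed from the phone to the machine running Unreal Engine over UDP. The Swift sketch below only demonstrates the UDP transport; the real LiveLink packet layout is a binary format defined by the engine's Apple ARKit LiveLink source, and the payload, host, and port here are placeholder assumptions:

```swift
import Network

// Illustrative sketch only: sends a comma-separated list of blendshape values
// to the Unreal Engine machine over UDP. The actual LiveLink wire format is a
// binary packet defined by the engine; payload layout here is an assumption.
final class BlendshapeSender {
    private let connection: NWConnection

    init(host: String, port: UInt16) {
        connection = NWConnection(host: NWEndpoint.Host(host),
                                  port: NWEndpoint.Port(rawValue: port)!,
                                  using: .udp)
        connection.start(queue: .global())
    }

    func send(values: [Float]) {
        let payload = values.map { String($0) }.joined(separator: ",")
        connection.send(content: payload.data(using: .utf8),
                        completion: .contentProcessed { error in
                            if let error = error { print("UDP send failed: \(error)") }
                        })
    }
}

// Hypothetical usage; IP address and port are example values only.
let sender = BlendshapeSender(host: "192.168.0.10", port: 11111)
sender.send(values: Array(repeating: 0.0, count: 51))
```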
In version 1.0 only the 51 known face blendshapes are transmitted. The Unreal Engine LiveLink API for ARKit supports 9 additional blendshape values that express the relative transformation of the head. These blendshapes are not yet implemented and will be introduced in the next version.
Prerequisites:
- iOS device (iPhone / iPad) with Face ID support
- Unreal Engine 4.24 (LiveLink API version 5)
- Plugins: LiveLink, Apple ARKit, Apple ARKitFaceTracking
Usage:
- Launch Unreal Engine
- Open Window->LiveLink in Unreal Engine
- Launch LiveLink UE MoCap on your iOS device
- Open the settings and enter the IP address of the computer running Unreal Engine
- Enable the connect switch.
Now you should see your phone listed as a source in Unreal Engine's LiveLink window.