LLV enables you to record and play back Live Link frames sent by Epic Games' ARKit face capture iOS app (Live Link Face).
If you appreciate LLV and would like me to adapt it for tooling within your project pipeline, please reach out via llv(at)think-biq.com to discuss options. I'd love to add you as an official sponsor.
Check out the wiki for more information on creating issues.
Check out this video on how to set up LLV for use with the MetaHuman example project.
Listen for 256 incoming frames on all interfaces on the default port 11111 and write the recording to a file named dao.gesichter:
```bash
python llv.py record --frames 256 --output dao.gesichter
```
Play one of the example recordings and send it to a host machine at 10.0.0.69, using the implicit default port 11111 and 60 frames per second:
```bash
python llv.py play --host 10.0.0.69 examples/dao.gesichter
```
Recordings are stored as lines of base64-encoded frames. You can unpack a recording file to create a cleartext version, letting you inspect the frames as a JSON array. If you'd like to create your own frames by hand or by script, you can pack them for use with LLV.
```bash
python llv.py unpack examples/dao.gesichter dao.klare-gesichter
python llv.py pack dao.klare-gesichter dao.gesichter
```
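If you'd rather work with the raw bytes directly, here is a minimal sketch of decoding a packed recording in Python, assuming (as described above) that each non-empty line is one base64-encoded frame:

```python
import base64

# Decode each base64 line of a packed recording back into raw frame bytes.
with open('examples/dao.gesichter') as recording:
    frames = [base64.b64decode(line.strip()) for line in recording if line.strip()]

print(f'{len(frames)} frames, first frame is {len(frames[0])} bytes')
```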
The minimum and maximum packet sizes of a frame are defined in the engine code as:
```cpp
// PacketVersion FrameTime BlendShapeCount Blendshapes SubjectName DeviceID
const uint32 MAX_BLEND_SHAPE_PACKET_SIZE =
    sizeof(BLEND_SHAPE_PACKET_VER) + sizeof(FQualifiedFrameTime) + sizeof(uint8)
    + (sizeof(float) * (uint64)EARFaceBlendShape::MAX)
    + (sizeof(TCHAR) * 256) + (sizeof(TCHAR) * 256);
const uint32 MIN_BLEND_SHAPE_PACKET_SIZE =
    sizeof(BLEND_SHAPE_PACKET_VER) + sizeof(FQualifiedFrameTime) + sizeof(uint8)
    + (sizeof(float) * (uint64)EARFaceBlendShape::MAX)
    + sizeof(TCHAR) + sizeof(TCHAR);
```
This results in a minimum frame size of 264 bytes and a maximum of 774 bytes; both figures assume a 1-byte TCHAR on the wire.
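As a quick sanity check of those bounds, using the field sizes from the layout below and the 61 blendshapes mentioned at the end of this section:

```python
# Recompute the frame size bounds from the field sizes (1-byte TCHAR assumed).
PACKET_VERSION = 1      # uint8
FRAME_TIME = 16         # int32 + float + int32 + int32
BLEND_SHAPE_COUNT = 1   # uint8
BLEND_SHAPES = 61 * 4   # EARFaceBlendShape::MAX floats

fixed = PACKET_VERSION + FRAME_TIME + BLEND_SHAPE_COUNT + BLEND_SHAPES
assert fixed + 1 + 1 == 264      # minimum: 1-char subject name and device ID
assert fixed + 256 + 256 == 774  # maximum: 256-char subject name and device ID
```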
The layout is defined as:
- PacketVersion -> 1 byte (uint8_t)
- FrameTime -> 16 bytes (int32 + float + int32 + int32)
- BlendShapeCount -> 1 byte (uint8_t)
- List of blendshape values -> Blendshape Count * 4 bytes (float)
- Subject Name -> Name Length * 1 byte (char)
- Device ID -> ID Length * 1 byte (char)
A maximum of 61 blendshapes is supported. See the Apple ARKit docs for more info.
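Following that layout, the fixed-size prefix of a frame can be decoded in a few lines of Python. This is a sketch, not part of LLV itself: the big-endian byte order is an assumption, and since the layout above does not specify how the lengths of the trailing subject name and device ID are encoded, those are returned as raw bytes:

```python
import struct

def parse_frame(data: bytes) -> dict:
    """Decode the fixed-size prefix of a raw frame per the layout above."""
    offset = 0
    # PacketVersion -> 1 byte (uint8_t)
    (version,) = struct.unpack_from('>B', data, offset)
    offset += 1
    # FrameTime -> 16 bytes: frame number (int32), subframe (float),
    # frame rate numerator (int32) and denominator (int32)
    frame, subframe, rate_num, rate_den = struct.unpack_from('>ifii', data, offset)
    offset += 16
    # BlendShapeCount -> 1 byte (uint8_t)
    (count,) = struct.unpack_from('>B', data, offset)
    offset += 1
    # Blendshape values -> count * 4 bytes (float)
    blendshapes = struct.unpack_from(f'>{count}f', data, offset)
    offset += 4 * count
    # Subject name and device ID follow; their length encoding is not
    # specified above, so the remainder is left unparsed.
    return {
        'version': version,
        'frame_time': (frame, subframe, rate_num, rate_den),
        'blendshapes': list(blendshapes),
        'trailing': data[offset:],
    }
```

Combined with the base64 decoding snippet above, `parse_frame(frames[0])` lets you inspect the first recorded frame.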