With the iPhone X, you can capture a very granular 3D map of someone's face from the selfie camera. It should be possible to stream this map in real time from the camera when a setting for this is enabled.
@amfang I think the main issue is that ARKit provides an entirely integrated camera. If we adopted the ARKit view controller approach, Lumina would lose the ability to manage the camera controls itself, or at least that's my thinking.
@dokun1 I checked Apple's documentation on the Vision framework with Core ML for object detection in AR (machine-learning image analysis). Although it can't stream the face map, it can achieve much the same effect as Lumina.
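One way to explore the trade-off @dokun1 describes: ARKit face tracking can be driven from a bare `ARSession` with a delegate, rather than through `ARSCNView`'s integrated view controller, which would let a library surface the face mesh through its own delegate API. A minimal sketch, assuming a TrueDepth-equipped device; the `FaceDepthStreamer` class name and the forwarding callback are hypothetical, and note that ARKit still owns the underlying `AVCaptureSession`, so Lumina's existing capture pipeline could not run at the same time:

```swift
import ARKit

// Illustrative sketch: stream per-frame face geometry from a bare ARSession,
// without adopting ARKit's view controller / ARSCNView setup.
final class FaceDepthStreamer: NSObject, ARSessionDelegate {
    private let session = ARSession()

    // Hypothetical callback a host library could expose to consumers.
    var onFaceMesh: (([simd_float3]) -> Void)?

    func start() {
        // Face tracking requires the TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func stop() {
        session.pause()
    }

    // Fires whenever ARKit updates tracked anchors, i.e. every frame
    // while a face is visible.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // The face mesh: ~1,220 vertices in face-anchor space.
            onFaceMesh?(faceAnchor.geometry.vertices)
        }
    }
}
```

This keeps the streaming surface a plain delegate/closure, closer to Lumina's existing API style, but the camera feed itself would still come from ARKit's `ARFrame.capturedImage` rather than Lumina's own session.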