This project is the companion code for the first lesson of the HoloKit tutorial. It covers topics such as HoloKit stereo rendering, placing AR objects, and interacting with them.
Mofo is an augmented reality experience and one of the case studies in the HoloKit tutorial. It lets users place a particle-style Buddha at any location and trigger interactive effects with hand gestures. Built on the HoloKit SDK, the experience supports stereo rendering, so users can pair it with the accompanying HoloKit hardware for an enhanced experience.
After building this project as an iPhone app, scan the surfaces around you (e.g., the floor), then tap the screen to place a particle-style Buddha.
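Tap-to-place is typically implemented by raycasting from the touch point against detected planes. Below is a minimal sketch using AR Foundation's `ARRaycastManager`; the class and field names (`TapToPlace`, `buddhaPrefab`) are illustrative and may differ from the scripts in this project.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Minimal tap-to-place sketch: raycast the touch point against detected
// planes and spawn the prefab at the hit pose.
public class TapToPlace : MonoBehaviour
{
    [SerializeField] private GameObject buddhaPrefab;      // assigned in the Inspector
    [SerializeField] private ARRaycastManager raycastManager;

    private static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    private void Update()
    {
        if (Input.touchCount == 0)
            return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
            return;

        // Raycast against detected planes (e.g., the scanned floor).
        if (raycastManager.Raycast(touch.position, s_Hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = s_Hits[0].pose;
            Instantiate(buddhaPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```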
Using the hand-tracking feature of the HoloKit SDK, the app lets players interact with the Buddha with their hands.
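As a rough sketch of how such an interaction can be wired up, the script below fires a particle effect when a tracked hand gets close to the Buddha. The `handTransform` field is a stand-in for whatever hand-joint transform the HoloKit SDK's hand tracking exposes; it is an assumption for illustration, not the SDK's actual API.

```csharp
using UnityEngine;

// Sketch: trigger an effect when a tracked hand comes within reach of the
// Buddha. handTransform is hypothetical and would be driven by the HoloKit
// SDK's hand-tracking output in the real project.
public class HandProximityTrigger : MonoBehaviour
{
    [SerializeField] private Transform handTransform;        // hypothetical: fed by SDK hand tracking
    [SerializeField] private ParticleSystem burstEffect;
    [SerializeField] private float triggerDistance = 0.15f;  // meters

    private void Update()
    {
        if (handTransform == null)
            return;

        // Play the particle burst once the hand is close enough.
        if (Vector3.Distance(handTransform.position, transform.position) < triggerDistance
            && !burstEffect.isPlaying)
        {
            burstEffect.Play();
        }
    }
}
```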
Additionally, the app offers stereo rendering options for an enhanced mixed reality experience using HoloKit hardware.
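Switching between mono (full-screen) and stereo rendering usually comes down to flipping a render-mode property on the HoloKit camera rig. The sketch below shows the pattern; the `HoloKitCameraManager` and `ScreenRenderMode` names follow the holokit-unity-sdk convention but vary across SDK versions, so check them against the version this project uses.

```csharp
using UnityEngine;
using HoloKit; // assumed namespace of the HoloKit Unity SDK

// Sketch of a UI handler that toggles between mono and stereo rendering.
public class RenderModeToggle : MonoBehaviour
{
    [SerializeField] private HoloKitCameraManager cameraManager;

    // Hook this up to the "Stereo" button's OnClick event.
    public void ToggleRenderMode()
    {
        cameraManager.ScreenRenderMode =
            cameraManager.ScreenRenderMode == ScreenRenderMode.Mono
                ? ScreenRenderMode.Stereo
                : ScreenRenderMode.Mono;
    }
}
```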
The HoloKit tutorial aims to provide a series of engaging, hands-on courses that gradually explore the AR development process and related technologies on the Unity platform, and it guides users in using the HoloKit SDK effectively.
In the linked article below, we build this application step by step from scratch. If you want to learn how it works or use it as a reference, please visit:
This project uses the HoloKit SDK and targets iOS devices.
- Unity 2022.3.8f1
- Xcode 14.2
- iPhone with LiDAR capability
- Clone the project and open it with Unity
- Open the scene at: Assets -> Scene -> Buddha_PlacingWithTouch
- Build the scene to an Xcode project: go to File -> Build Settings -> Build
- Open the generated project in Xcode
- Build and run the app on your device
- Open the app and scan a flat surface (e.g., the floor)
- Tap the screen to create a Buddha
- Get close and use your hand to interact with the Buddha
- Tap the “Stereo” button to switch the rendering mode