[Spike] Define MR library to use #587
### Meta Horizon OS Jandig Version Viability

Jandig, as a platform for augmented reality (AR) experiences, fundamentally relies on access to a camera feed with sufficient resolution. Unfortunately, Meta currently does not allow developers direct access to the raw camera output on its headsets. Instead, Meta offers a "camera passthrough" feature, where the headset processes the raw image internally and provides developers with a filtered output: a rendered virtual environment. This allows users to see their physical surroundings while wearing the headset, but it prevents our platform from using object tracking or image detection, which are essential for our current approach to AR.

On September 25, 2024, Meta announced that it plans to release an enhanced passthrough API sometime this year, featuring object tracking and additional capabilities. While this update could potentially enable the features we need, there is no confirmed release date yet.

### Workaround Strategy

Since we cannot rely on pattern or image detection to place virtual content dynamically, we can instead create an experience where users manually set up their exhibition, positioning each artwork within the scene beforehand. However, a key limitation is that users will not see "markers" once they remove the headset.

### Proposed Experience Flow
### Downside

Our current experience sparks curiosity in our users by mixing a physical marker with its virtual content. This new approach breaks that part of the mixed experience, keeping only the physical surroundings: artworks no longer rely on physical markers.

### Conclusion

Even though this workaround does not provide a fully seamless AR experience, it allows us to start developing for Horizon OS and gain valuable experience with the platform. By the time Meta's enhanced passthrough feature is released, we will have already laid much of the groundwork, making it easier to integrate object tracking and refine the experience. This marks the beginning of our journey toward making Jandig a key player in Horizon OS's AR ecosystem.

### References
I'm delighted to read this plan. I agree that we should start with a workaround to lay the groundwork.
I'm building on your proposal.
### Proposed Experience Flow
**A. In the CMS**
*Since it's similar to our current workflow, I'm not including details.*
1. The **artist** uploads *objects* (2D or 3D, with or without audio) in the CMS.
2. The **curator** creates an **MR exhibition** in the CMS and includes the **objects**.
**B. In a headset, the curator**
*The UX itself needs to be figured out. I'm including step-by-step so we can understand all the elements that need to be created. I'm assuming an exhibition was already visited using the headset before.*
1. Opens the Jandig app. The app opens the last *exhibition* (AR or MR) that was visited.
2. Clicks the hamburger button. The app shows the list of **MR exhibitions** and, at the bottom, two buttons: *AR exhibitions* and **Login**.
3. Clicks **Login**.
4. Selects **Setup Exhibit**. It shows the list of **MR exhibits** the user created.
5. Selects the pre-created *MR exhibition* from a menu. It opens the camera view with all the contents in a matrix 2 meters away from the **curator**. The last items of the matrix are boxes labeled "**Save**," "**Save and Exit**," and "**Exit Without Saving**."
6. Presses the trigger while "touching" an object to "hold" it, and can then move it.
7. Clicks **Save** when they want to save the current positions.
8. Clicks **Save and Exit** or **Exit Without Saving** when finished.
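The "matrix 2 meters away" in step 5 could be sketched as below. This is a minimal, hypothetical helper (names, grid dimensions, and the y-up / negative-z-forward coordinate convention are all assumptions, not an existing Jandig API):

```typescript
// Lay out exhibition objects in a grid ("matrix") a fixed distance in
// front of the curator, as in step 5 of the curator flow. Hypothetical
// sketch: column count, spacing, and eye height are illustrative.

interface Vec3 { x: number; y: number; z: number; }

function initialGridPositions(
  count: number,
  columns = 4,
  spacing = 0.5,   // meters between grid cells
  distance = 2     // meters in front of the curator
): Vec3[] {
  const positions: Vec3[] = [];
  for (let i = 0; i < count; i++) {
    const col = i % columns;
    const row = Math.floor(i / columns);
    positions.push({
      x: (col - (columns - 1) / 2) * spacing, // centered horizontally
      y: 1.5 - row * spacing,                 // start near eye height
      z: -distance,                           // in front of the viewer
    });
  }
  return positions;
}
```

The "Save" / "Save and Exit" / "Exit Without Saving" boxes from step 5 would simply be appended as the last entries of this same grid.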
**C. In a headset, the public**
*I'm assuming it's the first time the headset is used to visit an exhibition.*
1. Opens the Jandig app. The app shows the list of **MR exhibitions** and, at the bottom, two buttons: *AR exhibitions* and **Login**.
2. Clicks the **MR exhibition** they want to open. It shows the MR exhibition with the **objects** in the position they were saved.
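For the positions to survive between the curator's "Save" and the public's visit, something like the record below would need to be persisted. This is a sketch of one possible payload shape; the field names and versioning scheme are assumptions, not an existing Jandig schema:

```typescript
// Hypothetical shape of a saved placement: written when the curator
// clicks "Save", and restored verbatim when the public opens the
// MR exhibition. Field names are illustrative.

interface Placement {
  objectId: string;
  position: [number, number, number];
  rotation: [number, number, number, number]; // quaternion
}

function serializePlacements(placements: Placement[]): string {
  return JSON.stringify({ version: 1, placements });
}

function restorePlacements(payload: string): Placement[] {
  const data = JSON.parse(payload);
  if (data.version !== 1) {
    throw new Error(`unsupported placement version: ${data.version}`);
  }
  return data.placements;
}
```

Storing rotation as a quaternion keeps the format engine-agnostic, whichever renderer replaces AR.js in the headset viewer.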
### Notes
- I used **artist**, **curator**, and **public** so it's easier to understand what the desired outcome of each interaction is (and to make the permissions clear).
- I used CMS to refer to the current CMS web interface we have.
- I used the term **object** to mean that any (current) object can be placed.
- I created the expression **MR exhibition** so we can differentiate what will be shown on smartphones and headsets. The list of MR exhibitions won't be shown in the (smartphone) AR Viewer, but both will be shown in the (headset) MR Viewer.
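The AR/MR distinction in the last note could be modeled as below. A minimal sketch with illustrative names (not an existing Jandig data model): the MR Viewer lists both kinds, while the AR Viewer lists only AR exhibitions.

```typescript
// Hypothetical data model for the AR vs. MR exhibition split described
// in the notes. Type and function names are assumptions.

type ExhibitionKind = "AR" | "MR";

interface Exhibition {
  id: number;
  title: string;
  kind: ExhibitionKind;
}

function visibleIn(
  viewer: "AR" | "MR",
  exhibitions: Exhibition[]
): Exhibition[] {
  // The (headset) MR Viewer shows everything; the (smartphone)
  // AR Viewer hides MR exhibitions.
  return viewer === "MR"
    ? exhibitions
    : exhibitions.filter((e) => e.kind === "AR");
}
```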
Looks great! We also have another option for the setup experience:
I came up with this experience when I imagined wanting to set up an exhibition, for example in my room, without having created it and without permission to edit it, just like today, if I wanted to print the markers of an exhibition and put them up in my house.
That's an exciting use case! I'll put it in my own words to make sure I understand it: a curator wants to set up a specific exhibition in a space but has not previously created it, and they can't, since they don't have permission to edit the exhibition.

Anyone being able to set up an exhibition works well when a small public is visiting it. But with a large public, someone may think moving the exhibition is part of the experience and change what the curator intended. So, we should keep the required permissions to edit the exhibition and add a **Clone** button, so a curator can remix an MR exhibit into another space without changing the original.

I created a "fork" of step B for the use case where a curator wants to set up an exhibition created by another curator.

**B2. In a headset, the curator**
What do you think? @Kimberlyrenno, I wonder whether it would be better to enable curators to clone in the CMS instead of in the MR editor. Fun fact: I consider the hamburger button in step 2 to be an element that can be positioned when the exhibition is set up.
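Whether cloning happens in the CMS or in the MR editor, the operation itself is the same. A hedged sketch, with hypothetical names and fields (not an existing Jandig API): the clone gets a new id and owner, and deep-copies the placements so repositioning it never touches the original.

```typescript
// Hypothetical Clone operation: copies another curator's MR exhibition,
// including saved placements, into a new exhibition owned by the cloning
// curator. The original stays untouched.

interface MRExhibition {
  id: number;
  ownerId: number;
  title: string;
  placements: { objectId: string; position: [number, number, number] }[];
}

function cloneExhibition(
  source: MRExhibition,
  newOwnerId: number,
  newId: number
): MRExhibition {
  return {
    id: newId,
    ownerId: newOwnerId,
    title: `${source.title} (clone)`,
    // Deep-copy placements so moving objects in the clone
    // never mutates the source exhibition.
    placements: source.placements.map((p) => ({
      objectId: p.objectId,
      position: [...p.position] as [number, number, number],
    })),
  };
}
```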
We need to update our AR viewer to an MR viewer. This will deprecate AR.js.
Tasks: