Add support for the OpenXR Eye gaze interaction extension #77989
Conversation
Force-pushed from 7f9b019 to 145c56d.
I don't know that any browser actually implements it, but in WebXR eye gaze is also represented as a controller-ish thing. Although, I don't think we'd currently expose it to Godot as a controller. Does it make sense to add a new tracker type? I just want to make sure WebXR and OpenXR end up doing things as similarly as possible :-)
@dsnopek With OpenXR it just becomes part of the action system, so we don't really differentiate between types. In that way the node being called XRController3D and the positional tracker having a controller type is kind of misleading, but seeing as it's the 99% use case, it helps people do the normal stuff.
Ok, just for future reference, a bit of background information. This PR in theory is ready, but we're having some issues testing due to a lack of capable devices. I'm working with Meta and Bytedance to see if we can get to a point where we can prove this works correctly. As we're in feature freeze atm there is no rush; this is a 4.2 feature.

Anyway, the problem around this extension is that it's generally only available on "Pro" versions of headsets, i.e. Quest Pro, PICO 4 Pro, etc. There is currently an ongoing discussion on how to deal with the scenario where an OpenXR runtime (the software written by the HMD vendor) supports an extension, but it is only available on some devices. So both the Quest and PICO runtimes report this extension as available because the runtime implements it, but it will only work on the "Pro" versions of the respective headsets. There is some debate on how the runtime should react on devices that do not support the feature. With this extension we use …

Of course we can't merge this PR until the above is resolved and headsets have been patched, or Godot will stop working on affected headsets. To be continued...
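For reference, at the raw OpenXR API level the "runtime implements it" check boils down to instance extension enumeration. A minimal sketch follows (not the code in this PR; the function name is just for illustration), showing why both Quest and PICO runtimes report the extension as available regardless of the actual device:

```cpp
// Sketch: ask the OpenXR runtime whether it advertises XR_EXT_eye_gaze_interaction.
// This only tells us the runtime implements the extension, not that the headset
// in use actually has an eye tracker (see the discussion above).
#include <cstring>
#include <vector>

#include <openxr/openxr.h>

bool runtime_advertises_eye_gaze() {
	uint32_t count = 0;
	if (XR_FAILED(xrEnumerateInstanceExtensionProperties(nullptr, 0, &count, nullptr))) {
		return false;
	}

	std::vector<XrExtensionProperties> properties(count, { XR_TYPE_EXTENSION_PROPERTIES });
	if (XR_FAILED(xrEnumerateInstanceExtensionProperties(nullptr, count, &count, properties.data()))) {
		return false;
	}

	for (const XrExtensionProperties &property : properties) {
		if (std::strcmp(property.extensionName, XR_EXT_EYE_GAZE_INTERACTION_EXTENSION_NAME) == 0) {
			return true;
		}
	}
	return false;
}
```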
modules/openxr/openxr_interface.h
XR_EYE_GAZE_LIMITED, // The eye gaze interaction is available in limited form,
XR_EYE_GAZE_SUPPORTED, // The eye gaze interaction is supported
Let's rename these two to explicitly refer to the interaction:
XR_EYE_GAZE_INTERACTION_LIMITED
XR_EYE_GAZE_INTERACTION_SUPPORTED
I'm thinking of changing this altogether. The specification language is a bit unclear and misleading.

From what I understand, the extra structure we can populate during session creation determines whether gaze interaction is available, not whether its full feature set is supported.

Basically, the extension being available means the runtime supports it; the struct queries whether the device in use actually supports it. So, for instance, the Meta XR runtime supports eye tracking, so the extension will always be available, but only the Quest Pro actually has an eye tracker, so only that device will report that gaze is actually supported.

So the outcome of this should be a simple yes/no: we can use the eye gaze interaction if the extension is available AND we get true on the supported flag.
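At the OpenXR API level that combined check would look roughly like the sketch below, using the extension's system properties struct chained into xrGetSystemProperties (a sketch only, not the exact code in this PR; the function name and parameters are assumed):

```cpp
// Sketch: the extension being enabled only means the runtime implements it;
// the per-device answer comes from XrSystemEyeGazeInteractionPropertiesEXT.
#include <openxr/openxr.h>

bool eye_gaze_interaction_supported(XrInstance p_instance, XrSystemId p_system_id, bool p_extension_enabled) {
	if (!p_extension_enabled) {
		return false; // Runtime does not implement XR_EXT_eye_gaze_interaction at all.
	}

	XrSystemEyeGazeInteractionPropertiesEXT gaze_properties = { XR_TYPE_SYSTEM_EYE_GAZE_INTERACTION_PROPERTIES_EXT };
	XrSystemProperties system_properties = { XR_TYPE_SYSTEM_PROPERTIES };
	system_properties.next = &gaze_properties; // Chain the extension struct.

	if (XR_FAILED(xrGetSystemProperties(p_instance, p_system_id, &system_properties))) {
		return false;
	}

	// Only true on hardware that actually has an eye tracker (e.g. Quest Pro).
	return gaze_properties.supportsEyeGazeInteraction == XR_TRUE;
}
```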
Hi, we've recently tried out this PR with the Quest Pro. The following steps needed to be taken to make it work on our end:
Then, eye tracking worked flawlessly. (cc @BennytheBomb) [1]
[2]
package com.godot.game;

import org.godotengine.godot.FullScreenGodotApp;

import android.content.pm.PackageManager;
import android.os.Bundle;

/**
 * Template activity for Godot Android builds.
 * Feel free to extend and modify this class for your custom logic.
 */
public class GodotApp extends FullScreenGodotApp {
	private static final String PERMISSION_EYE_TRACKING = "com.oculus.permission.EYE_TRACKING";
	private static final int REQUEST_CODE_PERMISSION_EYE_TRACKING = 1;

	@Override
	public void onCreate(Bundle savedInstanceState) {
		setTheme(R.style.GodotAppMainTheme);
		super.onCreate(savedInstanceState);
		requestEyeTrackingPermissionIfNeeded();
	}

	private void requestEyeTrackingPermissionIfNeeded() {
		if (checkSelfPermission(PERMISSION_EYE_TRACKING) != PackageManager.PERMISSION_GRANTED) {
			requestPermissions(new String[] { PERMISSION_EYE_TRACKING }, REQUEST_CODE_PERMISSION_EYE_TRACKING);
		}
	}
}

[3] https://developer.oculus.com/documentation/native/android/move-eye-tracking/
Thanks @tom95, amazing to hear it's working on an actual device. We're having some issues with PICO atm that will prevent merging until they fix an issue on their end (or we'll break PICO support). @m4gr3d we'll need to look into how we're going to embed the permission, and look further into what permissions other vendors come up with. Seeing as this is a core extension, this should really be a core permission.
Force-pushed from 145c56d to ad2beb3.
Feedback from PICO is that we need to add …
@BastiaanOlij, should this permission be added to the plugin refactor?
Yeah, I think that's worth doing. We still need to think about how we're going to implement asking for the correct permission, and checking in the module whether permission has been given.
That's where the feature tags come into play. There are two scenarios:
In both scenarios the extension is gated behind the feature tag check using custom features. We may need to add logic so that the plugin can request extensions to be enabled on demand after startup.
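As a rough sketch of what such a gate could look like on the engine side (the feature tag name below is made up for illustration; the real tag and the on-demand request mechanism are exactly what still needs to be decided):

```cpp
// Hypothetical sketch only: gate requesting the eye gaze extension behind a
// custom export feature tag, so a vendor export preset (or plugin) can opt in
// only on devices known to have eye tracking hardware.
#include "core/os/os.h"

bool should_request_eye_gaze_extension() {
	// "xr_eye_gaze" is an assumed custom feature tag, set per export preset.
	return OS::get_singleton()->has_feature("xr_eye_gaze");
}
```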
Superseded by #82614.
With eye tracking becoming a more prevalent feature, it's time we add support for this.
This PR adds support for the core OpenXR eye gaze extension.
This is mostly untested at the moment:
This PR will remain in draft until the above are resolved.
In order to use this feature a new interaction profile needs to be added to the action map:
Then in your XR setup a new XRController3D node can be added like so:

This node will be positioned correctly and "point" in the direction where the user is looking. For HMD-based eye tracking the location will be centred between the eyes of the user.
As an example, a ray cast was added in the above screenshot; this could be used to detect the object the user is looking at.
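For reference, underneath that action map entry the binding resolves to the standard eye gaze interaction profile paths defined by XR_EXT_eye_gaze_interaction. A minimal sketch at the OpenXR API level (the pose action is assumed to already exist, and error handling is omitted; this is not the exact code in this PR):

```cpp
// Sketch: suggest a binding for a pose action on the eye gaze interaction profile.
#include <openxr/openxr.h>

void suggest_eye_gaze_binding(XrInstance p_instance, XrAction p_gaze_pose_action) {
	XrPath profile_path;
	xrStringToPath(p_instance, "/interaction_profiles/ext/eye_gaze_interaction", &profile_path);

	XrPath gaze_pose_path;
	xrStringToPath(p_instance, "/user/eyes_ext/input/gaze_ext/pose", &gaze_pose_path);

	XrActionSuggestedBinding binding = { p_gaze_pose_action, gaze_pose_path };

	XrInteractionProfileSuggestedBinding suggested = { XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING };
	suggested.interactionProfile = profile_path;
	suggested.countSuggestedBindings = 1;
	suggested.suggestedBindings = &binding;

	xrSuggestInteractionProfileBindings(p_instance, &suggested);
}
```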
Todos: