Clarify the normalization range for eye gaze #76
Conversation
The contents of this PR look off; could someone else provide a second opinion? I could be wrong, but from my testing it appears to be the following:

where EyeLeftX is between -1 and 1 and angleInRadians is between −π/2 and π/2 (i.e. -90 to 90 degrees).

Testing with a Vive Pro Eye, I've been using the output of EyeLeftX;EyeRightX;EyeY to control an eye tracking menu, and the results appear to be accurate. I've also found, while building an animator layer that tries to animate the eyes as accurately as possible, that neither Quaternion interpolation nor Euler interpolation produced accurate results when animating a transform rotation in a blend tree (I cannot speak for muscle values). So, if the objective is maximum accuracy, I've personally had to:
et-layout.mp4

In the video above, the white circle is controlled and displayed by a SteamVR Overlay using the output of the VRCFaceTracking EyeLeftX;EyeRightX;EyeY parameters, and the pink laser is controlled by the aforementioned animator.
I wonder if the driver matters; I'm using a Quest Pro and have to use 45 degrees for it to work.
The direct mapping is from SRanipal, and I cannot for the life of me find their SDK anymore that details what their normalization function did.
Hi, I can clarify the exact mapping of the eye gaze. The eye gaze is based on Tobii's normalized cartesian measurement, which maps a measurement in radians to a normalized (-1 to 1) X/Y cartesian. To get the actual degrees value from this cartesian, you need to use the following function:

The mapping readouts from SRanipal max out around ~-30 to ~30 degrees, so they never exceed ~0.68 on the coordinate system. Furthermore, the maximum theoretical range of ~-45 to ~45 degrees should still map onto the coordinate system (all Quest Pro modules map their quaternions to normalize to the coordinate system).
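The exact function referred to above isn't reproduced in this thread, but the quoted ranges (a normalized value of 1.0 corresponding to the ~45° theoretical maximum, and ~0.68 corresponding to roughly ~30°) are consistent with the normalized value being the tangent of the gaze angle, i.e. a cartesian coordinate on a plane at unit distance. A minimal sketch under that assumption (the function names are mine, not from the SDK):

```python
import math

def normalized_gaze_to_degrees(x: float) -> float:
    """Convert a normalized (-1..1) gaze coordinate to a gaze angle in degrees.

    Assumes the normalized value is tan(angle), which matches the ranges
    quoted in the thread: 1.0 -> 45 deg, ~0.68 -> ~34 deg.
    """
    return math.degrees(math.atan(x))

def degrees_to_normalized_gaze(deg: float) -> float:
    """Inverse mapping: gaze angle in degrees back to the normalized value."""
    return math.tan(math.radians(deg))
```

Under this mapping the relation between angle and normalized value is non-linear, which is what the later comments about "baking in the atan curve" are referring to.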
Honestly, I think a better fix for this is just providing a new set of parameters normalized in degrees, instead of the not-very-useful normalization that is currently provided.
Seems like this may partly exist already, but reading this array setup is hurting my head; the comment also makes it seem like this may not occur when the other parameters exist. Also, the parameter isn't documented, and it doesn't seem normalized either.
The eye parameters are just not normalized linearly. Are you wanting a set of parameters that are linearly normalized?
These are only for native eye tracking; they are not directly user-accessible (e.g. you would essentially be working with the eye bones themselves if you wanted any interaction in Unity/VRChat).
Ideally yes, because them being mapped non-linearly makes building animations based on them difficult, since you have to bake the atan curve into an animation.
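For illustration, a linearly normalized parameter could in principle be derived from the existing one. This is a sketch only, assuming the current value is tan(angle) with a ±45° theoretical range as discussed above; the name and constant are hypothetical, not part of VRCFaceTracking:

```python
import math

MAX_ANGLE_DEG = 45.0  # assumed theoretical maximum from the discussion above

def linearize_gaze(x: float) -> float:
    """Map the assumed tangent-based normalized value to a value that is
    linear in the gaze angle, still spanning -1..1 (45 deg -> 1.0)."""
    return math.degrees(math.atan(x)) / MAX_ANGLE_DEG
```

With a parameter like this, a constant angular velocity of the eye would move the parameter at a constant rate, so animation curves would not need the atan correction baked in.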
Made a mockup PR for this: benaclejames/VRCFaceTracking#263. Can't figure out how to build the project yet, as it just errors when I do.
Took me a bit to find this info and then apply it; it should be in the docs.