
Clarify the normalization range for eye gaze #76

Open · wants to merge 1 commit into base: master

Conversation

Happyrobot33 (Author) commented

Took me a bit to find this info and then apply it; it should be in the docs.

@hai-vr (Contributor) commented Feb 28, 2025

The contents of this PR look off; could someone else provide a second opinion on this?

I could be wrong but from my testing, it appears to be the following:

angleInRadians <- arcsin(EyeLeftX)

where EyeLeftX is between -1 and 1 and angleInRadians is between −π/2 and π/2 (meaning -90 to 90 degrees)

Testing with a Vive Pro Eye, I've been using the output of EyeLeftX;EyeRightX;EyeY to control an eye-tracking menu, and the results appear to be accurate:

I've also found, while building an animator layer that attempts to animate the eyes as accurately as I can, that neither Quaternion interpolation nor Euler interpolation seemed to produce accurate results when animating a transform rotation in a blend tree (I cannot speak for muscle values). So, if the objective is maximum accuracy, I've personally had to:

(video: et-layout.mp4)

In the video above, the white circle is controlled and displayed by a SteamVR Overlay using the output of VRCFaceTracking EyeLeftX;EyeRightX;EyeY parameters, and the pink laser is controlled by the aforementioned animator.
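
For reference, a minimal sketch of the mapping described above (the helper name is mine, not part of any SDK):

```csharp
using System;

static class EyeGazeSketch
{
    // Sketch of the observed mapping: angleInRadians = arcsin(EyeLeftX),
    // with EyeLeftX in -1..1 and the resulting angle in -π/2..π/2 (-90°..90°).
    public static double AngleRadiansFromEyeX(double eyeLeftX) => Math.Asin(eyeLeftX);
}
```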

@Happyrobot33 (Author) commented

> The contents of this PR look off; could someone else provide a second opinion on this? […]

I wonder if the driver matters; I'm using a Quest Pro and have to do 45 degrees for it to work.

@hai-vr (Contributor) commented Feb 28, 2025

> I wonder if the driver matters; I'm using a Quest Pro and have to do 45 degrees for it to work.

I could be wrong, but it might be because arcsin(EyeX) and EyeX * 60° are empirically close for reasonable values of EyeX (???)

(image: plot comparing arcsin(EyeX) with the linear EyeX * 60° mapping)
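
A quick numeric check of that observation (sketch code; the class name is mine):

```csharp
using System;

class ArcsinVsLinear
{
    static void Main()
    {
        // Compare arcsin(x) in degrees against the linear x * 60° mapping.
        foreach (double x in new[] { 0.1, 0.3, 0.5, 0.7 })
        {
            double asinDeg = Math.Asin(x) * (180.0 / Math.PI);
            Console.WriteLine($"x={x:F1}: arcsin={asinDeg:F1}°, linear={x * 60:F1}°");
        }
        // Output: 5.7° vs 6.0°, 17.5° vs 18.0°, 30.0° vs 30.0°, 44.4° vs 42.0° —
        // the two mappings agree closely over the typical range.
    }
}
```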

@kusomaigo (Contributor) commented

The direct mapping is from SRanipal, and for the life of me I can no longer find the part of their SDK that details what their normalization function did.

@regzo2 (Contributor) commented Mar 3, 2025

Hi, I can clarify the exact mapping of the eye gaze. The eye gaze is based on Tobii's normalized cartesian measurement, which maps a measurement in radians to a normalized (-1 to 1) X/Y cartesian coordinate.

To get the actual angle from this cartesian, use the function y = arctan(x), where x is an X/Y component and y is the corresponding angle in radians; multiply by 180/π to get the value in degrees. The arcsin function is very close and should give extremely similar values to arctan.

The mapping readouts from SRanipal max out around ~-30 to ~30 degrees, so they never exceed ~0.68 on the coordinate system. Furthermore, the theoretical maximum of ~-45 to ~45 degrees still maps onto the coordinate system, since tan(45°) = 1 lands exactly at the edges of the -1 to 1 range (all Quest Pro modules map their quaternions to normalize to the coordinate system).

This is where the function was derived from:
(image: figure from the cited paper illustrating the normalized cartesian gaze coordinates)

Source:
https://pmc.ncbi.nlm.nih.gov/articles/PMC7527608/
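
If I'm reading this correctly, the conversion in both directions would look like the following sketch (the helper names are mine, not part of VRCFaceTracking or Tobii's SDK):

```csharp
using System;

static class GazeMapping
{
    // Normalized gaze component (-1..1) -> angle in degrees.
    // Per the description above, the component encodes tan(angle),
    // so arctan recovers the angle; atan(1) = 45°, covering ~±45°.
    public static double NormalizedToDegrees(double normalized)
        => Math.Atan(normalized) * (180.0 / Math.PI);

    // Inverse: angle in degrees -> normalized component.
    public static double DegreesToNormalized(double degrees)
        => Math.Tan(degrees * (Math.PI / 180.0));
}
```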

@Happyrobot33 (Author) commented Mar 3, 2025

So from what I can tell, if you do a direct 45-degree linear mapping, you'll end up with at most about -4 degrees of offset at the middle of travel to either side.
(image: plot of the offset between the linear 45° mapping and arctan)
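
A quick numeric check of that claim (a sketch, assuming the arctan mapping described above; the class name is mine):

```csharp
using System;

class LinearMappingError
{
    static void Main()
    {
        // Compare the linear mapping (x * 45°) with the exact mapping
        // (atan(x) in degrees) across the normalized range 0..1.
        double worst = 0, worstX = 0;
        for (double x = 0; x <= 1.0; x += 0.001)
        {
            double error = x * 45.0 - Math.Atan(x) * (180.0 / Math.PI);
            if (Math.Abs(error) > Math.Abs(worst)) { worst = error; worstX = x; }
        }
        // Prints roughly: max error -4.08° at x = 0.52.
        Console.WriteLine($"max error {worst:F2}° at x = {worstX:F2}");
    }
}
```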

@Happyrobot33 (Author) commented

Honestly, I think a better fix is to provide a new set of parameters normalized in degrees instead of the current, not-very-useful normalization.

@Happyrobot33 (Author) commented Mar 3, 2025

Seems like this may partly exist already, but reading this array setup is hurting my head; the comment also makes it seem like this may not occur when the other parameters exist. Also, the parameter isn't documented:

https://github.com/benaclejames/VRCFaceTracking/blob/46453fcda63fdcfc56663813246764fbc028d73b/VRCFaceTracking.Core/Params/Expressions/UnifiedExpressionsParameters.cs#L36

It also doesn't seem normalized.

@regzo2 (Contributor) commented Mar 4, 2025

> Honestly, I think a better fix is to provide a new set of parameters normalized in degrees instead of the current, not-very-useful normalization.

The eye parameters are just not normalized linearly. Are you wanting a set of parameters that are linearly normalized?

@regzo2 (Contributor) commented Mar 4, 2025

> Seems like this may partly exist already, but reading this array setup is hurting my head; the comment also makes it seem like this may not occur when the other parameters exist. Also, the parameter isn't documented:
>
> https://github.com/benaclejames/VRCFaceTracking/blob/46453fcda63fdcfc56663813246764fbc028d73b/VRCFaceTracking.Core/Params/Expressions/UnifiedExpressionsParameters.cs#L36
>
> It also doesn't seem normalized.

These are only for native eye tracking; they are not directly user-accessible (e.g., you would essentially be working with the eye bones themselves if you wanted any interaction in Unity/VRChat).

@Happyrobot33 (Author) commented

> > Honestly, I think a better fix is to provide a new set of parameters normalized in degrees instead of the current, not-very-useful normalization.
>
> The eye parameters are just not normalized linearly. Are you wanting a set of parameters that are linearly normalized?

Ideally, yes, because the non-linear mapping makes it difficult to build animations based on them, since you have to bake the atan curve into the animation.
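
For illustration, linearizing the current parameter at runtime might look like this sketch (assuming the arctan mapping above; the helper name is mine):

```csharp
using System;

static class GazeLinearizer
{
    // Remaps a non-linear gaze component (-1..1, encoding tan(angle))
    // to a linear -1..1 range covering ±45°, since atan(1) = π/4 rad = 45°.
    public static double Linearize(double normalized)
        => Math.Atan(normalized) / (Math.PI / 4.0);
}
```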

@Happyrobot33 (Author) commented

Made a mockup PR for this: benaclejames/VRCFaceTracking#263

Can't figure out how to build the project yet, as it just errors when I do.
