Gaze
(Original authors: M. Chollet & D. Simonetti)
Gaze is one of the nonverbal cues used by humans during interactions. It can be used to communicate, to manage attention, and thereby to trigger social and cognitive processes in people. It helps make the interaction between ECAs and humans more natural.
In the Greta platform it is possible to use gaze behavior. The gaze model built into the platform is based on the study of Pejsa, Andrist, Gleicher, & Mutlu (2015).
Gaze behavior is based on coordinated movements of the eyes, head, shoulders and trunk toward objects and information in the environment (for example, other agents). The model takes as input the position of the target and the orientation of the body parts involved in the movement (eyes, head, shoulders and torso). Given these data, and applying a set of kinematic laws derived from measurements of primate gaze reported in neurophysiology research (Guitton & Volle, 1987; McCluskey & Cullen, 2007), the model computes the shifts of the body parts necessary to look at the target. The shifts are achieved by rotating the body joints, which are distributed into several groups that are controlled jointly. Neck joints (cervical vertebrae) are grouped together under the head: as in humans, rotating the head involves the simultaneous rotation of all the joints that constitute it. In the same way, shoulder rotation is achieved through simultaneous rotation of the thoracic vertebrae, and trunk rotation through distributed rotation of the lower spine joints (lumbar vertebrae).
The agents perform gaze behavior based on commands received via FML or BML. To process a BML or FML command, the Environment module must be connected to the BehaviorRealizer; this allows the agent to retrieve the angles to the target. If it is not connected, the agent will not be able to process gazes with a target, only gazes specified with offsetDirection and offsetAngle.
In BML, two different types of gaze can be specified:
- <gaze/>: to temporarily direct the gaze of the character towards a target;
- <gazeShift/>: to permanently change the gaze direction of the character towards a certain target.
Below is an example of a BML command for the simple gaze behavior:
<bml xmlns="http://www.bml-initiative.org/bml/bml-1.0" character="Alice" id="bml1">
<gaze id="gaze1" start="1" end="10" influence="HEAD" target="bluebox"/>
</bml>
The syntax of the command:
The command in the example instructs the agent to gaze towards the target (bluebox). Starting from second 1, the agent moves to reach the target using the eyes and head (influence="HEAD"), and by the end time (second 10) it returns to the starting position. This behavior causes the character to temporarily direct its gaze to the requested target.
The influence parameter is read as follows: EYES means 'use only the eyes'; HEAD means 'use the eyes and the head to change the gaze direction, but nothing more'; and so on.
Below is an example of a BML command for the gaze shift behavior:
<bml xmlns="http://www.bml-initiative.org/bml/bml-1.0" character="Alice" id="bml1">
<gazeShift id="gaze1" start="1" end="2" influence="HEAD" target="bluebox"/>
</bml>
The syntax is the same, except for the sync attributes: only "start" (the gaze starts moving towards the new target) and "end" (the gaze target is acquired) are used.
In this example, the command instructs the agent to change its default gaze direction towards the blue box. The gaze shift takes 1 second and employs the eyes and the head. This behavior changes the default state of the agent: after completing it, the new target becomes the default gaze direction of the character.
An FML file can include an intention that controls the gaze behavior of the agent. The command can tell the character to look at something, or use gaze to express an emotional state (e.g., when we are sad, we look down without a precise target).
To just express an emotional state, the intention (performative, emotion, iconic, etc.) can include a gaze expression. As shown in the figure, an intention can include (in the lexicon) a gaze expression whose value is specified in the Facelibrary.
To look at something or someone, the gaze behavior has to be included in a deictic intention. It can be added as a "target" attribute that gives the ID of the element to gaze at. As shown in the figure below, the FML file has a deictic intention with a target attribute; in this case there is no gaze expression in the lexicon or in the Facelibrary. The target attribute adds a new GazeSignal to the list of signals belonging to that deictic intention. In the example, the deictic-selftouch intention consists of just a gesture signal; the target attribute in the FML file adds a gaze signal to look at "Andre_chair0" in the same time slot (start and end) as the gesture signal.
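As a rough illustration, such a deictic intention might be written as follows (this is a hypothetical sketch in the FML-APML style used by Greta; the element and attribute names here are illustrative, and only the target ID Andre_chair0 is taken from the description above):
<fml-apml>
<fml>
<deictic id="d1" type="selftouch" start="1.0" end="4.0" importance="1.0" target="Andre_chair0"/>
</fml>
</fml-apml>
With such a file, the target attribute would cause a gaze signal towards Andre_chair0 to be scheduled in the same time slot as the gesture generated for the deictic-selftouch intention.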
The "target" attribute both in FML and BML is required to gaze at something. In Greta, we allow BML messages that do not contain a "target" attribute but contain both "offsetDirection" and "offsetAngle" attributes: with this kind of messages, it is possible to look at directions defined in the GazeDirection enumeration.
The target attribute should be a reference to a leaf of a TreeNode in the Greta Environment. The system accesses the leaf of that object, obtains its position in the environment, and computes the gaze shift needed to orient the agent towards the target. Therefore, everything included in the XML TreeNode Environment can be a target for gaze. When adding an object, its position should be set to the base position of the object on the x,y,z plane, not to the center of the object itself: during the gaze shift, the y position of the target is computed as the sum of the y position of the object's base and half the object's height.
Currently, the TreeNode environment contains only the objects in the scene, not the positions of the agents and their body parts. In order to gaze at another agent, the target attribute has to be written as "Agent:agentName" (e.g., target="Agent:DEFAULT_CHARACTER"). The agent name is searched among the characters in the scene and, once found, the position of that agent's head and eyes is used to compute the right rotation angle.
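For example, a BML command that makes Alice gaze at another agent could look like this (assuming an agent named DEFAULT_CHARACTER is present in the scene; the timing values are illustrative):
<bml xmlns="http://www.bml-initiative.org/bml/bml-1.0" character="Alice" id="bml1">
<gaze id="gaze1" start="1" end="5" influence="HEAD" target="Agent:DEFAULT_CHARACTER"/>
</bml>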
The influence attribute indicates which body parts are involved in the gaze movement. It should match one of the entries in the Influence enumeration: EYES, HEAD, SHOULDER, TORSO. The influence can be chosen explicitly, or left unspecified, in which case the system computes the necessary influence from the overall rotation.
For each influence, keep in mind that there are physical limits to the movements. For instance, to look at a target that is behind Greta, you should use at least the SHOULDER influence.
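When the influence attribute is omitted, the system selects it automatically. A minimal sketch, reusing the bluebox target from the earlier examples:
<bml xmlns="http://www.bml-initiative.org/bml/bml-1.0" character="Alice" id="bml1">
<gaze id="gaze1" start="1" end="3" target="bluebox"/>
</bml>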
The angle limits are not fully justified empirically: they are based on the study of Pejsa, Andrist, Gleicher, & Mutlu (2015), complemented by common-sense heuristics.
The offsetDirection and offsetAngle attributes are optional. However, if you wish to use them, you must provide both, not only a direction or an angle (for now there is no default angle or direction should one of these attributes be missing).
The offsetDirection attribute should match one of the entries in the GazeDirection enumeration: RIGHT, LEFT, UP, DOWN, UPRIGHT, UPLEFT, DOWNLEFT, DOWNRIGHT. The offsetAngle should have a value in degrees.
Example of a gaze command from character A towards chair C, using only the eyes:
<bml xmlns="http://www.bml-initiative.org/bml/bml-1.0" id="gazeBMLTest1">
<gaze id="gaze1" start="1.0" end="3.0" influence="EYES"
origin="A" target="C"/>
</bml>
Example of a gaze command that makes character A look 30 degrees to the right, using eyes, head and shoulders:
<bml xmlns="http://www.bml-initiative.org/bml/bml-1.0" id="gazeBMLTest1">
<gaze id="gaze1" start="1.0" end="3.0" influence="SHOULDER"
origin="A" offsetDirection="RIGHT" offsetAngle="30" />
</bml>
Example of a gaze command that makes character A look 10 degrees to the left of chair C, using the whole body:
<bml xmlns="http://www.bml-initiative.org/bml/bml-1.0" id="gazeBMLTest1">
<gaze id="gaze1" start="1.0" end="3.0" influence="WHOLE"
origin="A" target="C" offsetDirection="LEFT" offsetAngle="10" />
</bml>
Known issues:
- Influence of previously computed keyframes (e.g., a face nod combined with gaze)
- Influence of expressivity parameters: the head goes further than it should
References:
- Guitton, D., & Volle, M. (1987). Gaze control in humans: Eye-head coordination during orienting movements to targets within and beyond the oculomotor range. Journal of Neurophysiology, 427–459.
- McCluskey, M. K., & Cullen, K. E. (2007). Eye, head, and body coordination during large gaze shifts in rhesus monkeys: Movement kinematics and the influence of posture. Journal of Neurophysiology, 2976–2991.
- Pejsa, T., Andrist, S., Gleicher, M., & Mutlu, B. (2015). Gaze and Attention Management for Embodied Conversational Agents. ACM Transactions on Interactive Intelligent Systems.