Expressivity Parameters
People vary a lot in nonverbal communication, depending on their personality, environmental situation, social rules, etc. For example, people with an extroverted personality smile and look at their interlocutors more; a person may gesture a lot while talking because of, for example, their nationality, their current emotional state, or some other influence. Argyle [1] and Gallaher [2] state that an underlying tendency is constantly present in each person's behavior: people who tend to look more and perform a lot of gestures continue to do so in most situations.

Hartmann et al. [3] have defined the expressivity of behavior along six dimensions:
- Overall Activity - OAC: amount of activity (e.g., passive/static versus animated/engaged). This parameter influences the number of single behaviors occurring during the communication. For example, as this parameter increases, the number of head movements, facial expressions, gestures, and so on increases. Its value is a floating-point number ranging from 0 to 1, where a value of zero corresponds to no activity and a value of one corresponds to maximum activity.
- Spatial Extent - SPC: amplitude of movements (e.g., expanded versus contracted). This parameter determines the amplitude of, for example, head rotations and gestures. The attribute, like all the following, is a real number defined in the interval [−1, 1]:
  - a value of 0 corresponds to a neutral behavior, that is, the behavior of the agent without any expressivity control; in such a case, the agent performs nonverbal signals with the amplitude defined by the system designer;
  - a value of −1 corresponds to the reproduction of very small and contracted movements;
  - a value of 1 corresponds to very wide and large movements.
- Temporal Extent - TMP: duration of movements (e.g., quick versus sustained actions). This parameter modifies the speed of execution of movements: they are slow when the value of the parameter is negative, and fast when it is positive. The effect of the TMP parameter on the calculation of the agent's movements differs depending on the involved modalities. For facial parameters, the head, or the torso, TMP influences the activation time of the movement, that is, the time (and thus the speed) these parameters need to reach the position of a given signal and to return to the relaxed position. For gestures, we consider only the duration of the stroke, and we determine the timing of the other gesture phases (preparation, hold, retraction) by modulating the arm speed according to the TMP parameter. A gesture with high TMP reaches the preparation position faster than a gesture with low TMP, and its stroke is also executed faster.
- Fluidity - FLD: smoothness and continuity of movement (e.g., smooth, graceful versus sudden, jerky). Higher values allow smooth and continuous execution of movements, while lower values create discontinuity in the movements.
- Power - PWR: dynamic properties of the movement (e.g., weak/relaxed versus strong/tense). Higher (resp. lower) values increase (resp. decrease) the acceleration of the head or limbs rotation, making the overall movement look more (resp. less) powerful. Increasing this parameter also produces movement overshooting.
- Repetitivity - REP: this parameter permits the generation of rhythmic repetitions of the same rotation/expression/gesture. For example, a head nod with a high repetitivity becomes a sequence consisting of very fast and small nods.
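The value ranges above can be summarized in a small sketch. The class, field, and method names below are illustrative only and are not part of the actual Greta API: OAC is clamped to [0, 1], while the other five parameters are clamped to [−1, 1] with 0 meaning neutral behavior.

```java
// Illustrative sketch of the six expressivity parameters and their ranges.
// Names are hypothetical, not the actual Greta API.
public class ExpressivityParameters {
    // OAC: overall activity, in [0, 1]; 0 = no activity, 1 = maximum activity
    public double overallActivity = 0.5;
    // SPC, TMP, FLD, PWR, REP: in [-1, 1]; 0 = neutral (designer-defined) behavior
    public double spatialExtent = 0.0;
    public double temporalExtent = 0.0;
    public double fluidity = 0.0;
    public double power = 0.0;
    public double repetitivity = 0.0;

    /** Clamp a value into [min, max]. */
    static double clamp(double v, double min, double max) {
        return Math.max(min, Math.min(max, v));
    }

    public void setOverallActivity(double v) { overallActivity = clamp(v, 0.0, 1.0); }
    public void setSpatialExtent(double v)   { spatialExtent   = clamp(v, -1.0, 1.0); }
    public void setTemporalExtent(double v)  { temporalExtent  = clamp(v, -1.0, 1.0); }
    public void setFluidity(double v)        { fluidity        = clamp(v, -1.0, 1.0); }
    public void setPower(double v)           { power           = clamp(v, -1.0, 1.0); }
    public void setRepetitivity(double v)    { repetitivity    = clamp(v, -1.0, 1.0); }
}
```

Clamping keeps out-of-range inputs from producing undefined behavior: for example, setting SPC to −2.0 would be capped at −1.0 (very small and contracted movements).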
These dimensions capture the external, visible qualities of movement, such as its speed, amplitude, fluidity, and so on. Expressivity is an integral part of the communication process, as it can provide information on the emotional state, mood, and personality of the person.

In the Greta platform, these expressivity parameters are used both to define the agent's general behavior tendency (Baseline) and its local behavior tendencies (DynamicLine).
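One way to picture the Baseline/DynamicLine relation is as a lookup with fallback: a local (dynamic) value, when present, overrides the general tendency. This is a hypothetical sketch under that assumption; the class and method names are not the actual Greta API.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: a DynamicLine locally overrides Baseline values.
// Names are illustrative, not the actual Greta API.
public class ExpressivityLines {
    private final Map<String, Double> baseline = new HashMap<>();
    private final Map<String, Double> dynamicLine = new HashMap<>();

    public void setBaseline(String parameter, double value) { baseline.put(parameter, value); }
    public void setDynamic(String parameter, double value)  { dynamicLine.put(parameter, value); }

    /** Local (dynamic) value wins when present; otherwise fall back to the baseline. */
    public double effectiveValue(String parameter) {
        return dynamicLine.getOrDefault(parameter,
                baseline.getOrDefault(parameter, 0.0));
    }
}
```

For instance, an agent whose Baseline SPC is 0.3 (moderately wide movements) could temporarily perform contracted movements by setting a DynamicLine SPC of −0.5 for the current communicative act.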
[1] M. Argyle. Bodily Communication. Methuen & Co., London, 2nd edition, 1988.
[2] P. E. Gallaher. Individual differences in nonverbal behavior: Dimensions of style. Journal of Personality and Social Psychology, 63(1):133–145, 1992.
[3] B. Hartmann, M. Mancini, S. Buisine, and C. Pelachaud. Design and evaluation of expressive gesture synthesis for embodied conversational agents. In Third International Joint Conference on Autonomous Agents & Multi-Agent Systems, Utrecht, July 2005.