Continuous recognition of player’s affective body expression as dynamic quality of aesthetic experience




VI. Conclusions


In conclusion, we have proposed measuring the affective content of players’ body expressions as a first step towards measuring players’ aesthetic experience in the context of whole-body games. The approach is grounded in evidence of the effects of proprioceptive feedback on the experience of the player. In this paper, we propose that the aesthetic experience constitutes itself through a shift between different types of affective engagement, triggered by the proprioceptive feedback that derives from the way the player’s body is involved in the game. As a first step towards measuring aesthetic experience, we have presented a system that automatically detects the affective body expressions of the players. By continuously detecting the affective expressions conveyed by the player’s body, the system gains access to the variety of emotions the players go through, or the meanings they assign to gameplay events (i.e., how they create meaning), which together represent the rhythmic dynamics of the overall experience.
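As an illustration, the following is a minimal sketch of such continuous, window-based recognition. Here `extract_features` and `classifier` are hypothetical placeholders standing in for the dynamic movement features and the recurrent network used in our system.

```python
# Minimal sketch of continuous affective recognition over a motion-capture
# stream. `extract_features` and `classifier` are hypothetical placeholders
# for the dynamic movement features and recurrent network described above.

WINDOW = 10  # frames per analysis window, as used in this paper

def continuous_labels(frames, extract_features, classifier):
    """Slide a fixed-size window over the frame stream and label each window."""
    labels = []
    for start in range(0, len(frames) - WINDOW + 1, WINDOW):
        window = frames[start:start + WINDOW]
        feats = extract_features(window)          # e.g., speed, acceleration
        labels.append(classifier.predict(feats))  # one affective label per window
    return labels
```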

Our system reached an overall accuracy of 61.1% when tested at game-point window level, which is comparable to the observers’ agreement (61.49%) and well above chance level (25%). The performance at the 10-frame window level (58.4%) is also encouraging, as it too is well above chance level. These results are comparable to those obtained for complex expressions in the acted and non-acted studies discussed in the introduction, and to continuous non-acted labelling results for other modalities [70], [71].
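One plausible way to relate the two evaluation levels (not necessarily the exact aggregation used in our evaluation) is to combine the 10-frame window predictions spanning a game point by majority vote, as in the following sketch with illustrative label names:

```python
from collections import Counter

def game_point_label(window_labels):
    """Aggregate per-window affective labels into one game-point label
    by majority vote (ties resolved by first occurrence)."""
    return Counter(window_labels).most_common(1)[0][0]

# Illustrative labels only; the actual label set is defined in the paper.
print(game_point_label(["frustration", "frustration", "concentration",
                        "happiness", "frustration"]))  # -> "frustration"
```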

Bernhardt et al. [52] is one of the few studies directly comparable with ours, since it used motion data rather than single postures. The authors used arm movement features to recognize emotions from ‘knocking’ movements, reaching performance similar to our system’s (59% accuracy) when individual idiosyncrasies were not accounted for. As Fig. 4 shows, our players not only had their own idiosyncrasies but also employed different strategies when playing. Although our features were normalized to the individual player, a more careful subtraction of such differences could yield better performance. It would be interesting, for example, to consider how motivations may change a player’s playing strategies over different games. Furthermore, in this study the labelling was carried out at game-point level; the system would probably reach higher performance if the labelling were also done at continuous or tennis-shot level.
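For concreteness, per-player normalization of the kind mentioned above can be sketched as within-player z-scoring of the feature matrix; this is a generic sketch under that assumption, not the exact normalization used in our system.

```python
import numpy as np

def normalize_per_player(features, player_ids):
    """Z-score each feature within each player, reducing idiosyncratic
    differences in movement style before classification.

    features: (n_samples, n_features) array; player_ids: per-sample labels.
    """
    features = np.asarray(features, dtype=float)
    ids = np.asarray(player_ids)
    out = np.empty_like(features)
    for pid in np.unique(ids):
        mask = ids == pid
        mu = features[mask].mean(axis=0)
        sigma = features[mask].std(axis=0) + 1e-8  # avoid division by zero
        out[mask] = (features[mask] - mu) / sigma
    return out
```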

Further improvement could also be obtained through a better selection of features. In this paper we considered only dynamic features; combining configurational and dynamic features may yield better performance. As configurational features in particular, but also dynamic features, may be affected by the type of action performed (e.g., backhand vs. smash), it may be necessary to combine low-level features into high-level ones. There is in fact evidence that emotion recognition from body movement is based on global features, whereas action recognition is based on local features [75]. Finally, the features were computed over small windows, relying on the RNN to capture the dynamical changes over the game window. Adding features that describe the overall trends over larger windows may help to better capture the differences between emotional expressions.
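A sketch of the kind of larger-window trend feature suggested above follows; the specific statistics and the default frame rate are our illustrative assumptions, not features evaluated in this paper.

```python
import numpy as np

def trend_features(signal, frame_rate=60.0):
    """Summarize the overall trend of a low-level movement signal
    (e.g., hand speed per frame) over a larger window.

    Returns the linear slope, mean, and range: coarse dynamics that a
    small 10-frame window cannot capture. The frame rate is an assumed
    default.
    """
    signal = np.asarray(signal, dtype=float)
    t = np.arange(len(signal)) / frame_rate
    slope = np.polyfit(t, signal, 1)[0]  # linear trend over the window
    return np.array([slope, signal.mean(), signal.max() - signal.min()])
```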

Moreover, future development of our study, together with a deeper analysis of our results, may bring into focus the difference in the way males and females express emotions through their body movements; our results can only hint at such differences, as only one female took part in our study. This first step towards the study of gender differences in expressing affective bodily states through movement may have important consequences for the investigation of aesthetic experience and its underlying felt quality, both in a video-game scenario and in HCI in general. It would allow us to consider aesthetic emotion and experience as gender-specific. Indeed, if the emotional variety of which an aesthetic experience is made depends on different body dynamics between the two genders, it follows that the aesthetic experience, and hence the direct, immediate perception of aesthetic emotion, could be considered gender-specific. This is a novel idea with promising future development, as aesthetic perception and experience have traditionally been considered gender-neutral [73].

Beyond these technical improvements, the future development of our approach is to find a way to recognize the underlying and unifying aesthetic emotion characterizing an aesthetic experience. What is needed is to identify the relation between body movement and affective states that likely mediates the unifying aesthetic emotion, and hence the satisfying quality that qualifies an experience as aesthetic. In other words, the future aim is to find out which motion features can be assigned to aesthetic emotion and hence to a satisfying aesthetic experience. The duration of each emotional phase (joy, happiness, anger, frustration, etc.) constituting the whole experience may contribute to the perception of the sense of intrinsic completion of an experience, which is enjoyed as aesthetic and which constitutes its closure, as argued in this paper.


Appendix


This glossary provides the definitions used in this paper for terms related to aesthetic experience. The definitions are based on the pragmatist aesthetics inspired by the works of John Dewey [16], [17] and on the works on game aesthetics [43]-[45] that we draw on in this article.

Aesthetic emotion: An aesthetic emotion corresponds to the felt qualitative unity of an experience that is experienced for its own sake. This underlying unifying quality of an experience is immediately felt as such in perception and identified as aesthetic emotion.
Aesthetic enjoyment: Aesthetic enjoyment is enjoyment of the disinterested contemplation of the fullness of an object or an action.
Aesthetic experience: An aesthetic experience is a consummatory experience, which completes itself and carries with it its own individualizing quality. It is characterized by the awareness of a process brought to fulfillment through a medium of activity. It is the integration of all the elements of ordinary experience that gives the experience a larger feeling of wholeness in the interactive flow of organism-environment transactions.
Engagement: Bodily and emotional commitment to perform an action in a game.
Entertainment: The act of affording amusement.
Game aesthetics: Game aesthetics has three core meanings. It refers to a) the sensory, visual, haptic and embodied phenomena that the player encounters in the game; b) those aspects of digital games that are shared with art forms; and c) an expression of the game experienced as pleasure, emotion, sociability and form-giving.

References


  1. M. Graves Petersen, O.S. Iversen and P.G. Krogh, “Aesthetic Interaction – A Pragmatist’s Aesthetics of Interactive Systems”, Proceedings of the Conference on Designing Interactive Systems, 2004.

  2. J.H. Kim, D.V. Gunn, E. Schuh, B. Phillips, R.J. Pagulayan, and D. Wixon, “Tracking real-time user experience (TRUE): a comprehensive instrumentation solution for complex systems”, Proceedings of the 26th Annual SIGCHI Conference on Human Factors in Computing Systems, New York, NY, USA, ACM, 2008, pp. 443–452.

  3. J.L. Elman, “Finding Structure in Time”, Cognitive Science, vol. 14, 1990, pp.179-211.

  4. J.H. Kim, D.V. Gunn, E. Schuh, B. Phillips, R.J. Pagulayan, and D. Wixon, “Tracking real-time user experience (TRUE): a comprehensive instrumentation solution for complex systems”, Proceedings of the 26th Annual SIGCHI Conference on Human Factors in Computing Systems, New York, NY, USA, ACM, 2008, pp. 443–452.

  5. F. Mueller and N. Bianchi-Berthouze, “Evaluating Exertion Games: Experiences from Investigating Movement-Based Games”, Springer, Human-Computer Interaction Series, Part 4, 2010, pp. 187-207.

  6. N. Bianchi-Berthouze, “Understanding the role of body movement in player engagement”, Human-Computer Interaction, (in press). http://web4.cs.ucl.ac.uk/uclic/people/n.berthouze/BerthouzeHCI12.pdf

  7. K. Isbister, U. Schwekendiek, and J. Frye, “Wriggle: An Exploration of Emotional and Social Effects of Movement”, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2011, pp. 1885-1890.

  8. A. Melzer, I. Derks, J. Heydekorn, and G. Steffgen, “Click or Strike: Realistic versus Standard Game Controls in Violent Video Games and Their Effects on Aggression”, in H.S. Yang et al. (Eds.), Entertainment Computing, LNCS 6243, Springer: Berlin, 2010, pp. 171-182.

  9. J.H. Brockmyer, C.M. Fox, K.A. Curtiss, E. McBroom, K.M. Burkhart and J.N. Pidruzny, “The development of the Game Engagement Questionnaire: A measure of engagement in video game-playing”, J. of Experimental Social Psychology, vol. 45, 4, 2009, pp. 624-634.

  10. J. Hoysniemi, “International survey on the Dance Dance Revolution game”, Computers in Entertainment, vol. 4, no. 2, article 8, 2006.

  11. M. Slater, A. Steed, J. McCarthy and F. Maringelli, “The influence of body movement on subjective presence in virtual environments”, Human Factors, 40, 1998, pp. 469–477.

  12. M. Slater, M. Usoh, and A. Steed, “Taking steps: the influence of a walking metaphor on presence in virtual reality”, ACM Transactions on Computer-Human Interaction, 2, 1995, pp. 201–219.

  13. N. Lazzaro, “Why we play games: Four keys to more emotion without story”, Technical report, XEO Design Inc., 2004.

  14. M.G. Petersen, “Aesthetics of Interaction – A Pragmatist Perspective”, Proceedings Workshop on Aesthetic Approaches to Human-Computer Interaction, 2004, pp. 31- 32.

  15. B. Schuller, M.F. Valstar, F. Eyben, G. McKeown, R. Cowie, and M. Pantic, “AVEC 2011 – The First International Audio/Visual Emotion Challenge”, International Conference on Affective Computing and Intelligent Interaction, Part 2, LNCS 6975, Springer, 2011, pp. 415-424.

  16. T. Alexander, “John Dewey’s Theory of Art, Experience and Nature: The Horizons of Feeling”, New York: SUNY Press, 1987.

  17. J. Dewey, “Art as Experience”, Perigee Books, 2005 [1934].

  18. J. Chandler and N. Schwarz, “How extending your middle finger affects your perception of others: Learned movements influence concept accessibility”, Journal of Experimental Social Psychology, vol. 45, no. 1, 2009, pp. 123-128.

  19. J.D. Laird, “Self-attribution of emotion: The effects of expressive behavior on the quality of emotional experience”, Journal of Personality and Social Psychology, vol. 29, 4, 1974, pp. 475-486.

  20. J.H. Riskind, and C.C. Gotay, “Physical posture: Could it have regulatory or feedback effects on motivation and emotion?”, Motivation and Emotion, vol. 6, 3, 1982, pp. 273–298.

  21. S. Stepper, and F. Strack, “Proprioceptive determinants of emotional and nonemotional feelings”, Journal of Personality and Social Psychology, 64, 1993, pp. 211–220.

  22. G.L. Wells and R.E. Petty, “The effects of overt head movements on persuasion: Compatibility and incompatibility of responses”, Basic and Applied Social Psychology, 1, 1980, pp. 219–230.

  23. J.T. Cacioppo, J.R. Priester, and G.G. Berntson, “Rudimentary determination of attitudes: II. Arm flexion and extension have differential effects on attitudes”, Journal of Personality and Social Psychology, vol. 65, 1993, pp. 5–17.

  24. R. Neumann and F. Strack, “Approach and avoidance: The influence of proprioceptive and exteroceptive cues on encoding of affective information”, Journal of Personality and Social Psychology, vol. 79, 1, 2000, pp. 39-48.

  25. P.M. Niedenthal, L.W. Barsalou, P. Winkielman, S. Krauth-Gruber, and F. Ric, “Embodiment in Attitudes, Social Perception, and Emotion”, Personality and Social Psychology Review, vol. 9, 3, 2005, pp. 184–211.

  26. D.R. Carney, A.J.C. Cuddy, and A.J. Yap, “Power posing: Brief nonverbal displays affect neuroendocrine levels and risk tolerance”, Psychological Science, vol. 21, 10, 2010, pp. 1363-1368.

  27. J. Cole, and B. Montero, “Affective Proprioception”, Janus Head, 9, 2, 2007, pp. 299-317.

  28. J.Y. Park, Y.C. Lee, B.S. Jo and C.W. Lee, “Virtual playing ground interface using upper-body gesture recognition”, Workshop on Frontiers of Computer Vision, 2011, pp. 1-5.

  29. J.Y. Park and J.H. Yii, “Gesture Recognition Based Interactive Boxing Game”, International Conference on Intelligent Computing, 2005.

  30. A. Bleiweiss, D. Eshar, G. Kutliroff, A. Lerner, Y. Oshrat, and Y. Yanai, “Enhanced interactive gaming by blending full-body tracking and gesture animation”, ACM SIGGRAPH ASIA, 2010.

  31. M. Roccetti, G. Marfia, and A. Semeraro, “A Fast and Robust Gesture Recognition System for Exhibit Gaming Scenarios”, International Workshop on Distributed Simulation and Online Gaming, 2011.

  32. M. Roccetti, G. Marfia, and A. Semeraro, “Playing into the Wild: A Gesture-based Interface for Gaming in Public Spaces”, Journal of Visual Communication and Image Representation, Elsevier, vol. 23, 3, 2012, pp. 426-440.

  33. K. Isbister, “Enabling social play: A framework for design and evaluation”, in Evaluating User Experiences in Games: Concepts and Methods, R. Bernhaupt (Ed.), Springer, London, 2010.

  34. E. Brown and P. Cairns, “A grounded investigation of game immersion”, Proceedings of the ACM Conference on Human Factors in Computing Systems, 2004, pp. 1297-1300.

  35. L. Ermi and F. Mäyrä, “Fundamental components of the gameplay experience: analysing immersion”, Proceedings of the DiGRA Conference Changing Views: Worlds in Play, 2005, pp. 15–27.

  36. G.N. Yannakakis, and J. Hallam, “Entertainment Modeling through Physiology in Physical Play,” International Journal of Human-Computer Studies, 66, 10, 2008, pp. 741-755.

  37. M. Pasch, N. Bianchi-Berthouze, B. van Dijk, and A. Nijholt, “Movement-based Sports Video Games: Investigating Motivation and Gaming Experience”, Entertainment Computing, vol. 9, 2, 2009, pp. 169-180.

  38. J. Nijhar, N. Bianchi-Berthouze, G. Boguslawski, “Does Movement Recognition Precision affect the Player Experience in Exertion Games?”, International Conference on Intelligent Technologies for interactive entertainment (INTETAIN’11), LNICST 78, 2012, pp. 73-82.

  39. D. Kirsh and P. Maglio, “On distinguishing epistemic from pragmatic action”, Cognitive Science, vol. 18, 1994, pp. 513-549.

  40. P.P. Maglio, M.J. Wenger, and A.M. Copeland, “Evidence for the role of self-priming in epistemic action: Expertise and the effective use of memory”, Acta Psychologica, 127, 1, 2008, pp. 72-88.

  41. N. Shaker, S. Asteriadis, G.N. Yannakakis, and K. Karpouzis, “A Game-based Corpus for Analysing the Interplay between Game Context and Player Experience”, Affective Computing and Intelligent Interaction Conference; Emotion in Games Workshop, Springer, 2011.

  42. M. Csikszentmihalyi, “Flow: The Psychology of Optimal Experience”, Harper Perennial, 1991.

  43. J. McGonigal, “This is not a Game: Immersive Aesthetics and Collective Play”, DAC, Melbourne, 2003.

  44. S. Niedenthal, “What we talk about when we talk about game aesthetics”, Int. Conf. on Digital Game Research Association, 2009.

  45. H. Sommerseth, “Exploring Game Aesthetics”, Int. Conf. on Digital Game Research Association, 2009.

  46. H.G. Wallbott, “Bodily expression of emotion”, European Journal of Social Psychology, 28, 1998, pp. 879-896.

  47. M. Coulson, “Attributing emotion to static body postures: Recognition accuracy, confusions, and viewpoint dependence”, Journal of Nonverbal Behavior, 28, 2004, pp. 117-139.

  48. A. Kleinsmith, R. De Silva, and N. Bianchi-Berthouze, “Cross-cultural differences in recognizing affect from body posture”, Interacting with Computers, 18, 6, 2006, pp. 1371-1389.

  49. R. De Silva, and N. Bianchi-Berthouze, “Modeling human affective postures: An information theoretic characterization of posture features”, Journal of Computational Animation and Virtual Worlds, vol. 15, 3-4, 2004, pp. 269-276.

  50. B. de Gelder, “Why bodies? Twelve reasons for including bodily expressions in affective neuroscience”, Philosophical Transactions of the Royal Society, vol. 364, 3, 2009, pp. 3475-3484.

  51. C. Roether, L. Omlor, A. Christensen, and M.A. Giese, “Critical features for the perception of emotion from gait”, Journal of Vision, vol. 8, 6, 15, 2009, pp. 1-32.

  52. D. Bernhardt, and P. Robinson, “Detecting affect from non-stylised body motions”, Affective Computing and Intelligent Interaction, 2007.

  53. G. Castellano, S. Villalba, and A. Camurri, “Recognising human emotions from body movement and gesture dynamics”, in A. Paiva, R. Prada, and R.W. Picard (Eds.), Affective Computing and Intelligent Interaction, LNCS 4738, 2007, pp. 71-82.

  54. A. Kleinsmith, T. Fushimi and N. Bianchi-Berthouze, “An incremental and interactive affective posture recognition system”, International Workshop on Adapting the Interaction Style to Affective Factors, 2005.

  55. A. Kleinsmith and N. Bianchi-Berthouze, “Recognizing Affective Dimensions from Body Posture”, Affective Computing and Intelligent Interaction, Part I, Springer, LNCS 4738, 2007, pp. 48-58.

  56. A. Camurri, B.M. Mazzarino, R. Timmers, and G. Volpe, “Multimodal analysis of expressive gesture in music and dance performances”, Gesture-based Communication in HCI, 2004, pp. 20-39.

  57. H. Gunes and M. Piccardi, “Bi-modal emotion recognition from expressive face and body gestures”, Journal of Network and Computer Applications, 30, 2007, pp. 1334-1345.

  58. A. Camurri, B. Mazzarino, and G. Volpe, “Analysis of expressive gesture: The EyesWeb expressive gesture processing library”, Gesture-based Communication in HCI, LNAI 2915, 2004, pp. 460-467.

  59. F. Pollick, H. Paterson, A. Bruderlin, and A. Sanford, “Perceiving affect from arm movement”, Cognition, vol. 82, 2001, pp. 51–61.

  60. A. Kleinsmith and N. Bianchi-Berthouze, “Affective Body Expression Perception and Recognition: A Survey”, IEEE Transactions on Affective Computing, (in press).

  61. A. Kleinsmith, N. Bianchi-Berthouze, and A. Steed, “Automatic Recognition of Non-Acted Affective Postures,” IEEE Trans. on Systems, Man, and Cybernetics Part B, vol. 41, 4, 2011, pp. 1027-1038.

  62. A. Kleinsmith and N. Bianchi-Berthouze, “Form as a Cue in the Automatic Recognition of Non-Acted Affective Body Expressions”, LNCS 6975, Springer, 2011, pp. 155-164.

  63. M. Kitagawa and B. Windsor, “MoCap for Artists: Workflow and Techniques for Motion Capture”. Focal Press, 2008, pp. 190-194.

  64. C. Storm and T. Storm, “A taxonomic study of the vocabulary of emotions”, Journal of Personality and Social Psychology, vol. 53, no. 4, 1987, pp. 805–816.

  65. R. von Laban, “Modern Educational Dance”, MacDonald & Evans, 1963.

  66. N. Savva, N. Bianchi-Berthouze, “Automatic recognition of affective body movement in a video game scenario”, A. Camurri, C. Costa, and G. Volpe (Eds.), International Conference on Intelligent Technologies for interactive entertainment (INTETAIN’11), LNICST 78, 2012, pp. 149-158.

  67. J.L. Elman, “Finding Structure in Time”, Cognitive Science, 14, 1990, pp.179-211.

  68. S. Haykin, “Neural Networks: A Comprehensive Foundation”, 2nd Edition, Prentice-Hall, 1999, pp. 754-777.

  69. M. Bodén, “A guide to recurrent neural networks and backpropagation”, in The DALLAS project. Report from the NUTEK-supported project AIS-8: Application of Data Analysis with Learning Systems, 1999-2001. Holst, A. (ed.), SICS Technical Report T2002:03, SICS, 2001.

  70. H. Meng and N. Bianchi-Berthouze, “Naturalistic Affective Expression Classification by a Multi-stage Approach based on Hidden Markov Models”, Affective Computing and Intelligent Interaction, Part II, LNCS 6975, Springer, 2011, pp. 378-387.

  71. G. Ramirez, T. Baltrusaitis, and L.P. Morency, “Modeling Latent Discriminative Dynamic of Multi-Dimensional Affective Signals”, Affective Computing and Intelligent Interaction, Part II, LNCS 6975, Springer, 2011, pp. 396-406.

  72. J.H. Riskind, “They stoop to conquer: Guiding and self-regulatory functions of physical posture after success and failure”, Journal of Personality and Social Psychology, vol. 47(3), 1984, pp. 479-493.

  73. C. Detels, “Whither Feminist Aesthetics?”, Action, Criticism, and Theory for Music Education, 5, 1, 2006. http://act.maydaygroup.org/articles/Detels5_1.pdf

  74. P. Atkinson, W.H. Dittrich, A.J. Gemmell, and A.W. Young, “Evidence for distinct contributions of form and motion information to the recognition of emotions from body gestures”, Cognition, vol. 104, 2007, pp. 59-72.

  75. A.P. Atkinson, “Impaired recognition of emotions from body movements is associated with elevated motion coherence thresholds in autism spectrum disorders”, Neuropsychologia, 47, 13, 2009, pp. 3023-3029.

Manuscript received November 4, 2011.

N. Savva was with the Computer Science Department, University College London, WC1E 6BT, United Kingdom (e-mail: nikolaos.savva.09@ucl.ac.uk).

A. Scarinzi is with Georg-August-Universität Göttingen, Germany (e-mail: alfonsinascarinzi@googlemail.com).

N. Bianchi-Berthouze is with the UCL Interaction Centre, University College London, WC1E 6BT, United Kingdom (corresponding author: +44 20 76790690; e-mail: n.berthouze@ucl.ac.uk).



This work was supported by EPSRC grant EP/G043507/1: Pain rehabilitation: E/Motion-based automated coaching.


1 Gimbal lock occurs when two of the three axes of a 3D Cartesian system align and one degree of freedom is lost. This situation can easily be detected, as the difference in rotation values between two consecutive frames for a certain joint axis is too high to be feasible. It represents instead a rotation of the reference system for that axis by a multiple of π radians, performed by the motion-capture software when recovering from the gimbal lock problem.
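A minimal sketch of the detection heuristic described in this footnote follows; the plausibility threshold is an assumed value, to be tuned to the capture frame rate.

```python
import numpy as np

def flag_gimbal_lock_recoveries(angles, threshold=np.pi / 2):
    """Flag frames where a joint's rotation jumps implausibly between
    consecutive frames, indicating a gimbal-lock recovery (a reference-
    system rotation by a multiple of pi radians) rather than real motion.

    angles: 1-D array of rotation values (radians) for one joint axis.
    threshold: assumed bound on a feasible frame-to-frame change.
    """
    diffs = np.abs(np.diff(np.asarray(angles, dtype=float)))
    return np.where(diffs > threshold)[0] + 1  # indices of suspect frames
```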

