TY - CHAP
KW - automated analysis of non-verbal behavior
KW - expressive gesture analysis
KW - computational models of joint music action
N2 - Preliminary results from a study of expressivity and of non-verbal social signals in small groups of users are presented. Music is selected as an experimental test-bed since it is a clear example of interactive and social activity, where affective non-verbal communication plays a fundamental role. In this experiment the orchestra is adopted as a social group characterized by a clear leader (the conductor) of two groups of musicians (the first and second violin sections). It is shown how a reduced set of simple movement features - head movements - can be sufficient to explain the difference in the behavior of the first violin section between two performance conditions, characterized by different eye contact between the two violin sections and between the first section and the conductor.
N1 - Third International Conference, ArtsIT 2013, Milan, Italy, March 21-23, 2013, Revised Selected Papers
UR - http://dx.doi.org/10.1007/978-3-642-37982-6_16
TI - Towards Automated Analysis of Joint Music Performance in the Orchestra
ID - eprints1790
AV - none
A1 - Gnecco, Giorgio
A1 - Badino, Leonardo
A1 - Camurri, Antonio
A1 - D'Ausilio, Alessandro
A1 - Fadiga, Luciano
A1 - Glowinski, Donald
A1 - Sanguineti, Marcello
A1 - Varni, Giovanna
A1 - Volpe, Gualtiero
T2 - Arts and Technology
SP - 120
T3 - Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering
Y1 - 2013///
PB - Springer
EP - 127
ER -