J. Allwood and E. Ahlsén, Learning how to manage communication, with special reference to the acquisition of linguistic feedback, Journal of Pragmatics, vol.31, issue.10, p.1353, 1999.

J. C. Abric, Psychologie de la communication: théories et méthodes. A. Colin, vol.1, 1996.

T. Baltrušaitis, P. Robinson, and L. P. Morency, OpenFace: an open source facial behavior analysis toolkit, 2016 IEEE Winter Conference on Applications of Computer Vision (WACV), pp.1-10, 2016.

M. R. Barrick and M. K. Mount, The big five personality dimensions and job performance: a meta-analysis, Personnel Psychology, vol.44, issue.1, pp.1-26, 1991.

J. B. Bavelas, L. Coates, and T. Johnson, Listener responses as a collaborative process: The role of gaze, Journal of Communication, vol.52, issue.3, pp.566-580, 2002.

J. Bavelas, J. Gerwing, and S. Healing, Doing mutual understanding. Calibrating with micro-sequences in face-to-face dialogue, Journal of Pragmatics, vol.121, pp.91-112, 2017.

J. B. Bavelas and J. Gerwing, The listener as addressee in face-to-face dialogue, International Journal of Listening, vol.25, issue.3, pp.178-198, 2011.

J. Bavelas, J. Gerwing, C. Sutton, and D. Prevost, Gesturing on the telephone: Independent effects of dialogue and visibility, Journal of Memory and Language, vol.58, issue.2, pp.495-520, 2008.

J. B. Bavelas and N. Chovil, Visible acts of meaning: An integrated message model of language in face-to-face dialogue, Journal of Language and Social Psychology, vol.19, issue.2, pp.163-194, 2000.

J. B. Bavelas, N. Chovil, L. Coates, and L. Roe, Gestures specialized for dialogue, Personality and Social Psychology Bulletin, vol.21, issue.4, pp.394-405, 1995.
DOI : 10.1177/0146167295214010

R. Bertrand, P. Blache, R. Espesser, G. Ferré, C. Meunier, and S. Rauzy, Le CID - Corpus of Interactional Data - Annotation et exploitation multimodale de parole conversationnelle, vol.49, 2008.
URL : https://hal.archives-ouvertes.fr/hal-00349893

M. Boholm and J. Allwood, Repeated head movements, their function and relation to speech, Proceedings of LREC workshop on multimodal corpora advances in capturing coding and analysing multimodality, pp.6-10, 2010.

J. Carletta, A. Isard, J. Kowtko, and G. Doherty-Sneddon, HCRC dialogue structure coding manual, Human Communication Research Centre, 1996.

H. H. Clark and R. J. Gerrig, Quotations as demonstrations, Language, vol.66, issue.4, pp.764-805, 1990.
DOI : 10.2307/414729

L. Cerrato and M. Skhiri, A method for the analysis and measurement of communicative head movements in human dialogues, AVSP 2003-International Conference on Audio-Visual Speech Processing, 2003.

N. Chovil, Communicative Functions of Facial Displays in Conversation, 1989.

J. P. de Ruiter, Postcards from the mind: The relationship between speech, imagistic gesture, and thought, Gesture, vol.7, issue.1, pp.21-38, 2007.

M. de Fornel, « Alors, tu me vois ? » Objet technique et cadre interactionnel dans la pratique visiophonique, Centre de recherche sur la culture technique, 1992.

C. Darwin, The expression of the emotions in man and animals.

P. Ekman and W. V. Friesen, Measuring facial movement, Environmental Psychology and Nonverbal Behavior, 1976.

K. Eberhard and H. Nicholson, Coordination of understanding in face-to-face narrative dialogue, Proceedings of the Annual Meeting of the Cognitive Science Society, vol.32, 2010.

B. Fasel and J. Luettin, Automatic facial expression analysis: a survey, Pattern Recognition, vol.36, issue.1, pp.259-275, 2003.

R. E. Fan, K. W. Chang, C. J. Hsieh, X. R. Wang, and C. J. Lin, LIBLINEAR: A library for large linear classification, Journal of Machine Learning Research, vol.9, pp.1871-1874, 2008.

G. Rizzolatti, L. Fadiga, V. Gallese, and L. Fogassi, Premotor cortex and the recognition of motor actions, Cognitive Brain Research, vol.3, issue.2, pp.131-141, 1996.

R. Gardner, Between Speaking and Listening: The Vocalisation of Understandings, Applied Linguistics, vol.19, issue.2, pp.204-224, 1998.

B. Granström and D. House, Audiovisual representation of prosody in expressive speech communication, Speech Communication, vol.46, issue.3-4, pp.473-484, 2005.

C. Goodwin, Between and within: Alternative sequential treatments of continuers and assessments, Human Studies, vol.9, issue.2-3, pp.205-217, 1986.

H. P. Graf, E. Cosatto, V. Strom, and F. J. Huang, Visual prosody: Facial movements accompanying speech, Automatic Face and Gesture Recognition, pp.396-401, 2002.

A. Ito, X. Wang, M. Suzuki, and S. Makino, Smile and laughter recognition using speech processing and face recognition from conversation video, 2005 International Conference on Cyberworlds, p.8, 2005.

P. Keating, M. Baroni, S. Mattys, R. Scarborough, A. Alwan et al., Optical phonetics and visual perception of lexical and phrasal stress in English, Proceedings of the 15th International Congress of Phonetic Sciences (ICPhS), pp.2071-2074, 2003.

A. Kießling, R. Kompe, H. Niemann, E. Nöth, and A. Batliner, "Sorry", "I'm still listening": dialog guiding signals in information retrieval dialogs, ESCA Workshop on Prosody, 1993.

M. McCarthy, Talking back: "Small" interactional response tokens in everyday conversation, Research on Language and Social Interaction, vol.36, issue.1, pp.33-63, 2003.

D. McNeill, Hand and mind: What gestures reveal about thought, 1992.

M. Krych-Appelbaum, J. B. Law, D. Jones, A. Barnacz, A. Johnson et al., "I think I know what you mean": The role of theory of mind in collaborative communication, Interaction Studies, vol.8, issue.2, pp.267-280, 2007.

K. G. Munhall, J. A. Jones, D. E. Callan, T. Kuratate, and E. Vatikiotis-Bateson, Visual prosody and speech intelligibility: Head movement improves auditory speech perception, Psychological Science, vol.15, issue.2, pp.133-137, 2004.

I. Mushin, L. Stirling, J. Fletcher, and R. Wales, Identifying prosodic indicators of dialogue structure: some methodological and theoretical considerations, Proceedings of the 1st SIGdial Workshop on Discourse and Dialogue, vol.10, p.36, 2000.

C. Navarretta and M. Lis, 4th Nordic Symposium on Multimodal Communication, NEALT Proceedings, Northern European Association for Language and Technology, Sweden, pp.55-62, 2013.

N. Obin, Cries and whispers: classification of vocal effort in expressive speech, 2012.
URL : https://hal.archives-ouvertes.fr/hal-00706261

A. Özyürek, Language in our hands: The role of the body in language, cognition and communication, pp.637-663, 1999.

E. Shriberg, A. Stolcke, D. Jurafsky, N. Coccaro, M. Meteer et al., Can prosody aid the automatic classification of dialog acts in conversational speech?, Language and Speech, vol.41, issue.3-4, pp.443-492, 1998.

H. Tanaka and N. Campbell, Acoustic features of four types of laughter in natural conversational speech, Proc. 17th International Congress of Phonetic Sciences (ICPhS), pp.1958-1961, 2011.

C. Shannon, A Mathematical Theory of Communication, The Bell System Technical Journal, vol.27, pp.623-656, 1948.

N. Ward, Non-lexical conversational sounds in American English, Pragmatics & Cognition, vol.14, issue.1, pp.129-182, 2006.
DOI : 10.1075/pc.14.1.08war

URL : http://www.cs.utep.edu/nigel/papers/discoursep.pdf

E. Weigand, Misunderstanding: The standard case, Journal of Pragmatics, vol.31, issue.6, pp.763-785, 1999.
