Program

Available for download are:
The program
The book of abstracts
The conference proceedings

The individual papers, published via the Paderborn University library, are also available by following this link.

08:30 Registration & Coffee
09:00 Opening 
09:30 

Keynote speaker: Şeyda Özçalışkan

Pointing to words: How gesture provides a helping hand to language development across different learners

(Room: Lecture Hall L1)

10:45 Short break
  Paper Session 1: Timing
11:00 Olcay Turk: Does gestural hierarchy align in time with prosodic hierarchy? Another modality to consider: Information structure
11:30 Angela Grimminger: The timing of pointing-speech combinations in typically developing and language-delayed toddlers
12:00  Lunch
13:30  Tutorial Sessions
  Tutorial 2: Katharina Reinecke: The Neuropsychological Gesture (NEUROGES) Analysis System – Behavioral Analysis in Linguistic Research
  Tutorial 3: James P. Trujillo, Wim Pouw: Using video-based motion tracking to quantify speech-gesture synchrony
  Tutorial 4: Natasha Janzen Ulbricht: Teaching Tool Codified Gestures - Can More Pupils Learn More? Codified gestures, theater and their potential for improving abstract concept learning
15:30  Coffee break
  Paper Session 2: Genre
16:00 Nora Schönfelder, Vivien Heller: Embodied reciprocity in conversational argumentation: Soliciting and giving reasons by Palm Up Open Hand gestures
16:30 Alice Cravotta, Pilar Prieto, M. Grazia Busà: Encouraging Gesture Use in a Narration Task Increases Speakers’ Gesture Rate, Gesture Salience and the Production of Representational Gestures
17:00 Ulrich Mertens, Friederike Kern, Stefan Kopp, Olga Abramov, Anne Németh, Katharina J. Rohlfing: Children’s viewpoint: Iconic co-speech gestures and their relation to linguistic structure across two communicative genres
17:30 - 18:30 Poster Session
  1 An exploration of verbal and non-verbal projectability and entrainment in choral productions in English interaction – Marina Noelia Cantarutti
  2 Gestural portray of the public service interpreter: strategies of coping with source messages’ nonverbal cues – Monika Chwalczuk
  3 Adaptation of multimodal communication strategies to noise and failure: evidence from a dyadic interaction task – James Trujillo, Linda Drijvers, Judith Holler and Asli Özyürek
  4 The Visual Communication Heuristic: The Effect of Context on Gesture Production – Jacob Barker and Sotaro Kita
  5 Aging, Working Memory, and Mental Imagery: Understanding gestural communication in younger and older adults – Burcu Arslan, Buse Nur Caba and Tilbe Göksun
  6 Degrees of explicitness in children's iconic gestures – Kristin Weiser-Zurmühlen, Friederike Kern, Ulrich Mertens, Olga Abramov, Anne Németh, Stefan Kopp and Katharina J. Rohlfing
  7 Rhythmic movements with objects at 9 months are related to proximal deictic gestures at 12 months – Eva Murillo, Ignacio Montero, Marta Casla
  8 Multimodal marking of information structure in 4-year-old German children – Sofia Koutalidis, Friederike Kern, Katharina Rohlfing, Stefan Kopp, Olga Abramov, Ulrich Mertens and Anne Németh
  9 Constructional potential of flat-hand–palm-lateral–away-body gestures: a crosslinguistic corpus-based study – Jakub Jehlička and Eva Lehečková
  10 Investigating the coordination of patients’ and therapists’ conceptual phases in hand movements that accompany speech during psychotherapy sessions – Katharina Reinecke, Niklas Neumann and Hedda Lausberg
  11 The relation between individual differences in speech-gesture behaviour of 4-year-olds across three different experimental tasks – Olga Abramov, Stefan Kopp, Katharina Rohlfing, Friederike Kern, Ulrich Mertens and Anne Németh
  12 An Open Source Dataset of Human Gestures Through Human-Robot Interaction – Jan de Wit, Mirjam de Haas, Emiel Krahmer and Paul Vogt
09:00

Keynote speaker: Petra Wagner

Prosody: Cross-modal Interactions of Form and Function

(Room: Lecture Hall L1)

  Paper Session 3: Methods
10:15 Wim Pouw, James A. Dixon: Quantifying Gesture-Speech Synchrony
10:45 Christelle Dodane, Dominique Boutet, Ivana Didirkova, Fabrice Hirsch, Slim Ouni, Aliyah Morgenstern: An integrative platform to capture the orchestration of gesture and speech
11:15 Coffee break
11:45 Data Sessions
13:15 Lunch
14:30

Keynote speaker: Alexis Heloir

Understanding human behavior using virtual humans: lessons learned and upcoming challenges

(Room: Lecture Hall L1)

  Paper Session 4: Management of change
15:45 Mary Amoyal, Béatrice Priego-Valverde: Smiling: a resource for negotiating topic transitions in French conversation
16:15 Margaret Zellers, Jan Gorisch, David House, Benno Peters: Hand gestures and pitch contours and their distribution at possible speaker change locations: a first investigation
18:00 Departure to Conference Dinner + Walk
19:00 Conference Dinner: Wald & Wiesencafé
09:00

Keynote speaker: Pilar Prieto

Enacting prosody in the classroom: How the prosody in our hands helps us learn pronunciation in a second language

(Room: Lecture Hall L1)

  Paper Session 5: Coordination
10:15 Katerina Fibigerova, Michèle Guidetti: Gesture-Speech Coordination in Expression of Motion: How Far to Zoom In to Observe Semantic Synchrony?
10:45 Marieke Hoetjes, Lieke van Maastricht, Lisette van der Heijden: Gestural training benefits L2 phoneme acquisition: Findings from a production and perception perspective
11:15 Wim Pouw, Alexandra Paxton, Steven J. Harrison, James A. Dixon: Acoustic Specification of Upper Limb Movement in Voicing
11:45 Loulou Kosmala, Maria Candea, Aliyah Morgenstern: Synchronization of Speech and Gesture: A Multimodal Approach to (Dis)fluency
12:15 Closing Remarks
12:30 Lunch
14:00 End of GeSpIn 2019

Tutorials

Tutorial 2: Katharina Reinecke: The Neuropsychological Gesture (NEUROGES) Analysis System – Behavioral Analysis in Linguistic Research

Abstract: NEUROGES (Lausberg, 2013; Lausberg, 2018) is an objective and reliable system for the analysis of speech-accompanying hand movements and gestures. A review of 18 empirical studies using NEUROGES in combination with ELAN demonstrates good reliability of the system (Lausberg & Sloetjes, 2015).
Since neuropsychological research provides evidence that some gesture types are generated in the right hemisphere independently of left-hemispheric speech production (Feyereisen, 1987; Kita & Lausberg, 2008; Hogrefe et al., 2010), and the existence of gesture-speech mismatches has been demonstrated (McNeill, 1992; Goldin-Meadow et al., 1993), NEUROGES offers a methodological approach for analyzing gestures as a means of expression in their own right. The ongoing stream of hand movement behavior is segmented and classified according to movement criteria into increasingly fine-grained units, resulting in a distinct analysis of gestures based on their visual appearance. The analysis algorithm is presented at http://neuroges.neuroges-bast.info/neuroges-analysis-system and illustrated with video samples.

(Max. 12 participants; open to any researcher or student involved in empirical gesture research. Participants should bring their own laptop and preferably have ELAN installed.)

!! This tutorial session is full!
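
For readers unfamiliar with ELAN-based annotation workflows such as the one described in the abstract above, the following sketch shows one possible way to read unit annotations from an ELAN (.eaf) file into Python for further analysis. It is not part of the tutorial materials; the file name, the tier name "NEUROGES_units", and the use of the pympi library are illustrative assumptions.

# Minimal sketch (not part of the tutorial): read hand-movement unit
# annotations from an ELAN (.eaf) file and summarize their durations.
# The file name and tier name are hypothetical placeholders.
import pympi

eaf = pympi.Elan.Eaf("session01.eaf")  # hypothetical annotation file
# Returns a list of (start_ms, end_ms, label) tuples for the given tier.
units = eaf.get_annotation_data_for_tier("NEUROGES_units")

for start_ms, end_ms, label in units:
    duration_s = (end_ms - start_ms) / 1000.0
    print(f"{label}: {start_ms}-{end_ms} ms ({duration_s:.2f} s)")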

Tutorial 3: James P. Trujillo, Wim Pouw: Using video-based motion tracking to quantify speech-gesture synchrony

Abstract: This tutorial is based on a recent paper (Pouw, Trujillo, & Dixon, in press) and will address objective and time-efficient methods for quantifying temporal properties of gesture kinematics and acoustic markers of speech, focusing on common challenges, and possible solutions, that come with the complexities of studying multimodal language. To provide a comprehensive training experience, our interactive session will include collecting audio-visual recordings of multimodal (dyadic) interactions, applying machine-learning-based motion tracking, processing and analyzing the data, and creating useful data visualizations. At each step, we will engage participants in discussions of best practices and encourage active participation. Once collected, data will be made available via file sharing so that participants can actively follow along with all of the data-processing steps. The goal of the tutorial is therefore to provide basic training in these methods and to stimulate an active discussion about how we as multimodal language researchers can converge on best practices. We hope to further promote the use of new quantitative methods for capturing speech and gesture dynamics.

(Max. 15 participants; participants should bring their own laptop. Software requirements will be sent by email in advance.)
 
Reference:
Pouw, W., Trujillo, J. P., & Dixon, J. A. (in press). The Quantification of Gesture-speech Synchrony: A Tutorial and Validation of Multi-modal Data Acquisition Using Device-based and Video-based Motion Tracking. Behavior Research Methods. doi: 10.31234/osf.io/jm3hk4
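
As a rough illustration of the kind of quantification this tutorial addresses, the sketch below cross-correlates a hand-speed time series with a speech amplitude envelope and reports the lag at which the two signals are most strongly coupled. This is a generic sketch rather than the tutorial's or the paper's own pipeline; the shared sampling rate, the lag window, and the synthetic test signals are assumptions made for illustration.

# Minimal sketch: estimate gesture-speech synchrony as the lag of maximal
# cross-correlation between a hand-speed time series (from motion tracking)
# and a smoothed speech amplitude envelope. Both signals are assumed to be
# resampled to a shared rate (here 100 Hz); this is not the authors' pipeline.
import numpy as np
from scipy.signal import correlate

FS = 100  # shared sampling rate in Hz (assumption)

def synchrony_lag(hand_speed, envelope, max_lag_s=0.5):
    """Return (lag_s, correlation) at the peak of the normalized
    cross-correlation within +/- max_lag_s. With this convention a
    negative lag means the hand-speed peak precedes the envelope peak."""
    x = (hand_speed - hand_speed.mean()) / hand_speed.std()
    y = (envelope - envelope.mean()) / envelope.std()
    corr = correlate(x, y, mode="full") / len(x)
    lags = np.arange(-len(y) + 1, len(x)) / FS
    window = np.abs(lags) <= max_lag_s
    best = np.argmax(corr[window])
    return lags[window][best], corr[window][best]

# Synthetic example: the hand-speed peak leads the envelope peak by 200 ms,
# so the reported lag is about -0.2 s.
t = np.arange(0, 5, 1 / FS)
envelope = np.exp(-((t - 2.5) ** 2) / 0.05)
hand_speed = np.exp(-((t - 2.3) ** 2) / 0.05)
print(synchrony_lag(hand_speed, envelope))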

Tutorial 4: Natasha Janzen Ulbricht: Teaching Tool Codified Gestures - Can More Pupils Learn More? Codified gestures, theater and their potential for improving abstract concept learning

Abstract: There is neurocognitive support for gestures being closely related to spoken language (Willems & Hagoort, 2007) and evidence that gestures support language learning, comprehension and memory (Macedonia & Klimesch, 2014), but how best to use them in diverse classrooms is still up for debate. Teachers everywhere are challenged by the need to include children with very different abilities. Codified classroom gestures that support foreign-language word learning may be especially helpful here.
This practical tutorial explores the use of gestures at the level of morphology in language teaching. It begins by reporting on a theater-based experiment on spatial-preposition learning with refugee and sixth-grade pupils in Germany and Poland. Following this input, we will use gestures as the primary teaching tool to investigate a traditional story adapted for beginning learners of English.

(Max. 20 participants; open to gesture researchers and linguists, as well as to teachers and students of education interested in multimodal foreign language learning and teaching.)

References:
Macedonia, M., & Klimesch, W. (2014). Long-Term Effects of Gestures on Memory for Foreign Language Words Trained in the Classroom. Mind, Brain, and Education, 8, 74–88.
Willems, R. M., & Hagoort, P. (2007). Neural evidence for the interplay between language, gesture, and action: A review. Brain and Language, 101, 278–289. https://doi.org/10.1016/j.bandl.2007.03.004