We are very happy to announce:
Roundtable on Multimodal Speech Data
Data – Annotation – Strategies of Analysis
Thursday, 25 January 2024
(IG-NG 2.701)
Morning Session:
9:15 – 13:00: Open discussion on Multimodal Speech Data
- Introduction of each group and data set (3 min. per group)
- Practical part: methods, annotation, analysis, tips and tricks
- Discussion: Theory-driven vs. data-driven approaches to multimodal data
Afternoon Session:
Time (CET) | Speaker | Title
14:15-14:35 | Petra Wagner & Olcay Türk | Eliciting and Measuring Understanding in Interaction – (Some) Lessons Learned
14:35-14:55 | Margaret Zellers | Placement and temporal alignment of complex gesture strokes in Luganda conversation
14:55-15:15 | Sophie Repp & Cornelia Loos | Gesture in polar responses
15:15-15:35 | Susanne Fuchs & Aleksandra Ćwiek | The Coordination of Dynamic Multimodal Signals in Novel Communication
15:35-16:05 | Coffee break |
16:05-16:25 | Stefan Baumann | Head gestures and pitch accents as cues to information status in French
16:25-16:45 | MultIS | On eliciting multimodal data – the challenge between controlling the context and eliciting spontaneous speech and gesture
16:45-17:15 | General Discussion |