My ongoing research projects explore the connections among hands, language, and the mind, including the capacity to develop linguistic structure in gesture systems (home signs), the role of the dyad in language learning, and the expression of meaning across verbal and non-verbal modalities.
My projects include studying deception to determine what the hands reveal when we attempt to lie, how ideas pass in gesture between conversational partners, and gesture's role in language acquisition. I work with a number of populations, including adult participants, hearing children, deaf children who have no access to language models, deaf children who can now hear thanks to cochlear implants, and one very special gentleman, I.W., who cannot feel his body yet gestures when he speaks. All of these projects aim to better understand the connection between gesture, language, and the mind.
An examination of the relationship between speech and gesture exploits the fact that gestures are unwitting and sensitive manifestations of speaker-internal thought processes. Because the two modalities are synchronized, gesture may reveal more about thinking than speech alone expresses. Previous models of language production that include gesture have focused on the expression of a single representation or idea. This project analyzes gesture and speech when multiple construals of an event are expressed simultaneously. Asynchrony between channels is investigated with respect to both the meanings expressed in each modality and the temporal alignment of speech and gesture. To induce multiple representations of a scene, I ask participants to view a cartoon involving a cat and a bird. Before retelling the cartoon to a naive listener, I instruct the participants to deceive their interlocutors by misreporting portions of the cartoon's content. I then consider which modality conveys viewed versus instructed information, how often the channels synchronize, and the implications of semantic asynchrony for temporal synchrony. In the second part of the dissertation, I probe gesture-speech asynchronies in data where multiple construals of an event occur naturally, removed from deception. Through these experiments, I argue that gesture and speech remain a unified system even when multiple representations of an event are active in a speaker's mind. While no current model of language production includes gesture-speech mismatches, at their heart these expressions may not be so different from typical gesture-speech pairings.
Second Language Acquisition (Spring 2009)
Gesture, Cognition and Communication (Fall 2008)
Psychology of Language (Spring 2008)
Language Acquisition (Fall 2007)
Second Language Acquisition (Spring 2004, Winter 2005)
PhD, Linguistics & Psychology, 2007, University of Chicago
Current Collaborative Projects:
Reconceptualizing Medical Ontologies: Merging Cognitive Linguistics and Biomedical Informatics
Interactive Deception and its Detection through Multimodal Analysis of Interviewer-Interviewee Dynamics
Department of Defense, Counterintelligence Field Activity Credibility Assessment Research Initiative
A Parent-Directed Intervention to Improve Outcomes in Implanted Low SES Children
Papers in Progress:
Franklin, A., McNeill, D., and Goldin-Meadow, S. (manuscript). Liar Liar Hands on Fire: What gesture reveals about deception.
Franklin, A., Giannakidou, A., and Goldin-Meadow, S. (manuscript). (Non)veridicality and structure building in a home sign system.
Franklin, A. (in prep). Interaction in deception: Gesture's role in collaborating to deceive.
Franklin, A., and Goldin-Meadow, S. (in prep). Don't do that! The development of negation in home sign gesture systems.
Franklin, A., Johnson, M., and Goldin-Meadow, S. (in prep). Getting ahead in development: Multi-modal acquisition of negation.