Bimodal Bilingualism


Speaking and signing: for speech-sign bilinguals, the articulators do not compete.

The vast majority of bilingual studies involve two spoken languages. Such “unimodal” bilingualism automatically entails a severe production constraint because one cannot physically produce two spoken words or phrases at the same time. In addition, for unimodal bilinguals both languages are perceived by the same sensory system (audition), whereas for bimodal (speech-sign) bilinguals one language is perceived auditorily and the other is perceived visually. This project investigates how these sensory-motor differences in language modality impact the psycholinguistics of bilingualism, the features of co-speech gesture, and the nature of the bilingual brain. Using behavioral and neuroimaging techniques, we ask the following questions:

  • What are the consequences of removing constraints on simultaneous articulation of two languages?
  • Do bimodal bilinguals code-switch (alternate between languages)?
  • Does bimodal bilingualism affect co-speech gesture?
  • How do bimodal bilinguals control sign language production while speaking?
  • What is the nature of the bimodal bilingual brain?

Funding

This research is supported by the National Institute of Child Health and Human Development (R01 HD047736).

Recent Publications


Recent Presentations
