Poster presented at CNS, New York, April 1-4, 2016
We investigated whether hemispheric organization of word and face recognition is uniquely shaped by sign language experience.
For typically developing hearing individuals, learning to read leads to LH-lateralization for words and may trigger subsequent RH-lateralization for faces. Hemispheric specialization for faces in the RH may be contingent on prior lateralization for words in the LH (Dundas, Plaut & Behrmann, 2014).
Deaf native users of American Sign Language (ASL) have distinct developmental experiences with both words and faces (e.g., the face conveys linguistic information).
What is the relationship between word and face processing for deaf native users of American Sign Language? How do distinct developmental experiences of deaf signers and hearing non-signers affect hemispheric organization for word and face processing?
In this preliminary report, we present data from 19 hearing non-signers and 23 deaf ASL signers who made same-different judgments to pairs of words or faces (192 trials each). The first stimulus was presented centrally and the second was lateralized to either the left hemisphere (LH; right visual field) or the right hemisphere (RH; left visual field). EEG was recorded to the centrally presented words/faces and referenced to the average of all electrode sites.
In addition, we measured the accuracy of same-different discrimination between the central and lateralized words/faces. Based on previous research with hearing non-signers, we expected an RH advantage for faces presented in the left visual field and, conversely, an LH advantage for words presented in the right visual field.
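As a concrete illustration of the referencing step, below is a minimal NumPy sketch of re-referencing epoched EEG to the average of all electrode sites and averaging across trials to obtain an ERP. The array shapes, channel indices, and time window are illustrative assumptions, not our actual recording parameters or analysis code.

```python
import numpy as np

# Hypothetical sketch, not the authors' pipeline: re-reference epoched EEG
# to the average of all electrode sites and average across trials to get an ERP.
rng = np.random.default_rng(0)
epochs = rng.normal(size=(192, 64, 300))  # illustrative: 192 trials x 64 channels x 300 samples

# Average reference: subtract the instantaneous mean across channels from every channel.
avg_ref = epochs - epochs.mean(axis=1, keepdims=True)

# ERP: mean of the re-referenced epochs across trials -> (channels, samples).
erp = avg_ref.mean(axis=0)

# The N170 could then be read out as the mean voltage in a post-stimulus window
# at occipito-temporal channels; the window and channel indices below are made up.
n170_window = slice(150, 200)      # sample indices (assuming 1 kHz sampling, 150-200 ms)
ot_channels = [28, 29, 60, 61]     # placeholder occipito-temporal sites
n170_amp = erp[ot_channels][:, n170_window].mean()
print(f"Mean amplitude in N170 window: {n170_amp:.2f} (arbitrary units)")
```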
Preliminary ERP results:
Deaf signers and hearing non-signers showed a similar laterality pattern for the N170 to words (left-lateralized) and to faces (bilateral). However, the scalp distributions of these laterality effects differed between the groups and may reflect a distinct organization of visual pathways in occipito-temporal cortex for deaf signers.
- Face processing: Both groups showed a bilateral N170 to faces. Deaf signers showed a slightly right-lateralized response to faces at temporal sites, whereas behaviorally it was the hearing non-signers who showed a small RH advantage. The ERP results tentatively suggest a more distributed circuit for face perception in deaf signers.
- Word processing: Both groups showed a left-lateralized N170 response to words, but the asymmetry was somewhat larger for hearing non-signers. At temporal sites, deaf signers exhibited a more bilateral N170 response, while hearing non-signers exhibited a strong, left-lateralized N170. This difference might reflect phonological-orthographic integration in hearing, but not deaf, readers (one common way to quantify such asymmetries is sketched below).
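To make "left-lateralized" versus "bilateral" concrete, the sketch below computes one common laterality index over N170 magnitudes at homologous left/right sites. The formula choice and the example values are illustrative assumptions; the poster does not specify the exact index used in our analysis.

```python
# Hypothetical laterality index (one common convention, not necessarily ours):
# positive values = right-lateralized, negative = left-lateralized.
# The N170 is negative-going, so we compare magnitudes (absolute values).
def laterality_index(left_amp: float, right_amp: float) -> float:
    """(R - L) / (R + L) over N170 magnitudes at homologous left/right sites."""
    l, r = abs(left_amp), abs(right_amp)
    return (r - l) / (r + l)

# Illustrative amplitudes in microvolts, not measured data:
print(laterality_index(left_amp=-6.0, right_amp=-3.5))  # words: negative -> left-lateralized
print(laterality_index(left_amp=-5.0, right_amp=-5.2))  # faces: near zero -> bilateral
```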
Discrimination accuracy - behavioral results:
- Both groups showed higher accuracy for words than for faces (F(2, 82) = 29.3)
- There was an LH bias for words but no RH bias for faces; this bias approached significance only in the hearing group
Stay tuned for more news and final results!