
Activation of ASL signs during sentence reading for deaf readers: evidence from eye-tracking

Published online by Cambridge University Press:  26 April 2024

Emily Saunders*
Affiliation:
Department of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA, USA
Jonathan Mirault
Affiliation:
Pôle Pilote AMPIRIC, Institut National Supérieur du Professorat et de l'Éducation, Aix-Marseille Université, Marseille, France
Laboratoire de Psychologie Cognitive, UMR 7290, Aix-Marseille Université & Centre National de la Recherche Scientifique, Marseille, France
Karen Emmorey
Affiliation:
Department of Speech, Language, and Hearing Sciences, San Diego State University, San Diego, CA, USA
Corresponding author: Emily Saunders; Email: ecsaunders@sdsu.edu

Abstract

Bilinguals activate both of their languages as they process written words, regardless of modality (spoken or signed); these effects have primarily been documented in single-word reading paradigms. We used eye-tracking to determine whether deaf bilingual readers (n = 23) activate American Sign Language (ASL) translations as they read English sentences. Sentences contained a target word and one of two possible prime words: a related prime that shared phonological parameters (location, handshape or movement) with the target when translated into ASL, or an unrelated prime. The results revealed that first fixation durations and gaze durations (early processing measures) were shorter when target words were preceded by ASL-related primes, but prime condition did not impact later processing measures (e.g., regressions). Further, less-skilled readers showed a larger ASL co-activation effect. Together, the results indicate that ASL co-activation impacts early lexical access and can facilitate reading, particularly for less-skilled deaf readers.

Information

Type
Research Article
Creative Commons
CC BY-NC
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial licence (http://creativecommons.org/licenses/by-nc/4.0), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original article is properly cited. The written permission of Cambridge University Press must be obtained prior to any commercial use.
Copyright
© The Author(s), 2024. Published by Cambridge University Press

Figure 1. (A) Semantically unrelated English word pair with phonologically related ASL translations. (B) A semantically related word pair with phonologically related ASL translations. Note: Images of ASL signs in this and subsequent figures are from the ASL-LEX database (Caselli et al., 2017; Sehyr et al., 2021).


Table 1. Summary of language assessment scores


Table 2. Descriptive statistics for the ASL-related and -unrelated word primes


Figure 2. Example of sentences containing ASL-related (top) and ASL-unrelated (bottom) word pairs.


Figure 3. Main effect of prime condition on early processing measures. Error bars reflect 95% confidence interval. Asterisks indicate a significant difference (|t| > 1.96).


Table 3. Descriptive statistics for eye-tracking measures


Table 4. Summary of r values between co-activation scores and PIAT-R scores


Figure 4. Correlation between reading skill and ASL co-activation scores for gaze durations (GD) and go-past times (GP).


Table 5. Summary of non-significant correlations between ASL-SRT raw scores and co-activation