
The acquisition of the semantics of Japanese numeral classifiers: The methodological value of nonsense

Published online by Cambridge University Press:  26 January 2024

Maki KUBOTA*
Affiliation:
AcqVA Aurora Center, UiT the Arctic University of Norway
Yuko MATSUOKA
Affiliation:
School of Philosophy, Psychology, and Language Sciences, University of Edinburgh
Jason ROTHMAN
Affiliation:
AcqVA Aurora Center, UiT the Arctic University of Norway; Centro de Investigación Nebrija en Cognición, University of Nebrija
Corresponding author: Maki Kubota; Email: makikubota5@gmail.com

Abstract

This study examined the acquisition of numeral classifiers in 120 monolingual Japanese children. Previous research has argued that the complex semantic system underlying classifiers is acquired late. We therefore set out to determine the age at which Japanese children can extend the semantic properties of classifiers to novel items/situations. Participants completed a comprehension task with a mouse-tracking extension and a production task with nonce and familiar items. While the comprehension results showed ceiling effects on both familiar and nonce items, age significantly modulated the difference in accuracy between familiar and nonce items in the production task. The findings suggest that the underlying semantic system is acquired much earlier than previously argued. Previously attested difficulties with Japanese classifier production in young(er) children are thus more likely to reflect problems accessing the system than to index gaps in the underlying grammatical competence of the classifier system.

Information

Type
Article
Creative Commons
CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2024. Published by Cambridge University Press
Figure 1. Japanese numeral classifier system (taken from Yamamoto & Keil, 2000, p.381).

Table 1. The number of participants in each group

Table 2. Full list of classifier items in the comprehension task

Figure 2. Illustration of the production task.

Figure 3. Illustration of the comprehension task.

Figure 4. Accuracy of the classifier types split by animacy and familiarity for production. Error bars indicate standard error.

Table 3. The output of the generalized linear mixed effects model for production accuracy

Figure 5. The plot of two-way interaction effects between age and familiarity.

Figure 6. The plot of three-way interaction effects between age, familiarity, and animacy.

Figure 7. Conditional Inference Tree for production accuracy.

Table 4. Descriptive statistics of non-target response type for familiar and nonce items

Table 5. The output of the Poisson generalized linear mixed effects model for number of observations per non-target response (NTR) type

Table 6. The ten most common non-target responses that were categorized as using the wrong classifier (WC)

Figure 8. Accuracy and reaction time of the classifier types split by animacy and familiarity for comprehension. Error bars indicate standard error.

Table 7. The output of the generalized linear mixed effects model (Accuracy) and linear mixed effects model (RTs) for comprehension

Figure 9. Conditional Inference Tree for comprehension accuracy.

Figure 10. Mean mouse trajectories split by familiarity (Panel A) and animacy (Panel B).

Supplementary material

Kubota et al. supplementary material (File, 39.8 KB)