
Generalized Bayesian method for diagnostic classification models

Published online by Cambridge University Press:  03 January 2025

Kazuhiro Yamaguchi*
Affiliation:
University of Tsukuba, Tsukuba, Japan
Yanlong Liu
Affiliation:
University of Michigan, Ann Arbor, MI, USA
Gongjun Xu
Affiliation:
University of Michigan, Ann Arbor, MI, USA
Corresponding author: Kazuhiro Yamaguchi; Email: yamaguchi.kazuhir.ft@u.tsukuba.ac.jp

Abstract

This study extends the loss function-based parameter estimation method for diagnostic classification models proposed by Ma, de la Torre, et al. (2023, Psychometrika) to incorporate prior knowledge and sampling uncertainty. To this end, we integrate the loss function-based estimation method with the generalized Bayesian method. We establish the consistency of the attribute mastery pattern estimates obtained with the proposed generalized Bayesian method. In a simulation study, the proposed method is found to be superior to the earlier nonparametric diagnostic classification method, a special case of the loss function-based method. Moreover, the proposed method is applied to real data and compared with previous parametric and nonparametric estimation methods. Finally, practical guidelines for the proposed method and future research directions are discussed.
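The generalized Bayesian approach summarized above combines a prior over attribute mastery patterns with an exponentiated loss (a Gibbs-posterior form, p(α | X) ∝ π(α) exp(−w · L(X, α))). A minimal sketch of this idea, assuming the nonparametric Hamming-distance loss with DINA-style ideal responses; the function names, the learning-rate value w, and the toy Q-matrix are illustrative and not taken from the article:

```python
import itertools
import numpy as np

def dina_ideal_response(alpha, Q):
    # eta_j = 1 iff the examinee masters every attribute required by item j
    return np.all(alpha >= Q, axis=1).astype(int)

def generalized_posterior(x, Q, w=1.0, prior=None):
    """Generalized (Gibbs) posterior over all 2^K attribute patterns
    for one examinee's response vector x, using Hamming-distance loss."""
    K = Q.shape[1]
    patterns = np.array(list(itertools.product([0, 1], repeat=K)))
    if prior is None:  # uniform prior over patterns
        prior = np.full(len(patterns), 1.0 / len(patterns))
    # Hamming distance between observed and ideal responses
    loss = np.array([np.sum(x != dina_ideal_response(a, Q)) for a in patterns])
    post = prior * np.exp(-w * loss)  # prior times exponentiated negative loss
    return patterns, post / post.sum()

# Toy example: 4 items, 2 attributes
Q = np.array([[1, 0], [0, 1], [1, 1], [1, 0]])
x = np.array([1, 0, 0, 1])  # responses consistent with alpha = (1, 0)
patterns, post = generalized_posterior(x, Q, w=2.0)
print(patterns[np.argmax(post)])  # MAP attribute pattern: [1 0]
```

The learning rate w controls how strongly the data (through the loss) dominate the prior; w → ∞ recovers pure loss minimization, as in the nonparametric classification method.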

Information

Type
Theory and Methods
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press on behalf of Psychometric Society
Table 1 The four-attribute $\mathbf{Q}$-matrix

Table 2 The five-attribute $\mathbf{Q}$-matrix

Table 3 The average correlations of attribute mastery probabilities estimated by the first and second halves of MCMC iterations after the burn-in period

Figure 1 Simulation results of the DINA data generation with four-attribute $\mathbf{Q}$-matrix conditions.

Figure 2 Simulation results of the DINA data generation with five-attribute $\mathbf{Q}$-matrix conditions.

Figure 3 Simulation results of the general DCM data generation with four-attribute $\mathbf{Q}$-matrix conditions.

Figure 4 Simulation results of the general DCM data generation with five-attribute $\mathbf{Q}$-matrix conditions.

Figure 5 Box plots of attribute mastery probabilities of the DINA data generation with four-attribute $\mathbf{Q}$-matrix conditions.

Figure 6 Box plots of attribute mastery probabilities of the DINA data generation with five-attribute $\mathbf{Q}$-matrix conditions.

Figure 7 Box plots of attribute mastery probabilities of the general DCM data generation with four-attribute $\mathbf{Q}$-matrix conditions.

Figure 8 Box plots of attribute mastery probabilities of the general DCM data generation with five-attribute $\mathbf{Q}$-matrix conditions.

Table 4 The $\mathbf{Q}$-matrix of the ECPE data

Table 5 Means and SDs of posterior attribute mastery probabilities for the GBGNPC and GBNPC methods

Table 6 Frequencies and ratios of the estimated attribute mastery patterns with the four estimation methods

Table 7 Contingency table of the estimated attribute mastery patterns by GBGNPC and GNPC

Table 8 Contingency table of the estimated attribute mastery patterns by GBNPC and NPC

Table 9 Individual differences in estimated patterns for the GBGNPC and GNPC methods, response patterns, sum- and subscores, and attribute mastery probabilities

Table 10 Generalized posterior of attribute mastery patterns by GBNPC and NPC