
Problems with EM Algorithms for ML Factor Analysis

Published online by Cambridge University Press:  01 January 2025

P. M. Bentler*
Affiliation:
University of California, Los Angeles
Jeffrey S. Tanaka
Affiliation:
University of California, Los Angeles
*
Requests for reprints should be sent to P. M. Bentler, Department of Psychology, University of California, Los Angeles, California 90024.

Abstract

Rubin and Thayer recently presented equations to implement maximum likelihood (ML) estimation in factor analysis via the EM algorithm. They present an example to demonstrate the efficacy of the algorithm, and propose that their recovery of multiple local maxima of the ML function “certainly should cast doubt on the general utility of second derivatives of the log likelihood as measures of precision of estimation.” It is shown here, in contrast, that these second derivatives verify that Rubin and Thayer did not find multiple local maxima as claimed. The only known maximum remains the one found by Jöreskog over a decade earlier. The standard errors obtained from the second derivatives and the Fisher information matrix thus remain appropriate where ML assumptions are met. The advantages of the EM algorithm over other algorithms for ML factor analysis remain to be demonstrated.
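To make the object of the dispute concrete, here is a minimal sketch (not Rubin and Thayer's own code, and not the exact update order they used) of one EM iteration for the ML factor model Sigma = Lambda Lambda' + Psi with diagonal Psi, together with the -2 log-likelihood that the algorithm is guaranteed not to increase. The function names `em_factor_step` and `neg2_loglik` are illustrative choices.

```python
import numpy as np

def em_factor_step(S, L, Psi):
    """One EM iteration for the factor model Sigma = L L' + diag(Psi).

    S   : p x p sample covariance matrix
    L   : p x k current factor loading matrix
    Psi : length-p current unique variances
    """
    p, k = L.shape
    Sigma = L @ L.T + np.diag(Psi)
    # E-step: regression of factors on observed variables, beta = L' Sigma^{-1}
    beta = L.T @ np.linalg.inv(Sigma)                  # k x p
    Cxz = S @ beta.T                                   # expected x z' cross-moment, p x k
    Czz = np.eye(k) - beta @ L + beta @ S @ beta.T     # expected z z' moment, k x k
    # M-step: complete-data ML updates
    L_new = Cxz @ np.linalg.inv(Czz)
    Psi_new = np.diag(S - L_new @ Cxz.T).copy()
    return L_new, Psi_new

def neg2_loglik(S, L, Psi):
    """(-2/n) log-likelihood up to an additive constant: log|Sigma| + tr(S Sigma^{-1})."""
    Sigma = L @ L.T + np.diag(Psi)
    _, logdet = np.linalg.slogdet(Sigma)
    return logdet + np.trace(S @ np.linalg.inv(Sigma))
```

Each iteration is guaranteed not to decrease the likelihood, but distinct starting values that drift toward the same stationary point are not evidence of multiple local maxima; checking the second derivatives at convergence, as the abstract argues, is what distinguishes the two situations.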

Information

Type: Original Paper
Copyright: © 1983 The Psychometric Society
