Accepted manuscript

Peirce in the Machine: How Mixture of Experts Models Perform Hypothesis Construction

Published online by Cambridge University Press: 03 September 2025

Bruce Rushing*
Affiliation:
University of Virginia, USA

Abstract


Mixture of experts is a machine learning method that aggregates the predictions of specialized experts. It often outperforms Bayesian methods despite the Bayesian having stronger inductive guarantees. We argue that this is due to the greater functional capacity of mixture of experts. We prove that in a limiting case mixture of experts has greater capacity than equivalent Bayesian methods, a result we corroborate through experiments on non-limiting cases. Finally, we conclude that mixture of experts is a type of abductive reasoning in the Peircean sense of hypothesis construction.
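To make the contrast in the abstract concrete, the following is a minimal sketch (not from the paper; all names, weights, and experts are illustrative) of the two aggregation rules in Python/NumPy. A mixture of experts weights its experts with an input-dependent softmax gate, whereas Bayesian model averaging weights them with fixed posterior probabilities; this input dependence is the source of the extra functional capacity the abstract refers to.

```python
import numpy as np

def moe_predict(x, experts, gate_W, gate_b):
    """Mixture-of-experts aggregation: a softmax gate assigns
    input-dependent weights to each expert's prediction."""
    logits = gate_W @ x + gate_b                 # one logit per expert
    w = np.exp(logits - logits.max())
    w /= w.sum()                                 # softmax gate weights, sum to 1
    preds = np.array([f(x) for f in experts])    # each expert's prediction
    return w @ preds                             # input-dependent convex combination

def bma_predict(x, experts, posterior):
    """Bayesian model averaging: fixed posterior weights that do
    not depend on the current input x."""
    preds = np.array([f(x) for f in experts])
    return posterior @ preds

# Two toy experts, each specialized to a different input regime.
experts = [lambda x: 2.0 * x[0],                 # expert 0: better for large x
           lambda x: -1.0 * x[0] + 3.0]          # expert 1: better for small x

x = np.array([1.5])
gate_W = np.array([[4.0], [-4.0]])               # gate favors expert 0 when x is large
gate_b = np.zeros(2)
posterior = np.array([0.5, 0.5])                 # fixed, input-independent weights

print(moe_predict(x, experts, gate_W, gate_b))   # ~3.0: gate routes to expert 0
print(bma_predict(x, experts, posterior))        # 2.25: fixed 50/50 average
```

Because the gate weights vary with the input, the mixture can realize functions outside the set of fixed-weight averages of the same experts, which is one intuitive reading of the capacity result stated above.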

Information

Type
Article
Copyright
© The Author(s), 2025. Published by Cambridge University Press on behalf of Philosophy of Science Association