Recent research shows that amortized variational inference (AVI) can be used to efficiently estimate high-dimensional latent variable models on large datasets. However, its use has remained limited to item response theory (IRT), and generalizing the approach to discrete latent variable models is not straightforward. We propose two approaches to address this problem. In an initial simulation, we verify that these approaches can be used to estimate simple discrete latent variable models, such as latent class analysis and the generalized deterministic inputs, noisy "and" gate (G-DINA) model. In these cases, AVI provides accurate parameter estimates, although its computational advantage over marginal maximum likelihood (MML) and standard variational inference (VI) is limited. We then apply the same approach to estimate mixture IRT models. In this case, AVI is computationally faster than both MML estimation and standard VI. To demonstrate the practical applicability of our AVI approach, we use it to fit a seven-dimensional mixture IRT model to a narcissism inventory. Whereas quadrature-based methods cannot feasibly estimate models of this dimensionality, the efficient AVI approach even allows for the computation of bootstrapped standard errors. We provide our code, along with an easy-to-use tool for fitting these models to new datasets.