Abstract
Although widely used in practice, latent variable models raise challenging estimation problems due to the intractability of their likelihood. Monte Carlo maximum likelihood (MCML), as proposed by Geyer & Thompson (1992), is a simulation-based approach to maximum likelihood approximation applicable to general latent variable models. MCML can be described as an importance sampling method in which the likelihood ratio is approximated by Monte Carlo averages of importance ratios simulated from the complete data model corresponding to an arbitrary value φ of the unknown parameter. This paper studies the asymptotic (in the number of observations) performance of the MCML method in the case of latent variable models with independent observations. This is in contrast with previous work on the same topic, which only considered conditional convergence to the maximum likelihood estimator for a fixed set of observations. A first important result is that when φ is fixed, the MCML method can only be consistent if the number of simulations grows exponentially fast with the number of observations. If, on the other hand, φ is obtained from a consistent sequence of estimates of the unknown parameter, then the requirements on the number of simulations are shown to be much weaker.
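The importance-sampling construction described above can be illustrated on a toy model. The sketch below is an assumption for illustration only (it is not the paper's example): observations Y_i | Z_i ~ N(Z_i, 1) with latent Z_i ~ N(θ, 1), so marginally Y_i ~ N(θ, 2) and the exact MLE is the sample mean. The log-likelihood ratio log L(θ) − log L(φ) is approximated, per observation, by averaging importance ratios f(y_i, z; θ)/f(y_i, z; φ) over draws z from the conditional distribution of the latent variable given y_i under φ; the function name `mcml_estimate` and all tuning constants are made up for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy latent variable model (illustrative assumption, not the paper's example):
# Y_i | Z_i ~ N(Z_i, 1), Z_i ~ N(theta, 1)  =>  marginally Y_i ~ N(theta, 2),
# so the exact MLE is the sample mean of y.
theta_true = 1.5
n = 100                                   # number of independent observations
y = rng.normal(theta_true, np.sqrt(2.0), size=n)

def mcml_estimate(y, phi, m=200):
    """MCML sketch: maximize a Monte Carlo estimate of log L(theta) - log L(phi).

    For each observation y_i, draw m latent values from the conditional
    p(z | y_i; phi) = N((y_i + phi)/2, 1/2), then average the importance
    ratios f(y_i, z; theta)/f(y_i, z; phi) = N(z; theta, 1)/N(z; phi, 1).
    """
    grid = np.linspace(y.mean() - 1.0, y.mean() + 1.0, 201)
    # z has shape (n, m): m conditional draws per observation
    z = rng.normal((y[:, None] + phi) / 2.0, np.sqrt(0.5), size=(len(y), m))
    # log importance ratio for each grid value of theta: shape (grid, n, m)
    log_ratio = (-(z[None, :, :] - grid[:, None, None]) ** 2
                 + (z[None, :, :] - phi) ** 2) / 2.0
    # stable log of the Monte Carlo average over the m draws, summed over i
    a = log_ratio.max(axis=2, keepdims=True)
    log_lik_ratio = (a[:, :, 0]
                     + np.log(np.exp(log_ratio - a).mean(axis=2))).sum(axis=1)
    return grid[np.argmax(log_lik_ratio)]

exact_mle = y.mean()
# phi taken from a consistent preliminary estimate (here the mean itself),
# the favourable regime identified in the abstract
theta_hat = mcml_estimate(y, phi=exact_mle)
print(exact_mle, theta_hat)
```

With φ chosen near the true parameter the importance weights are well behaved and the MCML maximizer sits close to the exact MLE; taking φ far from θ instead degrades the approximation unless m is very large, in line with the fixed-φ result stated above.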
| Original language | English |
|---|---|
| Pages (from-to) | 615-635 |
| Number of pages | 21 |
| Journal | Scandinavian Journal of Statistics |
| Volume | 29 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - 1 Jan 2002 |
Keywords
- Maximum likelihood estimation
- Monte Carlo maximum likelihood
- Simulated likelihood ratio
- Stochastic approximation
- Stochastic optimization