Monte Carlo likelihood inference for missing data models. We describe a Monte Carlo method to approximate the maximum likelihood estimate (MLE) when there are missing data and the observed-data likelihood is not available in closed form. The method uses simulated missing data that are independent and identically distributed and independent of the observed data. Our Monte Carlo approximation to the MLE is a consistent and asymptotically normal estimator of the minimizer θ* of the Kullback-Leibler information, as both the Monte Carlo and observed-data sample sizes go to infinity simultaneously. Plug-in estimates of the asymptotic variance are provided for constructing confidence regions for θ*. We give logit-normal generalized linear mixed model examples, computed with an R package.
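The core idea, simulating the missing data once from a fixed importance density and maximizing the resulting Monte Carlo approximation of the observed-data likelihood, can be sketched on a toy model. The model below (latent y ~ N(0,1), observed x | y ~ N(θ + y, 1)) and all variable names are illustrative assumptions, not the paper's logit-normal GLMM example:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# Toy missing-data model (an assumption for illustration):
#   latent y ~ N(0, 1),  observed x | y ~ N(theta + y, 1).
# The observed-data likelihood integrates y out; we approximate it by
# importance sampling with y_j drawn i.i.d. from h = N(0, 1),
# simulated once and independent of the observed data.
theta_true = 2.0
n = 500
y_latent = rng.normal(size=n)
x_obs = theta_true + y_latent + rng.normal(size=n)

m = 2000
y_sim = rng.normal(size=m)  # simulated missing data, held fixed across theta

def mc_neg_loglik(theta):
    # Since h equals the marginal density of y, the importance ratio
    # f_theta(x, y) / h(y) reduces to the normal density of x - theta - y.
    diff = x_obs[:, None] - theta - y_sim[None, :]          # n x m array
    dens = np.exp(-0.5 * diff**2) / np.sqrt(2 * np.pi)
    return -np.sum(np.log(dens.mean(axis=1)))               # MC log-likelihood

res = minimize_scalar(mc_neg_loglik, bounds=(-5, 5), method="bounded")
print(res.x)  # should land near theta_true = 2.0
```

Because the simulated y_j are fixed across evaluations of θ, the Monte Carlo log-likelihood is a smooth deterministic function that can be handed to a standard optimizer, which is what makes the asymptotics in θ* tractable as n and m grow together.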
References in zbMATH (referenced in 5 articles)
- Doss, Hani; Park, Yeonhee: An MCMC approach to empirical Bayes inference and Bayesian sensitivity analysis via empirical processes (2018)
- Geyer, Charles J.; Ridley, Caroline E.; Latta, Robert G.; Etterson, Julie R.; Shaw, Ruth G.: Local adaptation and genetic effects on fitness: Calculations for exponential family models with random effects (2013)
- Masarotto, Guido; Varin, Cristiano: Gaussian copula marginal regression (2012)
- Yang, Yiping; Xue, Liugen; Cheng, Weihu: Empirical likelihood for a partially linear model with covariate data missing at random (2009)
- Sung, Yun Ju; Geyer, Charles J.: Monte Carlo likelihood inference for missing data models (2007)