
Article Information

  • Title: Optimal Gaussian Approximations to the Posterior for Log-Linear Models with Diaconis–Ylvisaker Priors
  • Authors: James Johndrow; Anirban Bhattacharya
  • Journal: Bayesian Analysis
  • Print ISSN: 1931-6690
  • Electronic ISSN: 1936-0975
  • Year: 2018
  • Volume: 13
  • Issue: 1
  • Pages: 201-223
  • DOI: 10.1214/16-BA1046
  • Language: English
  • Publisher: International Society for Bayesian Analysis
  • Abstract: In contingency table analysis, sparse data is frequently encountered for even modest numbers of variables, resulting in non-existence of maximum likelihood estimates. A common solution is to obtain regularized estimates of the parameters of a log-linear model. Bayesian methods provide a coherent approach to regularization, but are often computationally intensive. Conjugate priors ease computational demands, but the conjugate Diaconis–Ylvisaker priors for the parameters of log-linear models do not give rise to closed form credible regions, complicating posterior inference. Here we derive the optimal Gaussian approximation to the posterior for log-linear models with Diaconis–Ylvisaker priors, and provide convergence rate and finite-sample bounds for the Kullback–Leibler divergence between the exact posterior and the optimal Gaussian approximation. We demonstrate empirically in simulations and a real data application that the approximation is highly accurate, even for modest sample sizes. We also propose a method for model selection using the approximation. The proposed approximation provides a computationally scalable approach to regularized estimation and approximate Bayesian inference for log-linear models.
  • Keywords: credible region; conjugate prior; contingency table; Dirichlet–Multinomial; Kullback–Leibler divergence; Laplace approximation.
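
As a notational aside (an illustration added to this record, not text from the paper): the "optimal Gaussian approximation" in the abstract refers to the Gaussian minimizing the Kullback–Leibler divergence from the exact posterior, and for this direction of the divergence the minimizer is obtained by matching the first two posterior moments:

\[
  \hat{q} \;=\; \operatorname*{arg\,min}_{q = \mathcal{N}(\mu,\,\Sigma)}
  \mathrm{KL}\!\left( \pi(\theta \mid y) \,\middle\|\, q(\theta) \right),
  \qquad
  \hat{\mu} = \mathbb{E}_{\pi}[\theta \mid y], \quad
  \hat{\Sigma} = \mathrm{Cov}_{\pi}(\theta \mid y),
\]

where \(\pi(\theta \mid y)\) denotes the exact posterior under the Diaconis–Ylvisaker prior.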