Article information

  • Title: Efficient Hamiltonian Monte Carlo for large data sets by data subsampling
  • Author: Doan Khue Dung Dang
  • Journal: Journal and Proceedings of the Royal Society of New South Wales
  • Print ISSN: 0035-9173
  • Year: 2019
  • Volume: 152
  • Issue: 2
  • Pages: 270-272
  • Language: English
  • Publisher: The Society
  • Abstract: Bayesian statistics carries out inference about the unknown parameters in a statistical model using their posterior distribution, which in many cases is computationally intractable. Therefore, simulation methods such as Markov Chain Monte Carlo (MCMC) and Sequential Monte Carlo (SMC) are frequently used to approximate the posterior distribution. SMC has the attractive ability to accurately estimate the marginal likelihood, although it is computationally more expensive than MCMC. Nevertheless, both methods require efficient Markov moves to deal with complex, high-dimensional problems. While Hamiltonian Monte Carlo (HMC) is a remedy in many cases, it also increases the computational cost of the algorithms appreciably, especially for large data sets. This thesis presents some novel methods that focus on speeding up inference by combining HMC and data subsampling. The first contribution is a Metropolis-within-Gibbs algorithm that successfully speeds up standard HMC by orders of magnitude in two large data examples. I then show that the new approach can be incorporated into other HMC implementations such as the No-U-Turn sampler. The next contribution is an extension of the first method to SMC for Bayesian static models, which gives comparable results to full data SMC in terms of accuracy but is much faster.
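
For orientation only, the sketch below shows one generic way HMC can be paired with data subsampling: the leapfrog gradients are estimated from a random minibatch, while the accept/reject step still evaluates the full-data posterior so the chain remains exact. The function names (logpost_full, grad_logpost_batch) and all tuning choices are hypothetical placeholders; this illustrates the general idea rather than the Metropolis-within-Gibbs estimator developed in the thesis.

```python
import numpy as np

def hmc_step_subsampled(theta, logpost_full, grad_logpost_batch, data,
                        batch_size, step_size, n_leapfrog, rng):
    """One HMC iteration whose leapfrog gradients use a random data subsample.

    Illustrative sketch only (not the thesis's algorithm):
      logpost_full(theta, data)                 -> full-data log posterior,
                                                   used in the accept/reject step
      grad_logpost_batch(theta, batch, n_total) -> unbiased gradient estimate from
                                                   a minibatch, scaled by n_total/len(batch)
    """
    n = len(data)
    batch = data[rng.choice(n, size=batch_size, replace=False)]

    momentum = rng.standard_normal(theta.shape)
    theta_new, p = theta.copy(), momentum.copy()

    # Leapfrog integration driven by the subsampled gradient estimate.
    p += 0.5 * step_size * grad_logpost_batch(theta_new, batch, n)
    for _ in range(n_leapfrog - 1):
        theta_new += step_size * p
        p += step_size * grad_logpost_batch(theta_new, batch, n)
    theta_new += step_size * p
    p += 0.5 * step_size * grad_logpost_batch(theta_new, batch, n)

    # Metropolis-Hastings correction with the full-data posterior keeps the
    # chain targeting the exact posterior, at the cost of one full-data
    # evaluation per iteration.
    current_h = -logpost_full(theta, data) + 0.5 * momentum @ momentum
    proposed_h = -logpost_full(theta_new, data) + 0.5 * p @ p
    if np.log(rng.uniform()) < current_h - proposed_h:
        return theta_new
    return theta
```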