Abstract: It is well known that the Kullback–Leibler support condition implies posterior consistency in the weak topology, but it is not sufficient for consistency in the total variation distance, as a known counterexample shows. Since then many authors have proposed sufficient conditions for strong consistency, and the aim of the present paper is to introduce new conditions with specific application to nonparametric mixture models with heavy-tailed components, such as the Student-$t$. The key is a more focused result identifying sets of densities on which strong consistency must fail, if it fails at all. This allows us to move away from the traditional types of sieves currently employed.
Keywords: Kullback–Leibler divergence; Lévy–Prokhorov metric; mixture of Student's $t$ distributions; posterior consistency; total variation.