Abstract: We prove minimax bounds for estimating Gaussian location mixtures on $\mathbb{R}^d$ under the squared $L_2$ and the squared Hellinger loss functions. Under the squared $L_2$ loss, we prove that the minimax rate is upper and lower bounded by a constant multiple of $n^{-1}(\log n)^{d/2}$. Under the squared Hellinger loss, we consider two subclasses based on the behavior of the tails of the mixing measure. When the mixing measure has a sub-Gaussian tail, the minimax rate under the squared Hellinger loss is bounded from below by $(\log n)^{d}/n$, which implies that the optimal minimax rate lies between $(\log n)^{d}/n$ and the upper bound $(\log n)^{d+1}/n$ obtained by [11]. On the other hand, when the mixing measure is only assumed to have a bounded $p$th moment for a fixed $p > 0$, the minimax rate under the squared Hellinger loss is bounded from below by $n^{-p/(p+d)}(\log n)^{-3d/2}$. This rate is minimax optimal up to logarithmic factors.
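For quick reference, the rates stated above may be collected in a single display; the minimax-risk symbols $\mathfrak{M}_{L_2}$, $\mathfrak{M}_{H}$ (sub-Gaussian mixing measure), and $\mathfrak{M}_{H,p}$ (mixing measure with bounded $p$th moment) are shorthand introduced here for summary purposes and are not notation from the paper:
% Shorthand (ours, not the paper's): \mathfrak{M}_{L_2}, \mathfrak{M}_{H}, \mathfrak{M}_{H,p}
% denote the minimax risks under the squared L_2 loss and the squared Hellinger loss
% over the two mixing-measure subclasses, respectively.
\[
  \mathfrak{M}_{L_2} \;\asymp\; \frac{(\log n)^{d/2}}{n},
  \qquad
  \frac{(\log n)^{d}}{n} \;\lesssim\; \mathfrak{M}_{H} \;\lesssim\; \frac{(\log n)^{d+1}}{n},
  \qquad
  \mathfrak{M}_{H,p} \;\gtrsim\; n^{-p/(p+d)}(\log n)^{-3d/2}.
\]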
Keywords: almost parametric rate of convergence; Assouad’s lemma; curse of dimensionality; minimax lower bounds; multivariate normal location mixtures