Article Information

  • Title: Minimum Message Length and Classical Methods for Model Selection in Univariate Polynomial Regression
  • Authors: Viswanathan, Murlikrishna; Yang, Young-Kyu; WhangBo, Taeg-Keun
  • Journal: ETRI Journal
  • Print ISSN: 1225-6463
  • Electronic ISSN: 2233-7326
  • Year: 2005
  • Volume: 27
  • Issue: 6
  • Pages: 747-747
  • Language: English
  • Publisher: Electronics and Telecommunications Research Institute
  • Abstract: The problem of selection among competing models has been a fundamental issue in statistical data analysis. Good fits to data can be misleading, since they can result from properties of the model that have nothing to do with it being a close approximation to the source distribution of interest (for example, overfitting). In this study we focus on the preference among models from a family of polynomial regressors. Three decades of research have spawned a number of plausible techniques for the selection of models, namely, Akaike's Final Prediction Error (FPE) and Information Criterion (AIC), Schwarz's criterion (SCH), Generalized Cross Validation (GCV), Wallace's Minimum Message Length (MML), Minimum Description Length (MDL), and Vapnik's Structural Risk Minimization (SRM). The fundamental similarity between all these principles is their attempt to define an appropriate balance between the complexity of models and their ability to explain the data. This paper presents an empirical study of the above principles in the context of model selection, where the models under consideration are univariate polynomials. The paper includes a detailed empirical evaluation of the model selection methods on six target functions, with varying sample sizes and added Gaussian noise. The results from the study appear to provide strong evidence in support of the MML- and SRM-based methods over the other standard approaches (FPE, AIC, SCH and GCV).
  • Keywords: Polynomial model selection; regression; penalization; minimum message length; MML
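
A minimal illustrative sketch of the kind of penalization-based model selection the abstract describes: polynomials of increasing degree are fitted to noisy samples, and a criterion that penalizes the number of coefficients picks the degree. This is not the paper's implementation; only AIC and Schwarz's criterion are shown, and the target function, noise level, sample size, and degree range are arbitrary assumptions chosen for demonstration.

```python
# Toy polynomial degree selection via AIC and Schwarz's criterion (SCH/BIC).
# All settings below (cubic target, noise sigma, n, degree range) are assumptions,
# not the experimental setup used in the paper.
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples from an assumed cubic target with additive Gaussian noise.
n = 50
x = rng.uniform(-1.0, 1.0, n)
y = 2.0 * x**3 - x + rng.normal(0.0, 0.2, n)

def fit_rss(degree):
    """Least-squares polynomial fit; return the residual sum of squares."""
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    return float(resid @ resid)

scores = {}
for d in range(1, 9):
    k = d + 1                                   # number of free coefficients
    rss = fit_rss(d)
    aic = n * np.log(rss / n) + 2 * k           # Akaike's Information Criterion
    sch = n * np.log(rss / n) + k * np.log(n)   # Schwarz's criterion (BIC)
    scores[d] = (aic, sch)

best_aic = min(scores, key=lambda d: scores[d][0])
best_sch = min(scores, key=lambda d: scores[d][1])
print("degree chosen by AIC:", best_aic)
print("degree chosen by SCH:", best_sch)
```

Both criteria add a complexity penalty to the log of the average squared residual; SCH's log(n) penalty grows with the sample size, so it tends to prefer lower-degree polynomials than AIC on larger samples, which is the kind of behavior the paper's empirical comparison examines across criteria.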