Abstract: This article describes a Kernel Principal Component Regressor (KPCR) to identify Auto Regressive eXogenous (ARX) Linear Parameter Varying (LPV) models. The new method differs from the Least Squares Support Vector Machines (LS-SVM) algorithm in the regularisation of the Least Squares (LS) problem: KPCR keeps only the principal components of the Gram matrix, whereas LS-SVM inverts the same matrix after adding a regularisation factor. Also, in this new approach the LS problem is formulated in the primal space but is ultimately solved in the dual space, overcoming the fact that the regressors are unknown. The method is assessed and compared to the LS-SVM approach through two Monte Carlo (MC) experiments. Each experiment consists of 100 runs of a simulated example, with a different noise level per experiment, corresponding to Signal to Noise Ratios of 20 dB and 10 dB, respectively. The results are twofold: first, the performance of the new method is comparable to that of LS-SVM for both noise levels, while the required calculations are much faster for KPCR; second, the new method reduces the dimension of the primal space and may provide a way of determining the number of basis functions required in the kernel. Furthermore, its structure is very similar to that of LS-SVM, which makes it possible to apply the method to other types of models, e.g. LPV state-space model identification.
Keywords: System Identification; LPV Systems; ARX; Kernel; Kernel Regression; Least Squares; Principal Components; Least Squares Support Vector Machines
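For a concrete view of the regularisation difference described above, the following is a minimal sketch, not the authors' implementation: it contrasts ridge-style regularisation of the Gram matrix (as in LS-SVM, with the bias term omitted for brevity) against truncating the same matrix to its leading principal components (as in KPCR). The RBF kernel, the toy data, and the number of retained components are illustrative assumptions.

```python
import numpy as np

def lssvm_dual_weights(K, y, gamma=1e2):
    # LS-SVM style: invert the Gram matrix after adding a ridge term I/gamma.
    n = K.shape[0]
    return np.linalg.solve(K + np.eye(n) / gamma, y)

def kpcr_dual_weights(K, y, n_components=10):
    # KPCR style: keep only the leading eigenpairs (principal components)
    # of the Gram matrix and solve on that subspace (truncated pseudo-inverse).
    eigvals, eigvecs = np.linalg.eigh(K)            # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:n_components]  # pick the largest ones
    V, lam = eigvecs[:, idx], eigvals[idx]
    return V @ ((V.T @ y) / lam)                    # V diag(1/lam) V^T y

# Toy usage with an RBF (Gaussian) kernel on random data.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / 2.0)
alpha_lssvm = lssvm_dual_weights(K, y)
alpha_kpcr = kpcr_dual_weights(K, y, n_components=10)
```

The truncation in `kpcr_dual_weights` is what reduces the effective dimension of the problem: the number of retained eigenpairs plays the role that the ridge factor plays in LS-SVM, and inspecting the eigenvalue decay offers a heuristic for choosing it.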