Journal: International Journal of Advanced Computer Science and Applications (IJACSA)
Print ISSN: 2158-107X
Online ISSN: 2156-5570
Year: 2019
Volume: 10
Issue: 3
Pages: 240-247
DOI: 10.14569/IJACSA.2019.0100331
Publisher: Science and Information Society (SAI)
Abstract: The Extreme Learning Machine (ELM) algorithm, based on single-hidden-layer feedforward neural networks, has been shown to be one of the best time series prediction techniques. The algorithm also offers good generalization performance with an extremely fast learning speed. However, ELM suffers from an overfitting problem that can degrade model quality, because it is implemented with an empirical risk minimization scheme. Therefore, this study aims to improve ELM by introducing activation function regularization into ELM, called RAF-ELM. The experiment was conducted in two phases. The first phase investigated the performance of the modified RAF-ELM using four types of activation functions: Sigmoid, Sine, Tribas, and Hardlim. In this study, the input weights and biases of the hidden layer were selected randomly, while the best number of hidden-layer neurons was determined from the range 5 to 100. This experiment used UCI benchmark datasets. A hidden layer of 99 neurons with the Sigmoid activation function gave the best performance. The proposed method improved accuracy and learning speed by up to 0.016205 MAE and 0.007 seconds of processing time, respectively, compared with conventional ELM, and improved accuracy by up to 0.0354 MSE compared with a state-of-the-art algorithm. The second phase validated the proposed RAF-ELM on 15 regression benchmark datasets, comparing it with four neural network techniques: conventional ELM, Back Propagation, Radial Basis Function, and Elman networks. The results show that RAF-ELM achieves the best accuracy among these techniques on time series data drawn from various domains.
Keywords: Extreme learning machine; prediction; neural networks; regularization; time series
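
For context, the following is a minimal sketch of the baseline ELM setup described in the abstract: input weights and hidden biases drawn at random, a chosen activation function (Sigmoid, Sine, Tribas, or Hardlim), and output weights obtained by a least-squares solve via the Moore-Penrose pseudoinverse. The abstract does not specify how RAF-ELM regularizes the activation functions, so that step is not implemented here; the function names, synthetic data, and parameter values other than the 99 Sigmoid hidden neurons are illustrative assumptions, not the authors' code.

# Minimal single-hidden-layer ELM sketch (NumPy). The RAF-ELM activation-function
# regularization from the paper is NOT implemented; names and data are illustrative.
import numpy as np

ACTIVATIONS = {
    "sigmoid": lambda z: 1.0 / (1.0 + np.exp(-z)),
    "sine": np.sin,
    "tribas": lambda z: np.maximum(1.0 - np.abs(z), 0.0),  # triangular basis
    "hardlim": lambda z: (z >= 0).astype(float),            # hard limit
}

def elm_fit(X, y, n_hidden=99, activation="sigmoid", seed=None):
    """Train a basic ELM: hidden-layer parameters are random,
    output weights come from a least-squares (pseudoinverse) solve."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))  # random input weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # random hidden biases
    H = ACTIVATIONS[activation](X @ W + b)                   # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y                             # output weights
    return W, b, beta

def elm_predict(X, W, b, beta, activation="sigmoid"):
    return ACTIVATIONS[activation](X @ W + b) @ beta

# Usage example: fit noisy sine-wave data and report mean absolute error (MAE),
# the accuracy measure quoted in the abstract.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 6.0, 200).reshape(-1, 1)
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(200)
W, b, beta = elm_fit(X, y, n_hidden=99, activation="sigmoid", seed=0)
mae = np.mean(np.abs(elm_predict(X, W, b, beta) - y))
print(f"training MAE: {mae:.5f}")

Because the hidden-layer parameters are never trained, the only fitting cost is the pseudoinverse solve, which is what gives ELM the fast learning speed highlighted in the abstract.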