Dynamic Quaternion Extreme Learning Machine

Abstract: The quaternion random neural network trained by the extreme learning machine (Q-ELM) is attractive for its learning capability and generalization performance on 3- or 4-dimensional (3/4-D) hypercomplex data. However, determining the optimal network architecture remains challenging in Q-ELM. To this end, this paper develops a novel error-minimization-based Q-ELM (QEM-ELM) that only needs to optimize the output weights of the newly added neuron. On this basis, a dynamic network construction scheme is further extended to Q-ELM, yielding a novel DQ-ELM in which hidden nodes can be dynamically recruited or deleted according to their significance to network performance, so that the network parameters are optimized and the architecture is self-adapted simultaneously. Simulation results on several benchmark datasets demonstrate that the proposed QEM-ELM and DQ-ELM achieve good generalization performance while preserving a compact network size.
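As background, the abstract builds on the standard ELM training scheme: hidden-layer weights and biases are drawn at random and only the output weights are solved in closed form via the Moore-Penrose pseudoinverse. A minimal real-valued sketch (the function names and the tanh activation are illustrative assumptions, not the paper's quaternion formulation):

```python
import numpy as np

def elm_train(X, T, n_hidden, seed=0):
    """Basic ELM: random hidden layer, least-squares output weights."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (fixed)
    b = rng.standard_normal(n_hidden)                # random biases (fixed)
    H = np.tanh(X @ W + b)                           # hidden-layer output, shape (N, M)
    beta = np.linalg.pinv(H) @ T                     # Moore-Penrose least-squares solution
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

Because only `beta` is learned, adding one hidden node only appends a column to `H`, which is why an error-minimization scheme like QEM-ELM can update the output weights incrementally instead of retraining the whole network.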
EXISTING SYSTEM:
- It can be observed that, owing to the enhanced ability to capture the correlation between the three colour channels, the regularized QELM models outperformed the regularized ELM models.
- Furthermore, the regularized QELMAI model obtained the best recognition accuracy, as it is equipped with the capability of capturing the full second-order statistics of the input signals.
- For comparison with state-of-the-art machine learning models in image recognition, experiments with convolutional neural networks were also conducted, obtaining recognition rates of 93.6% and 100% on the Face96 and Grimace datasets, respectively.
DISADVANTAGE:
- Predictive instability caused by randomly selecting the input weights and the hidden-layer biases.
- Over-fitting caused by the complexity of the input distribution and the large number of hidden nodes on large datasets.
- The matrix H has order N × M, where N is the number of samples and M is the number of hidden-layer nodes, so it grows with both the dataset and the network.
PROPOSED SYSTEM:
- The purpose of this paper is to propose and investigate two augmented QELM models for regression and classification problems.
- In order to fully capture the second-order statistics of the signals, we incorporate the involutions of the input signals and of the hidden nodes of the standard QELM, respectively, obtaining two augmented QELM models: QELM with augmented input (QELMAI) and QELM with augmented hidden layer (QELMAH).
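The augmentation above relies on the three standard quaternion involutions q^i = -iqi, q^j = -jqj, and q^k = -kqk, each of which flips the signs of the other two imaginary parts. A small sketch with quaternions stored as (a, b, c, d) coefficient vectors (this representation and the function names are assumptions for illustration):

```python
import numpy as np

def involutions(q):
    """Return the quaternion involutions q^i, q^j, q^k of q = a + bi + cj + dk."""
    a, b, c, d = q
    qi = np.array([a,  b, -c, -d])  # q^i = -i q i
    qj = np.array([a, -b,  c, -d])  # q^j = -j q j
    qk = np.array([a, -b, -c,  d])  # q^k = -k q k
    return qi, qj, qk

def augmented_input(q):
    """Stack q with its three involutions, as in a QELMAI-style augmented input."""
    return np.concatenate([np.asarray(q), *involutions(q)])
```

Note that q + q^i + q^j + q^k = 4a, i.e. the four versions jointly expose every sign pattern of the imaginary parts, which is what lets the augmented model capture full second-order statistics.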
ADVANTAGE:
- The performance of the proposed DEELM method is compared with the original ELM in three aspects: the influence of the construction of sub-classifiers on the ensemble system, average testing accuracy, and stability.
- In terms of testing accuracy, the performance of ELM is superior to Bagging when the number of hidden nodes is greater than 73.
- For the other datasets, the experimental results are similar.
- Based on the experimental results, we conclude that in the framework of ensembles of SLFNs trained with ELM, AdaBoost is superior to Bagging, so our method selects AdaBoost rather than Bagging.
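The AdaBoost-over-ELM design referred to above can be illustrated with a compact AdaBoost.M1-style loop in which each base learner is a sample-weighted ELM. Everything here (function names, 15 hidden nodes, tanh activation, the {-1, +1} label encoding, weighted least squares for the base learner) is an illustrative assumption, not the paper's exact procedure:

```python
import numpy as np

def train_elm(X, y, n_hidden, sample_w, rng):
    """One weighted ELM base learner: weighted least squares on the hidden output."""
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)
    sw = np.sqrt(sample_w)[:, None]
    beta, *_ = np.linalg.lstsq(H * sw, y * sw.ravel(), rcond=None)
    return W, b, beta

def elm_out(X, model):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

def adaboost_elm(X, y, n_rounds=10, n_hidden=15, seed=0):
    """AdaBoost.M1-style ensemble of ELMs for labels y in {-1, +1}."""
    rng = np.random.default_rng(seed)
    w = np.full(len(y), 1.0 / len(y))          # uniform initial sample weights
    models, alphas = [], []
    for _ in range(n_rounds):
        m = train_elm(X, y, n_hidden, w, rng)
        pred = np.sign(elm_out(X, m))
        err = w[pred != y].sum()               # weighted training error
        if err == 0:                           # perfect learner: keep it and stop
            models.append(m); alphas.append(1.0)
            break
        if err >= 0.5:                         # no usable learner this round
            break
        alpha = 0.5 * np.log((1 - err) / err)  # learner weight
        w *= np.exp(-alpha * y * pred)         # up-weight misclassified samples
        w /= w.sum()
        models.append(m); alphas.append(alpha)
    return models, alphas

def ensemble_predict(X, models, alphas):
    score = sum(a * elm_out(X, m) for m, a in zip(models, alphas))
    return np.sign(score)
```

The key contrast with Bagging is visible in the loop: Bagging would resample the data uniformly and weight all learners equally, whereas AdaBoost reweights hard samples and weights each learner by its accuracy, which is what the comparison in this section favours.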
