Scientific and Technical Journal

ELECTROTECHNIC AND COMPUTER SYSTEMS

ISSN Print 2221-3937
ISSN Online 2221-3805
ADAPTIVE MULTISTEP SELF-LEARNING PROCEDURE FOR SOLVING PRINCIPAL COMPONENT ANALYSIS TASK
Abstract:

In many tasks associated with processing large data sets, the problem often arises of compressing the data with minimal loss of information, in order to select the most essential features that define the nature of the phenomenon under investigation, to visualize the data, to transmit them over channels with limited bandwidth, etc. Principal component analysis (PCA), also known as the algebraic eigenvalue problem or the Karhunen-Loeve transformation, is widely used for solving such tasks. When the data are fed for processing sequentially in on-line mode, their volume is not known beforehand, and the system generating the data is non-stationary, traditional algorithms implementing principal component analysis lose their effectiveness, and adaptive procedures based on neural network technology have to be used. In this regard, a multistep self-learning rule is proposed for an adaptive linear associator designed to find the eigenvalues and eigenvectors of the correlation matrix of data fed for processing sequentially in on-line mode. This rule is a generalization of the D. Hebb and E. Oja algorithms used for training neural networks that implement principal component analysis.
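The single-neuron setting the abstract generalizes can be illustrated by Oja's classical rule, in which a linear associator with weight vector w and output y = wᵀx is updated as w ← w + η·y·(x − y·w) and converges (up to sign) to the unit-norm principal eigenvector of the data correlation matrix. The following is a minimal sketch of that baseline rule on synthetic data; the function name, the decaying step-size schedule, and the data generator are illustrative assumptions, not the multistep procedure proposed in the paper.

```python
import numpy as np

def oja_step(w, x, eta):
    """One on-line update of Oja's rule: w += eta * y * (x - y * w),
    where y = w.x is the output of the linear associator."""
    y = w @ x
    return w + eta * y * (x - y * w)

rng = np.random.default_rng(0)

# Zero-mean synthetic data whose correlation matrix C has its
# dominant eigenvector along (1, 1)/sqrt(2) (eigenvalues 5 and 1).
C = np.array([[3.0, 2.0],
              [2.0, 3.0]])
L = np.linalg.cholesky(C)
X = (L @ rng.standard_normal((2, 5000))).T

# Random unit-norm initialization, then sequential on-line updates
# with a decaying step size (a stochastic-approximation schedule).
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for t, x in enumerate(X, start=1):
    w = oja_step(w, x, eta=1.0 / (100.0 + t))

# After training, w approximates a unit-norm principal eigenvector
# of C, i.e. it aligns (up to sign) with (1, 1)/sqrt(2).
```

Unlike plain Hebbian learning (w ← w + η·y·x), which lets the weight norm grow without bound, the −y²·w term keeps the weight vector self-normalizing, which is the property the multistep generalization builds on.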

References
  1. Cichocki, A., Unbehauen, R. (1993) Neural Networks for Optimization and Signal Processing. Stuttgart: Teubner, 526 p.
  2. Haykin, S. (1999) Neural Networks. A Comprehensive Foundation. Upper Saddle River, N.J.: Prentice Hall, Inc., 842 p.
  3. Ham, F. M., Kostanic, I. (2001) Principles of Neurocomputing for Science and Engineering. N.Y.: McGraw-Hill, Inc., 642 p.
  4. Oja, E. (1982) A simplified neuron model as a principal component analyzer. J. of Math. Biology. 15. pp. 267–273.
  5. Oja, E. (1992) Principal components, minor components, and linear neural networks. Neural Networks. 5. pp. 927–935.
  6. Vazan, M. (1972) Stochastic Approximation. Moscow: Mir, 289 p. (in Russian)
  7. Bodyanskiy, Ye. V., Pliss, I. P., Teslenko, N. O. (2006) Optimal learning algorithm of Oja's neuron. Decision Making Theory: Proc. of the Intern. Conf. Uzhgorod. pp. 10–11. (in Ukrainian)
  8. Bodyanskiy, Ye. V., Pliss, I. P., Teslenko, N. O. (2006) Modification of Oja's neuron for non-stationary data analysis. Automation: Problems, Ideas and Solutions: Proc. of the Intern. Conf. Sevastopol. pp. 17–21. (in Russian)
  9. Bodyanskiy, Ye. V., Deineko, A. O., Teslenko, N. O., Shalamov, M. O. (2011) Evolving cascade neural network for sequential principal component analysis and its learning. System Technologies. Dnepropetrovsk. No. 1(72), V. 2. pp. 140–147. (in Russian)
  10. Suykens, J. A. K., Gestel, T. V., Brabanter, J. D., Moor, B. D., Vandewalle, J. (2002) Least Squares Support Vector Machines. Singapore: World Scientific, 294 p.
© Odessa National Polytechnic University, 2014. Any use of information from the site is permitted only with a link to the source.