https://oldena.lpnu.ua/handle/ntb/46145
Title: | Hecht–Nielsen theorem for a modified neural network with diagonal synaptic connections |
Other Titles: | Теорема Хехт–Нільсена для модифікованої нейронної мережі з діагональними синаптичними зв’язками |
Authors: | Peleshchak, R.; Lytvyn, V.; Peleshchak, I.; Doroshenko, M.; Olyvko, R. |
Affiliation: | Ivan Franko Drogobych State Pedagogical University; Lviv Polytechnic National University |
Bibliographic description: | Hecht–Nielsen theorem for a modified neural network with diagonal synaptic connections / R. Peleshchak, V. Lytvyn, I. Peleshchak, M. Doroshenko, R. Olyvko // Mathematical Modeling and Computing. — Lviv : Lviv Polytechnic Publishing House, 2019. — Vol. 6. — No. 1. — P. 101–108. |
Is part of: | Mathematical Modeling and Computing, Vol. 6, No. 1, 2019 |
Issue: | 1 |
Issue Date: | 26-Feb-2019 |
Publisher: | Lviv Polytechnic Publishing House |
Place of the edition/event: | Lviv |
UDC: | 004.81; 004.891.2; 004.891.3 |
Keywords: | neural network; matrix diagonalization; aggregation operation; function approximation |
Number of pages: | 8 |
Page range: | 101-108 |
Start page: | 101 |
End page: | 108 |
Abstract: | The paper proposes a modified three-layer neural network whose architecture has only diagonal synaptic connections between neurons; as a result, a transformed Hecht–Nielsen theorem is obtained. This architecture of a three-layer neural network (m = 2n + 1 is the number of neurons in the hidden layer, n is the number of input signals) makes it possible to approximate a function of n variables with a given accuracy ε > 0 using a single aggregation operation, whereas a three-layer neural network with both diagonal and non-diagonal synaptic connections between neurons approximates a function of n variables by means of two aggregation operations. In addition, diagonalization of the matrix of synaptic connections reduces the required computing resources and, accordingly, the time needed to adjust the weight coefficients of the synaptic connections during training of the neural network. |
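For orientation, the classical Kolmogorov superposition form behind Hecht–Nielsen's existence theorem can be stated as follows. This is the standard textbook statement (with m = 2n + 1 outer terms), not a formula quoted from the paper itself:

```latex
% Kolmogorov superposition (the form used in Hecht-Nielsen's 1987
% existence theorem): any continuous f on [0,1]^n is realized by a
% three-layer network with m = 2n + 1 hidden neurons.
\[
  f(x_1, \dots, x_n) \;=\; \sum_{q=1}^{2n+1} \Phi_q\!\left( \sum_{p=1}^{n} \psi_{pq}(x_p) \right)
\]
% The inner sum over p is one aggregation operation and the outer sum
% over q is the second; the paper's diagonal modification eliminates
% the inner aggregation by zeroing the off-diagonal connections.
```

The computational saving claimed in the abstract can be illustrated with a minimal sketch, assuming a NumPy implementation with random weights and a tanh activation; the layer sizes, the activation, and the zeroing pattern below are illustrative assumptions, not the authors' construction:

```python
import numpy as np

n = 4             # number of input signals
m = 2 * n + 1     # hidden-layer size from the Hecht-Nielsen theorem

rng = np.random.default_rng(0)
x = rng.random(n)

# Full synaptic matrix: every input feeds every hidden neuron, so each
# pre-activation aggregates n weighted inputs (m * n multiply-adds).
W_full = rng.random((m, n))
h_full = np.tanh(W_full @ x)

# Diagonal synaptic matrix: w_ij = 0 for i != j, so only min(m, n)
# weights survive and the inner aggregation over inputs disappears --
# each connected hidden neuron sees exactly one weighted input signal.
k = min(m, n)
W_diag = np.zeros((m, n))
W_diag[np.arange(k), np.arange(k)] = rng.random(k)
h_diag = np.tanh(W_diag @ x)

print("weights to train, full matrix:    ", W_full.size)  # m * n
print("weights to train, diagonal matrix:", k)            # n
```

The smaller number of trainable weights is what the abstract refers to as the reduced computing resource and the shorter time for adjusting the weight coefficients during training.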
URI: | https://ena.lpnu.ua/handle/ntb/46145 |
Copyright owner: | CMM IAPMM NAS © 2019 Lviv Polytechnic National University |
References: |
1. Kolmogorov A. N. On the representation of continuous functions of several variables by superpositions of continuous functions of a smaller number of variables. DAN USSR. 108, 2 (1956).
2. Kolmogorov A. N. On the representation of continuous functions of several variables by superpositions of continuous functions of one variable and addition. DAN USSR. 114, 953–956 (1957).
3. Vitushkin A. G., Khenkin G. M. Linear superposition of functions. Russ. Math. Surv. 22, 77–125 (1967).
4. Slupecki J. The criterion of completeness of many-valued systems of propositional logic. C. R. Classe III. 32 (1–3), 102–109 (1939).
5. McCulloch W. S., Pitts W. A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics. 5 (4), 115–133 (1943).
6. Rosenblatt F. Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms. Spartan Books, Washington, DC (1961).
7. Minsky M., Papert S. Perceptrons. M.I.T. Press (1969).
8. Cybenko G. Approximation by superpositions of a sigmoidal function. Math. Control Signals Systems. 2, 303–314 (1989).
9. Funahashi K. On the approximate realization of continuous mappings by neural networks. Neural Networks. 2 (3), 183–192 (1989).
10. Hornik K., Stinchcombe M., White H. Multilayer feedforward networks are universal approximators. Neural Networks. 2 (5), 359–366 (1989).
11. Hecht-Nielsen R. Kolmogorov's mapping neural network existence theorem. IEEE First Annual Int. Conf. on Neural Networks, San Diego. 3, 11–13 (1987).
12. Alekseev D. V. Approximation of functions of several variables by neural networks. Fundamental and Applied Mathematics. 15 (3), 9–21 (2009).
13. Penrose R. Shadows of the Mind: A Search for the Missing Science of Consciousness. Oxford University Press, Inc., New York, NY, USA (1994).
14. Lytvyn V., Vysotska V., Peleshchak I., Rishnyak I., Peleshchak R. Time dependence of the output signal morphology for nonlinear oscillator neuron based on Van der Pol model. International Journal of Intelligent Systems and Applications. 4, 8–17 (2018).
15. Peleshchak R. M., Lytvyn V. V., Peleshchak I. R. Dynamics of a nonlinear oscillatory neuron under the action of an external non-stationary signal. Radio Electronics, Computer Science, Control. 4, 97–105 (2017).
16. Lytvyn V., Peleshchak I., Peleshchak R. The compression of the input images in neural network that using method diagonalization the matrices of synaptic weight connections. 2017 2nd International Conference on Advanced Information and Communication Technologies (AICT). 66–70 (2017).
17. Lytvyn V., Peleshchak I., Peleshchak R. Increase the speed of detection and recognition of computer attacks in combined diagonalized neural networks. 2017 4th International Scientific-Practical Conference "Problems of Infocommunications. Science and Technology". 152–155 (2017). |
Content type: | Article |
Appears in Collections: | Mathematical Modeling And Computing. – 2019. – Vol. 6, No. 1 |
File | Description | Size | Format
---|---|---|---
2019v6n1_Peleshchak_R-Hechth-Nielsen_theorem_101-108.pdf | | 927.85 kB | Adobe PDF
2019v6n1_Peleshchak_R-Hechth-Nielsen_theorem_101-108__COVER.png | | 436.84 kB | image/png