Please use this identifier to cite or link to this item: https://oldena.lpnu.ua/handle/ntb/52477
Title: Adaptive Kernel Data Streams Clustering Based on Neural Networks Ensembles in Conditions of Uncertainty About Amount and Shapes of Clusters
Authors: Zhernova, Polina
Deineko, Anastasiia
Bodyanskiy, Yevgeniy
Riepin, Vladyslav
Affiliation: Kharkiv National University of Radio Electronics
Bibliographic description (Ukraine): Adaptive Kernel Data Streams Clustering Based on Neural Networks Ensembles in Conditions of Uncertainty About Amount and Shapes of Clusters / Polina Zhernova, Anastasiia Deineko, Yevgeniy Bodyanskiy, Vladyslav Riepin // Data stream mining and processing : proceedings of the IEEE second international conference, 21-25 August 2018, Lviv. — Львів : Lviv Politechnic Publishing House, 2018. — P. 7–12. — (Big Data & Data Science Using Intelligent Approaches).
Bibliographic description (International): Adaptive Kernel Data Streams Clustering Based on Neural Networks Ensembles in Conditions of Uncertainty About Amount and Shapes of Clusters / Polina Zhernova, Anastasiia Deineko, Yevgeniy Bodyanskiy, Vladyslav Riepin // Data stream mining and processing : proceedings of the IEEE second international conference, 21-25 August 2018, Lviv. — Lviv Politechnic Publishing House, 2018. — P. 7–12. — (Big Data & Data Science Using Intelligent Approaches).
Is part of: Data stream mining and processing : proceedings of the IEEE second international conference, 2018
Conference/Event: IEEE second international conference "Data stream mining and processing"
Issue Date: 28-Feb-2018
Publisher: Lviv Politechnic Publishing House
Place of the edition/event: Lviv
Temporal Coverage: 21-25 August 2018, Lviv
Keywords: clustering
X-means method
ensemble of neural networks
self-learning
T. Kohonen’s neural network
Number of pages: 6
Page range: 7-12
Start page: 7
End page: 12
Abstract: A neural network approach to the data stream clustering task is proposed, in which data are fed for processing in an online mode under the assumption of uncertainty about the number and shapes of clusters. The main idea of this approach is based on kernel clustering and on ensembles of neural networks composed of T. Kohonen’s self-organizing maps. Each clustering neural network in the ensemble contains a different number of neurons, and the number of clusters is connected with the quantity of these neurons. All ensemble members process the information that is sequentially fed to the system in parallel. Experimental results have shown that the system under consideration can be used to solve a wide range of Data Mining tasks when data sets are processed in an online mode.
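The abstract describes the approach only at a high level. As a rough illustration (a minimal sketch under assumed details, not the authors' implementation), the Python fragment below shows how an ensemble of Kohonen-style competitive layers with different neuron counts could cluster the same stream in parallel, using kernel-induced distances and selecting the final partition with the Davies-Bouldin index cited in reference [23]. The Gaussian kernel width, the learning-rate schedule, all class and function names, and the synthetic stream are assumptions introduced here for illustration.

# Minimal sketch (assumptions throughout): ensemble of online Kohonen-style
# clusterers with different neuron counts, kernel-induced distances, and
# Davies-Bouldin selection of the best partition.
import numpy as np

def gaussian_kernel_dist(x, w, gamma=1.0):
    # Kernel-induced distance ||phi(x) - phi(w)||^2 for a Gaussian kernel:
    # k(x,x) + k(w,w) - 2*k(x,w) = 2 - 2*exp(-gamma*||x - w||^2)
    return 2.0 - 2.0 * np.exp(-gamma * np.sum((x - w) ** 2))

class OnlineKohonenClusterer:
    # One ensemble member: winner-takes-all self-learning with a decaying
    # learning rate, i.e. online prototype updating in the kernel-induced metric.
    def __init__(self, n_neurons, n_features, gamma=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(size=(n_neurons, n_features))  # prototype vectors
        self.gamma = gamma
        self.t = 0

    def partial_fit(self, x):
        self.t += 1
        d = np.array([gaussian_kernel_dist(x, w, self.gamma) for w in self.w])
        j = int(np.argmin(d))               # winning neuron
        eta = 1.0 / self.t                  # decaying learning rate (assumed schedule)
        self.w[j] += eta * (x - self.w[j])  # move the winner toward the sample
        return j

    def predict(self, X):
        return np.array([int(np.argmin(
            [gaussian_kernel_dist(x, w, self.gamma) for w in self.w])) for x in X])

def davies_bouldin(X, labels, centers):
    # Davies-Bouldin index [23]; lower is better. Computed in the input space
    # here for simplicity, which is itself a simplifying assumption.
    k = len(centers)
    s = np.array([np.mean(np.linalg.norm(X[labels == i] - centers[i], axis=1))
                  if np.any(labels == i) else 0.0 for i in range(k)])
    db = 0.0
    for i in range(k):
        ratios = [(s[i] + s[j]) / np.linalg.norm(centers[i] - centers[j])
                  for j in range(k) if j != i]
        db += max(ratios) if ratios else 0.0
    return db / k

# Usage sketch: every ensemble member sees every sample of a synthetic stream.
rng = np.random.default_rng(1)
stream = np.vstack([rng.normal(m, 0.3, size=(200, 2)) for m in ([0, 0], [3, 3], [0, 3])])
rng.shuffle(stream)

ensemble = [OnlineKohonenClusterer(k, n_features=2, seed=k) for k in range(2, 7)]
for x in stream:
    for member in ensemble:
        member.partial_fit(x)

scores = [davies_bouldin(stream, m.predict(stream), m.w) for m in ensemble]
best = ensemble[int(np.argmin(scores))]
print("selected number of clusters:", best.w.shape[0])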
URI: https://ena.lpnu.ua/handle/ntb/52477
Copyright owner: © Національний університет “Львівська політехніка”, 2018
URL for reference material: http://www.ics.uci.edu/mlearn/MLRepository.html
References: [1] G. Gan, Ch. Ma and J. Wu, Data Clustering: Theory, Algorithms and Applications. Philadelphia: SIAM, 2007.
[2] R. Xu and D. C. Wunsch, Clustering. IEEE Press Series on Computational Intelligence. Hoboken, NJ: John Wiley & Sons, Inc., 2009.
[3] C. C. Aggarwal and C. K. Reddy, Data Clustering. Algorithms and Application. Boca Raton: CRC Press, 2014.
[4] D. Pelleg and A. Moore, “X-means: extending K-means with efficient estimation of the number of clusters,” 17th Int. Conf. on Machine Learning, Morgan Kaufmann, San Francisco, pp. 727-730, 2000.
[5] T. Ishioka, “An expansion of X-means for automatically determining the optimal number of clusters,” 4th IASTED Int. Conf. Computational Intelligence, Calgary, Alberta, pp. 91-96, 2005.
[6] L. Rutkowski, Computational Intelligence. Methods and Techniques. Berlin-Heidelberg: Springer-Verlag, 2008.
[7] C. Mumford and L. Jain, Computational Intelligence. Collaboration, Fusion and Emergence. Berlin: Springer-Verlag, 2009.
[8] R. Kruse, C. Borgelt, F. Klawonn, C. Moewes, M. Steinbrecher and P. Held, Computational Intelligence. A Methodological Introduction. Berlin: Springer, 2013.
[9] K.-L. Du and M. N. S. Swamy, Neural Networks and Statistical Learning. London: Springer-Verlag, 2014.
[10] T. Kohonen, Self-Organizing Maps. Berlin: Springer-Verlag, 1995.
[11] A. Strehl, J. Ghosh, “Cluster ensembles – A knowledge reuse framework for combining multiple partitions,” Journal of Machine Learning Research, pp. 583-617, 2002.
[12] A. Topchy, A. K. Jain, and W. Punch, “Clustering ensembles: models of consensus and weak partitions,” IEEE Transactions on Pattern Analysis and Machine Intelligence, pp. 1866-1881, 2005.
[13] H. Alizadeh, B. Minaei-Bidgoli, and H. Parvin, “To improve the quality of cluster ensembles by selecting a subset of base clusters”, Journal of Experimental & Theoretical Artificial Intelligence, pp. 127-150, 2013.
[14] M. Charkhabi, T. Dhot, and S. A. Mojarad, “Cluster ensembles, majority vote, voter eligibility and privileged voters”, Int. Journal of Machine Learning and Computing, vol. 4, no. 3, pp. 275-278, 2014.
[15] Ye. V. Bodyanskiy, A. A. Deineko, P. Ye. Zhernova, and V. O. Riepin, “Adaptive modification of X-means method based on the ensemble of the T. Kohonen’s clustering neural networks,” VI Int. Sci. Conf. “Information Management Systems and Technologies”, Odessa, Ukraine, pp. 202-204, 2017.
[16] J. C. Bezdek, J. Keller, R. Krishnapuram and N. Pal, Fuzzy Models and Algorithms for Pattern Recognition and Image Processing. The Handbook of Fuzzy Sets. Kluwer, Dordrecht, Netherlands: Springer, vol. 4, 1999.
[17] T. M. Cover, “Geometrical and statistical properties of systems of linear inequalities with applications in pattern recognition,” IEEE Trans. on Electronic Computers, no. 14, pp. 326-334, 1965.
[18] M. Girolami, “Mercer kernel-based clustering in feature space”, IEEE Trans. on Neural Networks, vol. 13, no. 3, pp. 780-784, 2002.
[19] D. MacDonald and C. Fyfe, “Clustering in data space and feature space,” ESANN'2002 Proc. European Symp. on Artificial Neural Networks, Bruges (Belgium), pp. 137-142, 2002.
[20] M. Girolami, “Mercer kernel-based clustering in feature space,” IEEE Trans. on Neural Networks, vol. 13, no. 3, pp. 780-784, 2002.
[21] F. Camastra, and A. Verri, “A novel kernel method for clustering,” IEEE Trans. on Pattern Analysis and Machine Intelligence, no. 5, pp. 801-805, 2005.
[22] Ye. V. Bodyanskiy, A. A. Deineko, and Y. V. Kutsenko, “On-line kernel clustering based on the general regression neural network and T. Kohonen’s self-organizing map,” Automatic Control and Computer Sciences, 51(1), pp. 55-62, 2017.
[23] D. L. Davies and D. W. Bouldin, “A Cluster Separation Measure,” IEEE Transactions on Pattern Analysis and Machine Intelligence, no. 4, pp. 224-227, 1979.
[24] P. M. Murphy, and D. Aha, UCI Repository of machine learning databases. URL: http://www.ics.uci.edu/mlearn/MLRepository.html. Department of Information and Computer Science.
Content type: Conference Abstract
Appears in Collections:Data stream mining and processing : proceedings of the IEEE second international conference

Files in This Item:
File | Size | Format
2018_Zhernova_P-Adaptive_Kernel_Data_Streams_7-12.pdf | 393.02 kB | Adobe PDF
2018_Zhernova_P-Adaptive_Kernel_Data_Streams_7-12__COVER.png | 551.87 kB | image/png