Please use this identifier to cite or link to this item: https://oldena.lpnu.ua/handle/ntb/52477
Full metadata record
DC Field	Value
dc.contributor.author	Zhernova, Polina
dc.contributor.author	Deineko, Anastasiia
dc.contributor.author	Bodyanskiy, Yevgeniy
dc.contributor.author	Riepin, Vladyslav
dc.coverage.temporal	21-25 August 2018, Lviv
dc.date.accessioned	2020-06-19T12:04:59Z
dc.date.available	2020-06-19T12:04:59Z
dc.date.created	2018-02-28
dc.date.issued	2018-02-28
dc.identifier.citation	Adaptive Kernel Data Streams Clustering Based on Neural Networks Ensembles in Conditions of Uncertainty About Amount and Shapes of Clusters / Polina Zhernova, Anastasiia Deineko, Yevgeniy Bodyanskiy, Vladyslav Riepin // Data stream mining and processing : proceedings of the IEEE second international conference, 21-25 August 2018, Lviv. — Львів : Lviv Politechnic Publishing House, 2018. — P. 7–12. — (Big Data & Data Science Using Intelligent Approaches).
dc.identifier.isbn	© Національний університет „Львівська політехніка“, 2018
dc.identifier.uri	https://ena.lpnu.ua/handle/ntb/52477
dc.description.abstract	A neural-network approach to the data stream clustering task is proposed for data that are fed for processing in online mode under the assumption of uncertainty about the number and shapes of clusters. The main idea of the approach is based on kernel clustering and on an ensemble of neural networks consisting of T. Kohonen’s self-organizing maps. Each clustering neural network in the ensemble contains a different number of neurons, and the number of clusters it produces is tied to the quantity of these neurons. All ensemble members process, in parallel, the information that is sequentially fed to the system. Experimental results have confirmed that the system under consideration can be used to solve a wide range of Data Mining tasks when data sets are processed in online mode. (An illustrative code sketch of this scheme is given after the metadata record below.)
dc.format.extent	7-12
dc.language.iso	en
dc.publisher	Lviv Politechnic Publishing House
dc.relation.ispartof	Data stream mining and processing : proceedings of the IEEE second international conference, 2018
dc.relation.uri	http://www.ics.uci.edu/mlearn/MLRepository.html
dc.subject	clustering
dc.subject	X-means method
dc.subject	ensemble of neural networks
dc.subject	self-learning
dc.subject	T. Kohonen’s neural network
dc.title	Adaptive Kernel Data Streams Clustering Based on Neural Networks Ensembles in Conditions of Uncertainty About Amount and Shapes of Clusters
dc.type	Conference Abstract
dc.rights.holder	© Національний університет “Львівська політехніка”, 2018
dc.contributor.affiliation	Kharkiv National University of Radio Electronics
dc.format.pages	6
dc.identifier.citationen	Adaptive Kernel Data Streams Clustering Based on Neural Networks Ensembles in Conditions of Uncertainty About Amount and Shapes of Clusters / Polina Zhernova, Anastasiia Deineko, Yevgeniy Bodyanskiy, Vladyslav Riepin // Data stream mining and processing : proceedings of the IEEE second international conference, 21-25 August 2018, Lviv. — Lviv Politechnic Publishing House, 2018. — P. 7–12. — (Big Data & Data Science Using Intelligent Approaches).
dc.relation.references	[1] G. Gan, Ch. Ma and J. Wu, Data Clustering: Theory, Algorithms and Applications. Philadelphia: SIAM, 2007.
dc.relation.references	[2] R. Xu and D. C. Wunsch, Clustering. IEEE Press Series on Computational Intelligence. Hoboken, NJ: John Wiley & Sons, Inc., 2009.
dc.relation.references	[3] C. C. Aggarwal and C. K. Reddy, Data Clustering: Algorithms and Applications. Boca Raton: CRC Press, 2014.
dc.relation.references	[4] D. Pelleg and A. Moore, “X-means: extending K-means with efficient estimation of the number of clusters,” 17th Int. Conf. on Machine Learning, Morgan Kaufmann, San Francisco, pp. 727-730, 2000.
dc.relation.references	[5] T. Ishioka, “An expansion of X-means for automatically determining the optimal number of clusters,” 4th IASTED Int. Conf. Computational Intelligence, Calgary, Alberta, pp. 91-96, 2005.
dc.relation.references	[6] L. Rutkowski, Computational Intelligence: Methods and Techniques. Berlin-Heidelberg: Springer-Verlag, 2008.
dc.relation.references	[7] C. Mumford and L. Jain, Computational Intelligence: Collaboration, Fusion and Emergence. Berlin: Springer-Verlag, 2009.
dc.relation.references	[8] R. Kruse, C. Borgelt, F. Klawonn, C. Moewes, M. Steinbrecher and P. Held, Computational Intelligence: A Methodological Introduction. Berlin: Springer, 2013.
dc.relation.references	[9] K.-L. Du and M. N. S. Swamy, Neural Networks and Statistical Learning. London: Springer-Verlag, 2014.
dc.relation.references	[10] T. Kohonen, Self-Organizing Maps. Berlin: Springer-Verlag, 1995.
dc.relation.references	[11] A. Strehl and J. Ghosh, “Cluster ensembles – A knowledge reuse framework for combining multiple partitions,” Journal of Machine Learning Research, pp. 583-617, 2002.
dc.relation.references	[12] A. Topchy, A. K. Jain, and W. Punch, “Clustering ensembles: models of consensus and weak partitions,” IEEE Transactions on Pattern Analysis and Machine Intelligence, pp. 1866-1881, 2005.
dc.relation.references	[13] H. Alizadeh, B. Minaei-Bidgoli, and H. Parvin, “To improve the quality of cluster ensembles by selecting a subset of base clusters,” Journal of Experimental & Theoretical Artificial Intelligence, pp. 127-150, 2013.
dc.relation.references	[14] M. Charkhabi, T. Dhot, and S. A. Mojarad, “Cluster ensembles, majority vote, voter eligibility and privileged voters,” Int. Journal of Machine Learning and Computing, vol. 4, no. 3, pp. 275-278, 2014.
dc.relation.references	[15] Ye. V. Bodyanskiy, A. A. Deineko, P. Ye. Zhernova, and V. O. Riepin, “Adaptive modification of X-means method based on the ensemble of the T. Kohonen’s clustering neural networks,” VI Int. Sci. Conf. “Information Management Systems and Technologies”, Odessa, Ukraine, pp. 202-204, 2017.
dc.relation.references	[16] J. C. Bezdek, J. Keller, R. Krishnapuram and N. Pal, Fuzzy Models and Algorithms for Pattern Recognition and Image Processing. The Handbooks of Fuzzy Sets, vol. 4. Dordrecht, Netherlands: Kluwer/Springer, 1999.
dc.relation.references	[17] T. M. Cover, “Geometrical and statistical properties of systems of linear inequalities with applications in pattern recognition,” IEEE Trans. on Electronic Computers, no. 14, pp. 326-334, 1965.
dc.relation.references	[18] M. Girolami, “Mercer kernel-based clustering in feature space,” IEEE Trans. on Neural Networks, vol. 13, no. 3, pp. 780-784, 2002.
dc.relation.references	[19] D. MacDonald and C. Fyfe, “Clustering in data space and feature space,” ESANN'2002 Proc. European Symp. on Artificial Neural Networks, Bruges (Belgium), pp. 137-142, 2002.
dc.relation.references	[20] M. Girolami, “Mercer kernel-based clustering in feature space,” IEEE Trans. on Neural Networks, vol. 13, no. 3, pp. 780-784, 2002.
dc.relation.references	[21] F. Camastra and A. Verri, “A novel kernel method for clustering,” IEEE Trans. on Pattern Analysis and Machine Intelligence, no. 5, pp. 801-805, 2005.
dc.relation.references	[22] Ye. V. Bodyanskiy, A. A. Deineko, and Y. V. Kutsenko, “On-line kernel clustering based on the general regression neural network and T. Kohonen’s self-organizing map,” Automatic Control and Computer Sciences, vol. 51, no. 1, pp. 55-62, 2017.
dc.relation.references	[23] D. L. Davies and D. W. Bouldin, “A cluster separation measure,” IEEE Transactions on Pattern Analysis and Machine Intelligence, no. 4, pp. 224-227, 1979.
dc.relation.references	[24] P. M. Murphy and D. Aha, UCI Repository of Machine Learning Databases. Irvine, CA: University of California, Department of Information and Computer Science. URL: http://www.ics.uci.edu/mlearn/MLRepository.html
dc.citation.conference	IEEE second international conference "Data stream mining and processing"
dc.citation.spage	7
dc.citation.epage	12
dc.coverage.placename	Львів
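
Illustrative sketch of the scheme described in the abstract. This is a minimal Python sketch, not the authors' implementation: it assumes an ensemble of simple winner-take-all Kohonen-style layers with different numbers of neurons, each fed the same observations one at a time, and afterwards selects the member whose partition scores best under the Davies-Bouldin index (cf. reference [23]). The names OnlineKohonenLayer and stream_cluster, the parameter values, and the plain Euclidean distance (in place of the paper's kernelized one) are all assumptions made for illustration.

import numpy as np
from sklearn.metrics import davies_bouldin_score


class OnlineKohonenLayer:
    # One ensemble member: a set of prototype neurons updated by
    # winner-take-all self-learning, one observation at a time.
    def __init__(self, n_neurons, n_features, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(size=(n_neurons, n_features))  # prototype vectors
        self.lr = lr

    def partial_fit(self, x):
        # find the winning neuron and move its prototype toward x
        j = int(np.argmin(np.linalg.norm(self.w - x, axis=1)))
        self.w[j] += self.lr * (x - self.w[j])
        return j

    def predict(self, X):
        # assign every row of X to its nearest prototype
        d = np.linalg.norm(X[:, None, :] - self.w[None, :, :], axis=2)
        return d.argmin(axis=1)


def stream_cluster(stream, n_features, candidate_sizes=(2, 3, 4, 5)):
    # every ensemble member sees every observation of the stream
    members = [OnlineKohonenLayer(m, n_features, seed=m) for m in candidate_sizes]
    seen = []
    for x in stream:
        x = np.asarray(x, dtype=float)
        seen.append(x)
        for member in members:
            member.partial_fit(x)
    X = np.vstack(seen)
    # keep only members whose partition uses at least two clusters,
    # then pick the one with the best (lowest) Davies-Bouldin index
    valid = [m for m in members if len(np.unique(m.predict(X))) > 1]
    best = min(valid, key=lambda m: davies_bouldin_score(X, m.predict(X)))
    return best, best.predict(X)


if __name__ == "__main__":
    rng = np.random.default_rng(42)
    # synthetic 4-D stream drawn from three well-separated Gaussian clusters
    data = np.vstack([rng.normal(loc=c, scale=0.3, size=(100, 4)) for c in (0.0, 2.0, 4.0)])
    rng.shuffle(data)
    best, labels = stream_cluster(iter(data), n_features=4)
    print("neurons in the selected member:", best.w.shape[0])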
Appears in Collections: Data stream mining and processing : proceedings of the IEEE second international conference

Files in This Item:
File	Size	Format
2018_Zhernova_P-Adaptive_Kernel_Data_Streams_7-12.pdf	393.02 kB	Adobe PDF
2018_Zhernova_P-Adaptive_Kernel_Data_Streams_7-12__COVER.png	551.87 kB	image/png