Please use this identifier to cite or link to this item: https://oldena.lpnu.ua/handle/ntb/52503
Title: Using Stacking Approaches for Machine Learning Models
Authors: Pavlyshenko, Bohdan
Affiliation: Ivan Franko National University of Lviv
Bibliographic description (Ukraine): Pavlyshenko B. Using Stacking Approaches for Machine Learning Models / Bohdan Pavlyshenko // Data stream mining and processing : proceedings of the IEEE second international conference, 21-25 August 2018, Lviv. — Lviv : Lviv Polytechnic Publishing House, 2018. — P. 255–258. — (Dynamic Data Mining & Data Stream Mining).
Bibliographic description (International): Pavlyshenko B. Using Stacking Approaches for Machine Learning Models / Bohdan Pavlyshenko // Data stream mining and processing : proceedings of the IEEE second international conference, 21-25 August 2018, Lviv. — Lviv Polytechnic Publishing House, 2018. — P. 255–258. — (Dynamic Data Mining & Data Stream Mining).
Is part of: Data stream mining and processing : proceedings of the IEEE second international conference, 2018
Conference/Event: IEEE second international conference "Data stream mining and processing"
Issue Date: 28-Feb-2018
Publisher: Lviv Polytechnic Publishing House
Place of the edition/event: Lviv
Temporal Coverage: 21-25 August 2018, Lviv
Keywords: machine learning
stacking
forecasting
classification
regression
Number of pages: 4
Page range: 255-258
Start page: 255
End page: 258
Abstract: In this paper, we study the use of the stacking approach for building ensembles of machine learning models. The cases of time series forecasting and logistic regression have been considered. The results show that stacking techniques can improve the performance of predictive models in the considered cases.
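Note: this record carries only the abstract, so the following is a minimal illustrative sketch of stacked generalization as introduced by Wolpert [4], not the authors' actual code. It is written in Python with scikit-learn (an assumed toolchain; the Kaggle scripts cited in the references use R and XGBoost). Out-of-fold predictions of the level-0 models serve as input features for a level-1 logistic regression, matching the classification case mentioned in the abstract.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic data stands in for the competition datasets used in the paper.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

base_models = [RandomForestClassifier(n_estimators=100, random_state=0),
               GradientBoostingClassifier(random_state=0)]

# Level-1 features are built from out-of-fold predictions to avoid leakage.
kf = KFold(n_splits=5, shuffle=True, random_state=0)
train_meta = np.zeros((len(X_train), len(base_models)))
test_meta = np.zeros((len(X_test), len(base_models)))
for j, model in enumerate(base_models):
    for train_idx, val_idx in kf.split(X_train):
        model.fit(X_train[train_idx], y_train[train_idx])
        train_meta[val_idx, j] = model.predict_proba(X_train[val_idx])[:, 1]
    # Refit on the full training set to produce test-set predictions.
    model.fit(X_train, y_train)
    test_meta[:, j] = model.predict_proba(X_test)[:, 1]

# Level-1 meta-model: logistic regression on the stacked predictions.
stacker = LogisticRegression()
stacker.fit(train_meta, y_train)
print("stacked AUC:", roc_auc_score(y_test, stacker.predict_proba(test_meta)[:, 1]))
for j, model in enumerate(base_models):
    print("base model", j, "AUC:", roc_auc_score(y_test, test_meta[:, j]))

Training the meta-model only on out-of-fold predictions, while the level-0 models are refit on the full training set for scoring, is the standard guard against the leakage that would occur if a base model predicted on data it had already seen.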
URI: https://ena.lpnu.ua/handle/ntb/52503
Copyright owner: © Lviv Polytechnic National University, 2018
URL for reference material: http://kaggle.com
http://www.kaggle.com/c/rossmann-store-sales
https://www.kaggle.com/c/grupo-bimbo-inventory-demand
https://www.kaggle.com/c/grupo-bimbo-inventory-demand/discussion/23863
https://www.kaggle.com/bpavlyshenko/bimbo-xgboost-r-script-lb-0-457
https://www.kaggle.com/c/bosch-production-line-performance
https://www.kaggle.com/c/bosch-production-line-performance/forums/t/24065/the-magical-feature-from-lb-0-3-to-0-4
https://www.kaggle.com/mmueller/bosch-production-line-performance/road-2-0-4
https://www.kaggle.com/alexxanderlarko/bosch-production-line-performance/road-2-0-4-featureset
http://sourceforge.net/projects/mcmc-jags/files/Manuals/3.x/jags_user_manual.pdf
References: [1] Kaggle: Your Home for Data Science. URL: http://kaggle.com
[2] B. M. Pavlyshenko, "Linear, machine learning and probabilistic approaches for time series analysis," in IEEE First International Conference on Data Stream Mining & Processing (DSMP), Lviv, Ukraine, pp. 377-381, August 23-27, 2016.
[3] "Rossmann Store Sales", Kaggle.com. URL: http://www.kaggle.com/c/rossmann-store-sales
[4] D. H. Wolpert, "Stacked generalization," Neural Networks, 5(2), pp. 241-259, 1992.
[5] Kaggle competition "Grupo Bimbo Inventory Demand". URL: https://www.kaggle.com/c/grupo-bimbo-inventory-demand
[6] Kaggle competition "Grupo Bimbo Inventory Demand", discussion. URL: https://www.kaggle.com/c/grupo-bimbo-inventory-demand/discussion/23863
[7] T. Chen and C. Guestrin, "XGBoost: A scalable tree boosting system," in Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, ACM, 2016, pp. 785-794.
[8] Kaggle competition "Grupo Bimbo Inventory Demand", Bimbo XGBoost R script LB: 0.457. URL: https://www.kaggle.com/bpavlyshenko/bimbo-xgboost-r-script-lb-0-457
[9] J. Friedman, "Greedy function approximation: a gradient boosting machine," Annals of Statistics, 29(5), pp. 1189-1232, 2001.
[10] J. Friedman, "Stochastic gradient boosting," Computational Statistics & Data Analysis, 38(4), pp. 367-378, 2002.
[11] Kaggle competition "Bosch Production Line Performance". URL: https://www.kaggle.com/c/bosch-production-line-performance
[12] B. Pavlyshenko, "Machine learning, linear and Bayesian models for logistic regression in failure detection problems," in IEEE International Conference on Big Data (Big Data), Washington, D.C., USA, pp. 2046-2050, December 5-8, 2016.
[13] Kaggle competition "Bosch Production Line Performance", The Magical Feature: from LB 0.3- to 0.4+. URL: https://www.kaggle.com/c/bosch-production-line-performance/forums/t/24065/the-magical-feature-from-lb-0-3-to-0-4
[14] Kaggle competition "Bosch Production Line Performance", Road-2-0.4+. URL: https://www.kaggle.com/mmueller/bosch-production-line-performance/road-2-0-4
[15] Kaggle competition "Bosch Production Line Performance", Road-2-0.4+ -> FeatureSet++. URL: https://www.kaggle.com/alexxanderlarko/bosch-production-line-performance/road-2-0-4-featureset
[16] J. Kruschke, Doing Bayesian Data Analysis: A Tutorial with R, JAGS, and Stan. Academic Press, 2014.
[17] M. Plummer, JAGS Version 3.4.0 user manual. URL: http://sourceforge.net/projects/mcmc-jags/files/Manuals/3.x/jags_user_manual.pdf
Content type: Conference Abstract
Appears in Collections: Data stream mining and processing : proceedings of the IEEE second international conference

Files in This Item:
File | Size | Format
2018_Pavlyshenko_B-Using_Stacking_Approaches_255-258.pdf | 239.3 kB | Adobe PDF
2018_Pavlyshenko_B-Using_Stacking_Approaches_255-258__COVER.png | 549.8 kB | image/png