DC Field | Value | Language |
dc.contributor.author | Pavlyshenko, Bohdan | |
dc.coverage.temporal | 21-25 August 2018, Lviv | |
dc.date.accessioned | 2020-06-19T12:05:23Z | - |
dc.date.available | 2020-06-19T12:05:23Z | - |
dc.date.created | 2018-02-28 | |
dc.date.issued | 2018-02-28 | |
dc.identifier.citation | Pavlyshenko B. Using Stacking Approaches for Machine Learning Models / Bohdan Pavlyshenko // Data stream mining and processing : proceedings of the IEEE second international conference, 21-25 August 2018, Lviv. — Львів : Lviv Politechnic Publishing House, 2018. — P. 255–258. — (Dynamic Data Mining & Data Stream Mining). | |
dc.identifier.isbn | © Національний університет „Львівська політехніка“, 2018 | |
dc.identifier.uri | https://ena.lpnu.ua/handle/ntb/52503 | - |
dc.description.abstract | In this paper, we study the use of a stacking approach for building ensembles of machine learning models. The cases of time series forecasting and logistic regression have been considered. The results show that by using stacking techniques we can improve the performance of predictive models in the considered cases. | |
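Note: the abstract describes stacked generalization (cf. reference [4]) only in general terms. The following is a minimal sketch of out-of-fold stacking for a regression task, assuming Python with NumPy and scikit-learn; the synthetic data set and the choice of base and meta models are illustrative assumptions, not the paper's actual setup.

    # Minimal sketch of stacked generalization (Wolpert, 1992) for regression.
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import KFold
    from sklearn.metrics import mean_squared_error

    X, y = make_regression(n_samples=1000, n_features=20, noise=10.0, random_state=0)
    X_train, X_test = X[:800], X[800:]
    y_train, y_test = y[:800], y[800:]

    base_models = [RandomForestRegressor(n_estimators=100, random_state=0),
                   GradientBoostingRegressor(random_state=0)]

    # Level-1 features: out-of-fold predictions of the base models on the train set.
    kf = KFold(n_splits=5, shuffle=True, random_state=0)
    train_meta = np.zeros((len(X_train), len(base_models)))
    test_meta = np.zeros((len(X_test), len(base_models)))
    for j, model in enumerate(base_models):
        for tr_idx, val_idx in kf.split(X_train):
            model.fit(X_train[tr_idx], y_train[tr_idx])
            train_meta[val_idx, j] = model.predict(X_train[val_idx])
        model.fit(X_train, y_train)          # refit on the full training set
        test_meta[:, j] = model.predict(X_test)

    # Level-2 (meta) model is trained on the out-of-fold predictions.
    meta_model = Ridge(alpha=1.0).fit(train_meta, y_train)
    rmse = mean_squared_error(y_test, meta_model.predict(test_meta)) ** 0.5
    print("stacked RMSE:", rmse)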
dc.format.extent | 255-258 | |
dc.language.iso | en | |
dc.publisher | Lviv Politechnic Publishing House | |
dc.relation.ispartof | Data stream mining and processing : proceedings of the IEEE second international conference, 2018 | |
dc.relation.uri | http://kaggle.com | |
dc.relation.uri | http://www.kaggle.com/c/rossmann-store-sales | |
dc.relation.uri | https://www.kaggle.com/c/grupo-bimbo-inventory-demand | |
dc.relation.uri | https://www.kaggle.com/c/grupo-bimbo-inventory-demand/discussion/23863 | |
dc.relation.uri | https://www.kaggle.com/bpavlyshenko/bimbo-xgboost-r-script-lb-0-457 | |
dc.relation.uri | https://www.kaggle.com/c/bosch-production-line-performance | |
dc.relation.uri | https://www.kaggle.com/c/bosch-production-line-performance/forums/t/24065/the-magical-feature-from-lb-0-3-to-0-4 | |
dc.relation.uri | https://www.kaggle.com/mmueller/bosch-production-line-performance/road-2-0-4 | |
dc.relation.uri | https://www.kaggle.com/alexxanderlarko/bosch-production-line-performance/road-2-0-4-featureset | |
dc.relation.uri | http://sourceforge.net/projects/mcmcjags/files/Manuals/3.x/jags_user_manual.pdf | |
dc.subject | machine learning | |
dc.subject | stacking | |
dc.subject | forecasting | |
dc.subject | classification | |
dc.subject | regression | |
dc.title | Using Stacking Approaches for Machine Learning Models | |
dc.type | Conference Abstract | |
dc.rights.holder | © Національний університет “Львівська політехніка”, 2018 | |
dc.contributor.affiliation | Ivan Franko National University of Lviv | |
dc.format.pages | 4 | |
dc.identifier.citationen | Pavlyshenko B. Using Stacking Approaches for Machine Learning Models / Bohdan Pavlyshenko // Data stream mining and processing : proceedings of the IEEE second international conference, 21-25 August 2018, Lviv. — Lviv : Lviv Politechnic Publishing House, 2018. — P. 255–258. — (Dynamic Data Mining & Data Stream Mining). | |
dc.relation.references | [1] Kaggle: Your Home for Data Science. URL: http://kaggle.com | |
dc.relation.references | [2] B. M. Pavlyshenko. “Linear, machine learning and probabilistic approaches for time series analysis,” in IEEE First International Conference on Data Stream Mining & Processing (DSMP), Lviv, Ukraine, pp. 377-381, August 23-27, 2016. | |
dc.relation.references | [3] “Rossmann Store Sales”, Kaggle.com. URL: http://www.kaggle.com/c/rossmann-store-sales | |
dc.relation.references | [4] D. H. Wolpert. “Stacked generalization,” Neural Networks, 5(2), pp. 241-259, 1992. | |
dc.relation.references | [5] Kaggle competition “Grupo Bimbo Inventory Demand”. URL: https://www.kaggle.com/c/grupo-bimbo-inventory-demand | |
dc.relation.references | [6] Kaggle competition “Grupo Bimbo Inventory Demand”. URL: https://www.kaggle.com/c/grupo-bimbo-inventory-demand/discussion/23863 | |
dc.relation.references | [7] T. Chen and C. Guestrin. “XGBoost: A scalable tree boosting system,” in Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, ACM, 2016, pp. 785-794. | |
dc.relation.references | [8] Kaggle competition “Grupo Bimbo Inventory Demand”, Bimbo XGBoost R script LB:0.457. URL: https://www.kaggle.com/bpavlyshenko/bimbo-xgboost-r-script-lb-0-457 | |
dc.relation.references | [9] J. Friedman. “Greedy function approximation: a gradient boosting machine,” Annals of Statistics, 29(5), pp. 1189-1232, 2001. | |
dc.relation.references | [10] J. Friedman. “Stochastic gradient boosting,” Computational Statistics & Data Analysis, 38(4), pp. 367-378, 2002. | |
dc.relation.references | [11] Kaggle competition “Bosch Production Line Performance”. URL: https://www.kaggle.com/c/bosch-production-line-performance | |
dc.relation.references | [12] B. Pavlyshenko. “Machine learning, linear and Bayesian models for logistic regression in failure detection problems,” in IEEE International Conference on Big Data (Big Data), Washington D.C., USA, pp. 2046-2050, December 5-8, 2016. | |
dc.relation.references | [13] Kaggle competition “Bosch Production Line Performance”. The Magical Feature: from LB 0.3- to 0.4+. URL: https://www.kaggle.com/c/bosch-production-line-performance/forums/t/24065/the-magical-feature-from-lb-0-3-to-0-4 | |
dc.relation.references | [14] Kaggle competition “Bosch Production Line Performance”. Road-2-0.4+. URL: https://www.kaggle.com/mmueller/bosch-production-line-performance/road-2-0-4 | |
dc.relation.references | [15] Kaggle competition “Bosch Production Line Performance”. Road-2-0.4+ -> FeatureSet++. URL: https://www.kaggle.com/alexxanderlarko/bosch-production-line-performance/road-2-0-4-featureset | |
dc.relation.references | [16] John Kruschke. Doing Bayesian data analysis: A tutorial with R, JAGS, and Stan. Academic Press, 2014. | |
dc.relation.references | [17] Martyn Plummer. JAGS Version 3.4.0 user manual. URL: http://sourceforge.net/projects/mcmcjags/files/Manuals/3.x/jags_user_manual.pdf | |
dc.relation.referencesen | [1] Kaggle: Your Home for Data Science. URL: http://kaggle.com | |
dc.relation.referencesen | [2] B. M. Pavlyshenko. "Linear, machine learning and probabilistic approaches for time series analysis," in IEEE First International Conference on Data Stream Mining & Processing (DSMP), Lviv, Ukraine, pp. 377-381, August 23-27, 2016. | |
dc.relation.referencesen | [3] "Rossmann Store Sales", Kaggle.com. URL: http://www.kaggle.com/c/rossmann-store-sales | |
dc.relation.referencesen | [4] D. H. Wolpert. "Stacked generalization," Neural Networks, 5(2), pp. 241-259, 1992. | |
dc.relation.referencesen | [5] Kaggle competition "Grupo Bimbo Inventory Demand". URL: https://www.kaggle.com/c/grupo-bimbo-inventory-demand | |
dc.relation.referencesen | [6] Kaggle competition "Grupo Bimbo Inventory Demand". URL: https://www.kaggle.com/c/grupo-bimbo-inventory-demand/discussion/23863 | |
dc.relation.referencesen | [7] T. Chen and C. Guestrin. "XGBoost: A scalable tree boosting system," in Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, ACM, 2016, pp. 785-794. | |
dc.relation.referencesen | [8] Kaggle competition "Grupo Bimbo Inventory Demand", Bimbo XGBoost R script LB:0.457. URL: https://www.kaggle.com/bpavlyshenko/bimbo-xgboost-r-script-lb-0-457 | |
dc.relation.referencesen | [9] J. Friedman. "Greedy function approximation: a gradient boosting machine," Annals of Statistics, 29(5), pp. 1189-1232, 2001. | |
dc.relation.referencesen | [10] J. Friedman. "Stochastic gradient boosting," Computational Statistics & Data Analysis, 38(4), pp. 367-378, 2002. | |
dc.relation.referencesen | [11] Kaggle competition "Bosch Production Line Performance". URL: https://www.kaggle.com/c/bosch-production-line-performance | |
dc.relation.referencesen | [12] B. Pavlyshenko. "Machine learning, linear and Bayesian models for logistic regression in failure detection problems," in IEEE International Conference on Big Data (Big Data), Washington D.C., USA, pp. 2046-2050, December 5-8, 2016. | |
dc.relation.referencesen | [13] Kaggle competition "Bosch Production Line Performance". The Magical Feature: from LB 0.3- to 0.4+. URL: https://www.kaggle.com/c/bosch-production-line-performance/forums/t/24065/the-magical-feature-from-lb-0-3-to-0-4 | |
dc.relation.referencesen | [14] Kaggle competition "Bosch Production Line Performance". Road-2-0.4+. URL: https://www.kaggle.com/mmueller/bosch-production-line-performance/road-2-0-4 | |
dc.relation.referencesen | [15] Kaggle competition "Bosch Production Line Performance". Road-2-0.4+ -> FeatureSet++. URL: https://www.kaggle.com/alexxanderlarko/bosch-production-line-performance/road-2-0-4-featureset | |
dc.relation.referencesen | [16] John Kruschke. Doing Bayesian data analysis: A tutorial with R, JAGS, and Stan. Academic Press, 2014. | |
dc.relation.referencesen | [17] Martyn Plummer. JAGS Version 3.4.0 user manual. URL: http://sourceforge.net/projects/mcmcjags/files/Manuals/3.x/jags_user_manual.pdf | |
dc.citation.conference | IEEE second international conference "Data stream mining and processing" | |
dc.citation.spage | 255 | |
dc.citation.epage | 258 | |
dc.coverage.placename | Львів | |
Appears in Collections: | Data stream mining and processing : proceedings of the IEEE second international conference |