Cost efficient gradient boosting

Sven Peter, Ferran Diego, Fred A. Hamprecht, Boaz Nadler

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Many applications require learning classifiers or regressors that are both accurate and cheap to evaluate. Prediction cost can be drastically reduced if the learned predictor is constructed such that on the majority of the inputs, it uses cheap features and fast evaluations. The main challenge is to do so with little loss in accuracy. In this work we propose a budget-aware strategy based on deep boosted regression trees. In contrast to previous approaches to learning with cost penalties, our method can grow very deep trees that on average are nonetheless cheap to compute. We evaluate our method on a number of datasets and find that it outperforms the current state of the art by a large margin. Our algorithm is easy to implement and its learning time is comparable to that of the original gradient boosting. Source code is made available at http://github.com/svenpeter42/LightGBM-CEGB.
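To make the core idea concrete, the sketch below shows one way a prediction-cost penalty can enter the usual second-order split gain of gradient-boosted regression trees: a split is accepted only if its loss reduction outweighs a weighted cost term, and because the penalty scales with the number of examples reaching the node, splits deep in the tree remain cheap. This is a minimal illustration of the general principle under assumed names (penalized_split_gain, feature_cost, lam), not the exact criterion of the paper or of the LightGBM-CEGB code linked above.

import numpy as np

def penalized_split_gain(grad, hess, left_mask, feature_cost, lam, reg_lambda=1.0):
    """Second-order boosting split gain minus a prediction-cost penalty.

    grad, hess   : per-sample gradients/Hessians of the loss at the current model,
                   restricted to the samples that reach the node being split
    left_mask    : boolean mask of samples routed to the left child
    feature_cost : acquisition cost of the feature used by this split (assumed given)
    lam          : trade-off between accuracy and prediction cost
    """
    def leaf_score(g, h):
        # Optimal squared leaf contribution, as in standard gradient boosting.
        return g.sum() ** 2 / (h.sum() + reg_lambda)

    gain = (leaf_score(grad[left_mask], hess[left_mask])
            + leaf_score(grad[~left_mask], hess[~left_mask])
            - leaf_score(grad, hess))
    # The penalty is proportional to the number of samples that reach this node,
    # so deep splits touching few samples incur little cost and can still be made.
    penalty = lam * feature_cost * left_mask.size
    return gain - penalty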

Original language: English
Title of host publication: Proceedings of the 31st International Conference on Neural Information Processing Systems (NIPS 2017)
Editors: U. V. Luxburg, I. Guyon, S. Bengio, H. Wallach, R. Fergus
Pages: 1550–1560
Number of pages: 11
State: Published - Dec 2017
Event: 31st Conference on Neural Information Processing Systems, Long Beach Convention Center, Long Beach, United States
Duration: 4 Dec 2017 – 9 Dec 2017
Conference number: 31st

Publication series

Name: Advances in Neural Information Processing Systems
Volume: 30
ISSN (Print): 1049-5258

Conference

Conference: 31st Conference on Neural Information Processing Systems
Abbreviated title: NIPS'17
Country/Territory: United States
City: Long Beach
Period: 4/12/17 – 9/12/17
