Cross-Validated Variable Selection in Tree-Based Methods Improves Predictive Performance

Research output: Contribution to journal › Article › peer-review

Abstract

Recursive partitioning methods producing tree-like models are a long-standing staple of predictive modeling. However, a fundamental flaw in the partitioning (or splitting) rule of commonly used tree-building methods precludes them from treating different types of variables equally. This most clearly manifests in these methods' inability to properly utilize categorical variables with a large number of categories, which are ubiquitous in the new age of big data. We propose a framework for splitting that uses leave-one-out (LOO) cross validation (CV) to select the splitting variable, then performs a regular split (in our case, following CART's approach) for the selected variable. The most important consequence of our approach is that categorical variables with many categories can be safely used in tree building and are only chosen if they contribute to predictive power. We demonstrate in extensive simulations and real data analyses that our splitting approach significantly improves the performance of both single-tree models and ensemble methods that utilize trees. Importantly, we design an algorithm for LOO splitting variable selection which, under reasonable assumptions, does not substantially increase the overall computational complexity compared to CART for two-class classification.
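The core idea of the abstract can be illustrated with a minimal sketch: score each candidate variable by its leave-one-out CV misclassification rate (predicting each held-out point from a split fitted on the remaining points), and only then split on the winning variable. This is not the paper's efficient algorithm; it is a naive illustration restricted to categorical predictors, and the function names (`loo_cv_error`, `select_split_variable`) are invented for this example.

```python
from collections import Counter

def loo_cv_error(x, y):
    """Naive LOO-CV misclassification rate for splitting on one categorical
    variable: each held-out point is predicted by the majority class of its
    category among the remaining n-1 points. A many-category noise variable
    (e.g. a unique ID) scores badly here, whereas its resubstitution
    impurity would look deceptively good."""
    n = len(y)
    errors = 0
    for i in range(n):
        # Majority vote within point i's category, excluding point i itself.
        votes = Counter(y[j] for j in range(n) if j != i and x[j] == x[i])
        if votes:
            pred = votes.most_common(1)[0][0]
        else:
            # Category unseen in the remaining data: fall back to the
            # overall majority class of the n-1 training points.
            pred = Counter(y[j] for j in range(n) if j != i).most_common(1)[0][0]
        errors += int(pred != y[i])
    return errors / n

def select_split_variable(X, y):
    """Pick the variable (index into X, a list of columns) whose LOO-CV
    error is lowest; the actual split on it would then follow CART."""
    return min(range(len(X)), key=lambda k: loo_cv_error(X[k], y))

# Demo: an informative binary variable vs. a pure-noise "ID" variable
# with one category per observation.
y = [0] * 20 + [1] * 20
x_informative = list(y)          # perfectly predictive, 2 categories
x_noise_id = list(range(40))     # 40 unique categories, no signal
print(select_split_variable([x_informative, x_noise_id], y))  # → 0
```

On this toy data the unique-ID variable gets an LOO error of 1.0 (every held-out point falls back to the wrong overall majority), so the informative variable is selected, which is the behavior the abstract claims for CV-based selection.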

Original language: English
Article number: 7776975
Pages (from-to): 2142-2153
Number of pages: 12
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 39
Issue number: 11
DOIs
State: Published - 1 Nov 2017

Keywords

  • Classification and regression trees
  • gradient boosting
  • random forests

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Vision and Pattern Recognition
  • Computational Theory and Mathematics
  • Artificial Intelligence
  • Applied Mathematics

