Sparse weighted voting classifier selection and its linear programming relaxations

Abstract
We consider the problem of minimizing the number of misclassifications of a weighted voting classifier, plus a penalty proportional to the number of nonzero weights. We first prove that its optimum is at least as hard to approximate as the minimum disagreement halfspace problem for a wide range of penalty parameter values. After formulating the problem as a mixed integer program (MIP), we show that common "soft margin" linear programming (LP) formulations for constructing weighted voting classifiers are equivalent to an LP relaxation of our formulation. We show that this relaxation is very weak, with a potentially exponential integrality gap. However, we also show that augmenting the relaxation with certain valid inequalities tightens it considerably, yielding a linear upper bound on the gap for all values of the penalty parameter that exceed a reasonable threshold. Unlike earlier techniques proposed for similar problems (Bradley and Mangasarian (1998) [4], Weston et al. (2003) [14]), our approach provides bounds on the optimal solution value.
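To make the abstract's setup concrete, the following is a minimal sketch of a soft-margin LP of the kind the paper relates to the relaxed MIP: base-classifier weights `w` play the role of the relaxed nonzero-indicator variables (bounded in [0, 1]), slacks `xi` absorb margin violations, and the objective trades off the penalty `C * sum(w)` against `sum(xi)`. The variable names, the toy data, and the use of `scipy.optimize.linprog` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from scipy.optimize import linprog


def soft_margin_lp(H, y, C):
    """Sketch of a soft-margin LP relaxation (illustrative, not the paper's exact model):
        minimize  C * sum(w) + sum(xi)
        s.t.      y_i * (H w)_i + xi_i >= 1   for each sample i
                  0 <= w_j <= 1,  xi_i >= 0

    H : (n, m) matrix of base-classifier predictions in {-1, +1}
    y : (n,) vector of labels in {-1, +1}
    C : penalty parameter on the (relaxed) weight variables
    """
    n, m = H.shape
    # Decision variables are [w_1..w_m, xi_1..xi_n].
    c = np.concatenate([np.full(m, C), np.ones(n)])
    # Margin constraints rewritten as A_ub @ x <= b_ub:
    #   -y_i * H[i, :] @ w - xi_i <= -1
    A_ub = np.hstack([-(y[:, None] * H), -np.eye(n)])
    b_ub = -np.ones(n)
    bounds = [(0, 1)] * m + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:m], res.x[m:], res.fun


# Toy instance: 3 base classifiers, 4 samples.
H = np.array([[ 1,  1, -1],
              [ 1, -1,  1],
              [-1,  1,  1],
              [-1, -1, -1]], dtype=float)
y = np.array([1, 1, 1, -1], dtype=float)
w, xi, obj = soft_margin_lp(H, y, C=0.1)
```

On this toy instance the LP is satisfied with zero slack only when all three weights are at their upper bound, which hints at the paper's point: the relaxed weight variables need not behave like the binary "is this classifier used?" indicators of the MIP.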
| Original language | American English |
|---|---|
| Pages (from-to) | 481-486 |
| Number of pages | 6 |
| Journal | Information Processing Letters |
| Volume | 112 |
| Issue number | 12 |
| DOIs | |
| State | Published - 30 Jun 2012 |
Keywords
- Computational complexity
- Hardness of approximation
- Integrality gap
- Machine learning
- Sparsity
- Weighted voting classification
All Science Journal Classification (ASJC) codes
- Theoretical Computer Science
- Signal Processing
- Information Systems
- Computer Science Applications