Abstract
We examine the Bayes-consistency of a recently proposed 1-nearest-neighbor-based multiclass learning algorithm. This algorithm is derived from sample compression bounds and enjoys the statistical advantages of tight, fully empirical generalization bounds, as well as the algorithmic advantages of a faster runtime and memory savings. We prove that this algorithm is strongly Bayes-consistent in metric spaces with finite doubling dimension: the first consistency result for an efficient nearest-neighbor sample compression scheme. Rather surprisingly, we discover that this algorithm continues to be Bayes-consistent even in a certain infinite-dimensional setting, in which the basic measure-theoretic conditions on which classic consistency proofs hinge are violated. This is all the more surprising, since it is known that k-NN is not Bayes-consistent in this setting. We pose several challenging open problems for future research.
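The abstract refers to a 1-NN learner that classifies using a compressed subset of the training sample. As a rough illustration of the general idea only (compress the sample to a net of prototypes, then run 1-NN over the prototypes), here is a minimal sketch; the greedy net construction, the Euclidean metric, and all function names are illustrative assumptions, not the authors' exact procedure or guarantees.

```python
import math


def greedy_net(points, radius, dist=math.dist):
    """Greedily pick centers so every point lies within `radius`
    of some chosen center (a simple radius-net of the sample)."""
    centers = []
    for p in points:
        if all(dist(p, c) > radius for c in centers):
            centers.append(p)
    return centers


def compress_1nn(X, y, radius, dist=math.dist):
    """Compress the labeled sample: keep one labeled prototype
    per net center (here, the center's own sample label)."""
    centers = greedy_net(X, radius, dist)
    return [(c, y[X.index(c)]) for c in centers]


def predict_1nn(protos, x, dist=math.dist):
    """Classify x by the label of its nearest prototype."""
    return min(protos, key=lambda cl: dist(cl[0], x))[1]
```

For example, compressing `X = [(0,0), (0.1,0), (1,1), (1.1,1)]` with labels `[0, 0, 1, 1]` at radius 0.5 keeps only two prototypes, and 1-NN over those two prototypes reproduces the labels on nearby queries. Choosing the net radius (and hence the compressed-set size) in a data-dependent way is exactly where the sample compression bounds in the paper come in.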
| Original language | English |
|---|---|
| Title of host publication | Advances in Neural Information Processing Systems 30 (NIPS 2017) |
| Editors | I Guyon, U von Luxburg, S Bengio, H Wallach, R Fergus, S Vishwanathan, R Garnett |
| Number of pages | 11 |
| State | Published - 2017 |
| Event | 31st Conference on Neural Information Processing Systems, Long Beach Convention Center, Long Beach, United States |
| Duration | 4 Dec 2017 → 9 Dec 2017 |
| Conference number | 31st |
Publication series
| Name | Advances in Neural Information Processing Systems |
|---|---|
| Volume | 30 |
| ISSN (Print) | 1049-5258 |
Conference
| Conference | 31st Conference on Neural Information Processing Systems |
|---|---|
| Abbreviated title | NIPS'17 |
| Country/Territory | United States |
| City | Long Beach |
| Period | 4/12/17 → 9/12/17 |
Fingerprint
Dive into the research topics of 'Nearest-Neighbor Sample Compression: Efficiency, Consistency, Infinite Dimensions'.