Abstract
Detecting changes in data-streams is an important part of enhancing learning quality in dynamic environments. We devise a procedure for detecting concept drifts in data-streams that relies on analyzing the empirical loss of learning algorithms. Our method is based on obtaining statistics from the loss distribution by reusing the data multiple times via resampling. We present theoretical guarantees for the proposed procedure based on the stability of the underlying learning algorithms. Experimental results show that the method has high recall and precision, and performs well in the presence of noise.
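As a rough illustration of the resampling idea described in the abstract, the sketch below runs a permutation-style test on per-example losses from a reference window and a recent window, flagging drift when the observed increase in mean loss is extreme relative to resampled splits. The function name, windowing, and threshold are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def resampling_drift_test(losses_ref, losses_cur, n_resamples=1000, alpha=0.05, seed=None):
    """Illustrative resampling test on empirical losses (not the paper's exact method).

    losses_ref: per-example losses on a reference window
    losses_cur: per-example losses on the most recent window
    Returns (drift_detected, p_value).
    """
    rng = np.random.default_rng(seed)
    losses_ref = np.asarray(losses_ref, dtype=float)
    losses_cur = np.asarray(losses_cur, dtype=float)

    # Observed statistic: increase in mean empirical loss on the recent window.
    observed_gap = losses_cur.mean() - losses_ref.mean()

    pooled = np.concatenate([losses_ref, losses_cur])
    n_ref = len(losses_ref)

    # Reuse the data many times: each resample assumes no drift and
    # splits the pooled losses into two windows at random.
    resampled_gaps = np.empty(n_resamples)
    for i in range(n_resamples):
        perm = rng.permutation(pooled)
        resampled_gaps[i] = perm[n_ref:].mean() - perm[:n_ref].mean()

    # Empirical p-value: fraction of resampled gaps at least as large as the observed one.
    p_value = (1 + np.sum(resampled_gaps >= observed_gap)) / (1 + n_resamples)
    return p_value < alpha, p_value
```

In practice the per-example losses would come from whatever learner is running on the stream; the statistic and the resampling scheme shown here are one simple choice among many.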
| Original language | English |
| --- | --- |
| Pages (from-to) | 2682-2694 |
| Number of pages | 13 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 32 |
| State | Published - 2014 |
| Event | 31st International Conference on Machine Learning, ICML 2014, Beijing, China, 21 Jun 2014 → 26 Jun 2014 |
All Science Journal Classification (ASJC) codes
- Artificial Intelligence
- Computer Networks and Communications
- Software