Abstract
Existing algorithms for subgroup discovery with numerical targets do not optimize the error or target-variable dispersion of the groups they find. This often leads to unreliable or inconsistent statements about the data, rendering practical applications, especially in scientific domains, futile. Therefore, here we extend the optimistic estimator framework for optimal subgroup discovery to a new class of objective functions: we show how tight estimators can be computed efficiently for all functions that are determined by subgroup size (non-decreasing dependence), the subgroup median value, and a dispersion measure around the median (non-increasing dependence). In the important special case where dispersion is measured using the mean absolute deviation from the median, this novel approach yields a linear-time algorithm. Empirical evaluation on a wide range of datasets shows that, when used within branch-and-bound search, this approach is highly efficient and indeed discovers subgroups with much smaller errors.
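To make the described function class concrete, the following is a minimal sketch of a *hypothetical* objective of that form: non-decreasing in subgroup size, dependent on the subgroup median, and non-increasing in the dispersion, here measured as the mean absolute deviation (MAD) from the median. The function name, the size exponent, and the specific combination of terms are illustrative assumptions, not the paper's exact objective.

```python
import statistics

def mad_from_median(values):
    """Mean absolute deviation of the values around their median."""
    med = statistics.median(values)
    return sum(abs(v - med) for v in values) / len(values)

def dispersion_corrected_objective(subgroup, population_median, size_weight=0.5):
    """Hypothetical objective of the class described in the abstract:
    non-decreasing in subgroup size, determined by the subgroup median,
    and non-increasing in the MAD around the median.
    (Illustrative form only; the exact objective is defined in the paper.)"""
    med = statistics.median(subgroup)
    size_term = len(subgroup) ** size_weight          # rewards larger subgroups
    central_term = med - population_median            # rewards elevated medians
    return size_term * central_term - mad_from_median(subgroup)
```

With equal size and median, the subgroup whose target values are more tightly concentrated scores higher, which is exactly the dispersion correction the abstract motivates: for example, `dispersion_corrected_objective([4, 5, 6], 3)` exceeds `dispersion_corrected_objective([2, 5, 8], 3)` because only the MAD differs.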
| Field | Value |
|---|---|
| Original language | American English |
| Pages (from-to) | 1391-1418 |
| Number of pages | 28 |
| Journal | Data Mining and Knowledge Discovery |
| Volume | 31 |
| Issue number | 5 |
| DOIs | |
| State | Published - 1 Sep 2017 |
| Externally published | Yes |
Keywords
- Branch-and-bound search
- Local pattern discovery
- Subgroup discovery
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Computer Networks and Communications