Abstract
Shannon’s entropy measure quantifies our ignorance of a system in terms of surprise and probability. The measure of relative entropy, or Kullback-Leibler divergence, quantifies the extra surprise incurred when coding a posterior probability distribution P using a code derived from a prior probability distribution Q. This measure is relevant for interdisciplinary research, where crossing disciplinary boundaries requires methodological and conceptual bridges. I present three constructive usages of this measure in linguistics (e.g., measuring semantic transparency in linguistic compounds), sport (i.e., modeling the behavior of a soccer team), and the study of human interactions (i.e., identifying significant romantic relations in a successful TV series), and conclude by pointing to further usages in interdisciplinary research.
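The abstract's notion of "extra surprise" has a standard formula, D_KL(P ∥ Q) = Σᵢ pᵢ log₂(pᵢ/qᵢ), measured in bits when using base-2 logarithms. A minimal sketch (not from the entry itself; the distributions below are invented for illustration):

```python
import math

def kl_divergence(p, q):
    """Relative entropy D_KL(P || Q) in bits: the expected number of extra
    bits needed to code samples from P with a code optimized for Q.
    Terms with p_i == 0 contribute nothing by convention."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical posterior P and prior Q over three outcomes
p = [0.50, 0.25, 0.25]
q = [0.25, 0.25, 0.50]

extra_bits = kl_divergence(p, q)   # 0.25 bits of extra coding cost
zero_cost = kl_divergence(p, p)    # 0: no surprise when P == Q
```

Note that the divergence is asymmetric (D_KL(P ∥ Q) ≠ D_KL(Q ∥ P) in general) and is zero exactly when the two distributions coincide, which is what makes it usable as a measure of how far a posterior has moved from a prior.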
| Original language | American English |
|---|---|
| Title of host publication | Elgar Encyclopedia of Interdisciplinarity and Transdisciplinarity |
| Editors | Frédéric Darbellay |
| Publisher | Edward Elgar Publishing Ltd. |
| Pages | 426-429 |
| Number of pages | 4 |
| ISBN (Electronic) | 9781035317967 |
| ISBN (Print) | 9781035317950 |
| State | Published - 1 Jan 2024 |
Keywords
- Entropy
- Interdisciplinary research
- Kullback-Leibler divergence
- Relative entropy
All Science Journal Classification (ASJC) codes
- General Social Sciences
- General Arts and Humanities