Relative entropy

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

Shannon’s entropy measure quantifies our ignorance of a system in terms of surprise and probability. The measure of relative entropy, or Kullback-Leibler divergence, quantifies our added surprise when coding a posterior probability distribution P using a code derived from a prior probability distribution Q. This measure is relevant for interdisciplinary research, where crossing disciplinary boundaries requires methodological and conceptual bridges. I present three constructive uses of this measure in linguistics (e.g., the measure of semantic transparency in linguistic compounds), sport (i.e., modeling the behavior of a soccer team), and the study of human interactions (i.e., identifying significant romantic relations in a successful TV series) and conclude by pointing to further uses in interdisciplinary research.
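For discrete distributions, the quantity described in the abstract is the standard Kullback-Leibler divergence, D_KL(P‖Q) = Σ_x P(x) log₂ (P(x)/Q(x)), measured in bits when the logarithm is base 2. As a minimal illustrative sketch (not taken from the chapter; the example distributions are hypothetical), the following Python snippet computes this "extra surprise" of using a code built for a prior Q to encode outcomes actually drawn from a posterior P:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(P || Q) in bits.

    p, q: sequences of probabilities over the same discrete outcomes.
    Terms with p[i] == 0 contribute nothing; if q[i] == 0 where p[i] > 0,
    the divergence is infinite (a code built for Q cannot encode that outcome).
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue
        if qi == 0:
            return math.inf
        total += pi * math.log2(pi / qi)
    return total

# Hypothetical example: a posterior P measured against a uniform prior Q.
P = [0.6, 0.3, 0.1]
Q = [1/3, 1/3, 1/3]
print(kl_divergence(P, Q))  # ~0.29 bits of extra coding cost
```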

Original language: American English
Title of host publication: Elgar Encyclopedia of Interdisciplinarity and Transdisciplinarity
Editors: Frédéric Darbellay
Publisher: Edward Elgar Publishing Ltd.
Pages: 426-429
Number of pages: 4
ISBN (Electronic): 9781035317967
ISBN (Print): 9781035317950
DOIs
State: Published - 1 Jan 2024

Keywords

  • Entropy
  • Interdisciplinary research
  • Kullback-Leibler divergence
  • Relative entropy

All Science Journal Classification (ASJC) codes

  • General Social Sciences
  • General Arts and Humanities
