Linking motif sequences with tale types by machine learning

Nir Ofek, Sándor Darányi, Lior Rokach

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Units of narrative content called motifs constitute sequences, also known as tale types. However, whereas the dependency of tale types on their constituent motifs is clear, the strength of this bond has not been measured thus far. Based on the observation that differences between such motif sequences are reminiscent of nucleotide and chromosome mutations in genetics, i.e., constitute a "narrative DNA", we used sequence mining methods from bioinformatics to learn more about the nature of tale types as a corpus. 94% of the Aarne-Thompson-Uther catalogue (2249 tale types in 7050 variants) was listed as individual motif strings based on the Thompson Motif Index and scanned for similar subsequences. Next, using machine learning algorithms, we built and evaluated a classifier that predicts the tale type of a new motif sequence. Our findings indicate that, owing to the size of the available samples, the classification model was best able to predict magic tales, novelles, and jokes.
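The classification step described in the abstract can be sketched minimally: each tale variant is represented as a string of Thompson Motif Index codes, and a classifier is trained to map such strings to tale-type labels. Everything below is illustrative only — the motif codes, labels, and the bag-of-motifs plus Naive Bayes choice are assumptions, not the authors' actual model or data:

```python
# Minimal sketch: predict a tale type from a motif sequence.
# Motif codes and labels below are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Each variant is a space-separated string of motif codes.
variants = [
    "D1234 D5678 F300",   # hypothetical magic-tale variants
    "D1234 F300 G500",
    "J1700 J1800 X900",   # hypothetical joke variants
    "J1700 X900 X910",
]
tale_types = ["magic", "magic", "joke", "joke"]

# Bag-of-motifs features feeding a simple Naive Bayes classifier.
clf = make_pipeline(
    CountVectorizer(token_pattern=r"\S+"),  # treat each motif code as a token
    MultinomialNB(),
)
clf.fit(variants, tale_types)

# Predict the tale type of an unseen motif sequence.
print(clf.predict(["D5678 F300"])[0])  # → magic
```

The paper's actual pipeline works on the full ATU/TMI data and uses bioinformatics-style subsequence mining rather than this toy bag-of-motifs setup; the sketch only shows the shape of the learning problem.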

Original language: American English
Title of host publication: 2013 Workshop on Computational Models of Narrative, CMN 2013
Publisher: Schloss Dagstuhl - Leibniz-Zentrum für Informatik GmbH, Dagstuhl Publishing
Number of pages: 17
ISBN (Print): 9783939897576
State: Published - 1 Jan 2013
Event: 2013 Workshop on Computational Models of Narrative, CMN 2013 - Hamburg, Germany
Duration: 4 Aug 2013 - 6 Aug 2013

Publication series

Name: OpenAccess Series in Informatics


Conference: 2013 Workshop on Computational Models of Narrative, CMN 2013


Keywords

  • Machine learning
  • Motifs
  • Narrative DNA
  • Tale types
  • Type-motif correlation

All Science Journal Classification (ASJC) codes

  • Geography, Planning and Development
  • Modelling and Simulation

