TY - JOUR
T1 - Random Tree Model of Meaningful Memory
AU - Zhong, Weishun
AU - Can, Tankut
AU - Georgiou, Antonis
AU - Shnayderman, Ilya
AU - Katkov, Mikhail
AU - Tsodyks, Misha
PY - 2025/6/13
Y1 - 2025/6/13
N2 - Traditional studies of memory for meaningful narratives focus on specific stories and their semantic structures but do not address common quantitative features of recall across different narratives. We introduce a statistical ensemble of random trees to represent narratives as hierarchies of key points, where each node is a compressed representation of its descendant leaves, which are the original narrative segments. Recall from this hierarchical representation is constrained by working memory capacity. Our analytical solution aligns with observations from large-scale narrative recall experiments. Specifically, our model explains that (1) average recall length increases sublinearly with narrative length and (2) individuals summarize increasingly longer narrative segments in each recall sentence. Additionally, the theory predicts that for sufficiently long narratives, a universal, scale-invariant limit emerges, where the fraction of a narrative summarized by a single recall sentence follows a distribution independent of narrative length.
UR - https://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=wis-pure&SrcAuth=WosAPI&KeyUT=WOS:001512796900010&DestLinkType=FullRecord&DestApp=WOS_CPL
U2 - 10.1103/g1cz-wk1l
DO - 10.1103/g1cz-wk1l
M3 - Article
SN - 0031-9007
VL - 134
JO - Physical Review Letters
JF - Physical Review Letters
IS - 23
M1 - 237402
ER -