Explainable Shapley-Based Allocation (Student Abstract)

Meir Nizri, Noam Hazon, Amos Azaria

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Peer-reviewed


The Shapley value is one of the most important normative division schemes in cooperative game theory, satisfying several basic axioms. However, some allocations according to the Shapley value may seem unfair to humans. In this paper, we develop an automatic method that generates intuitive explanations for a Shapley-based payoff allocation, utilizing the basic axioms. Given a coalitional game, our method decomposes it into sub-games for which it is easy to generate verbal explanations, and shows that the given game is composed of these sub-games. Since the payoff allocation for each sub-game is perceived as fair, the Shapley-based payoff allocation for the given game should seem fair as well. We ran an experiment with 210 human participants and show that when our method is applied, humans perceive the Shapley-based payoff allocation as significantly fairer than when a general standard explanation is used.
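The decomposition argument above rests on the additivity axiom: if a game is the sum of sub-games, its Shapley value is the sum of the sub-games' Shapley values. A minimal Python sketch of this property, using brute-force enumeration of player orderings (the three-player game, player names, and characteristic functions are illustrative, not taken from the paper):

```python
from itertools import permutations

def shapley(players, v):
    """Shapley value: average marginal contribution over all player orderings."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            with_p = coalition | {p}
            phi[p] += v(with_p) - v(coalition)  # marginal contribution of p
            coalition = with_p
    return {p: phi[p] / len(orders) for p in players}

# Illustrative sub-games: v1 pays 1 whenever player 'a' joins the coalition;
# v2 pays each coalition its size (a fully symmetric game).
players = ['a', 'b', 'c']
v1 = lambda S: 1.0 if 'a' in S else 0.0
v2 = lambda S: float(len(S))
v_sum = lambda S: v1(S) + v2(S)  # the composed game

phi1, phi2, phi_sum = shapley(players, v1), shapley(players, v2), shapley(players, v_sum)

# Additivity axiom: the allocation for the composed game equals the sum of
# the sub-games' allocations, player by player.
assert all(abs(phi_sum[p] - (phi1[p] + phi2[p])) < 1e-9 for p in players)
```

Because each sub-game's allocation is easy to explain verbally (here: only 'a' creates value in v1; all players are symmetric in v2), additivity lets those explanations carry over to the composed game, which is the intuition the method exploits.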

Original language: English
Title of host publication: IAAI-22, EAAI-22, AAAI-22 Special Programs and Special Track, Student Papers and Demonstrations
Publisher: Association for the Advancement of Artificial Intelligence
Number of pages: 2
ISBN (electronic): 1577358767, 9781577358763
Publication status: Published - 30 Jun 2022
Event: 36th AAAI Conference on Artificial Intelligence, AAAI 2022 - Virtual, Online
Duration: 22 Feb 2022 - 1 Mar 2022

Publication series

Name: Proceedings of the 36th AAAI Conference on Artificial Intelligence, AAAI 2022


Conference: 36th AAAI Conference on Artificial Intelligence, AAAI 2022
City: Virtual, Online

ASJC Scopus subject areas

  • Artificial Intelligence
