Analyzing Transformers in Embedding Space

Guy Dar, Mor Geva, Ankit Gupta, Jonathan Berant

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Understanding Transformer-based models has attracted significant attention, as they lie at the heart of recent technological advances across machine learning. While most interpretability methods rely on running models over inputs, recent work has shown that an input-independent approach, where parameters are interpreted directly without a forward/backward pass, is feasible for some Transformer parameters and for two-layer attention networks. In this work, we present a conceptual framework where all parameters of a trained Transformer are interpreted by projecting them into the embedding space, that is, the space of vocabulary items they operate on. Focusing mostly on GPT-2, we provide diverse evidence to support our argument. First, we present an empirical analysis showing that parameters of both pretrained and fine-tuned models can be interpreted in embedding space. Second, we present two applications of our framework: (a) aligning the parameters of different models that share a vocabulary, and (b) constructing a classifier without training by “translating” the parameters of a fine-tuned classifier to parameters of a different model that was only pretrained. Overall, our findings show that, at least in part, we can abstract away model specifics and understand Transformers in the embedding space.
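The projection the abstract describes can be sketched in a few lines: a parameter vector is multiplied by the embedding matrix, yielding scores over vocabulary items whose top entries give a human-readable interpretation. The sketch below uses random toy matrices; the dimensions, the `vocab` names, and the choice of vector are illustrative stand-ins, not the paper's actual GPT-2 setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (stand-ins for GPT-2's hidden size 768 and ~50K vocabulary)
d, vocab_size = 8, 20
vocab = [f"tok{i}" for i in range(vocab_size)]

# E: embedding matrix (|V| x d); w: some parameter vector of the model,
# e.g. a single row of a feed-forward "value" matrix
E = rng.standard_normal((vocab_size, d))
w = rng.standard_normal(d)

# Project the parameter vector into embedding space:
# a score for every vocabulary item, no forward pass over inputs needed
logits = E @ w

# The top-scoring vocabulary items give a readable interpretation of w
top_k = np.argsort(logits)[::-1][:3]
print([vocab[i] for i in top_k])
```

The same idea extends to whole matrices: projecting both sides of a parameter matrix through the embedding matrix yields an interaction table between vocabulary items.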

Original language: English
Title of host publication: Long Papers
Publisher: Association for Computational Linguistics (ACL)
Pages: 16124-16170
Number of pages: 47
ISBN (Electronic): 9781959429722
Publication status: Published - 2023
Event: 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023 - Toronto, Canada
Duration: 9 July 2023 – 14 July 2023

Publication series

Name: Proceedings of the Annual Meeting of the Association for Computational Linguistics
Volume: 1

Conference

Conference: 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023
Country/Territory: Canada
City: Toronto
Period: 09/07/23 – 14/07/23

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Linguistics and Language
  • Language and Linguistics

Fingerprint

Dive into the research topics of “Analyzing Transformers in Embedding Space”. Together they form a unique fingerprint.

Cite this