Abstract
The enrichment of tabular datasets using external sources has gained significant attention in recent years. Existing solutions, however, either ignore external unstructured data completely or devise dataset-specific solutions. In this study, we propose Few-Shot Transformer based Enrichment (FeSTE), a generic and robust framework for the enrichment of tabular datasets using unstructured data. By training over multiple datasets, our approach is able to develop generic models that can be applied to additional datasets with minimal training (i.e., few-shot). Our approach is based on an adaptation of BERT, for which we present a novel fine-tuning approach that reformulates the tuples of the datasets as sentences. Our evaluation, conducted on 17 datasets, shows that FeSTE is able to generate high-quality features and significantly outperform existing fine-tuning solutions.
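The abstract describes reformulating dataset tuples as sentences so that BERT can be fine-tuned on tabular data. A minimal sketch of that idea is shown below; the column names and the verbalization template are illustrative assumptions, not taken from the paper itself.

```python
# Hypothetical sketch of tuple-to-sentence reformulation: a tabular tuple
# (one row) is verbalized into a natural-language sentence, which can then
# be fed to a transformer such as BERT for fine-tuning.
# The template and column names below are assumptions for illustration.

def tuple_to_sentence(row: dict) -> str:
    """Verbalize a tabular tuple (column -> value mapping) as one sentence."""
    clauses = [f"{col.replace('_', ' ')} is {val}" for col, val in row.items()]
    return "The " + ", the ".join(clauses) + "."

row = {"company": "Acme Corp", "sector": "retail", "revenue": "1.2M"}
sentence = tuple_to_sentence(row)
# e.g. "The company is Acme Corp, the sector is retail, the revenue is 1.2M."
```

In practice, sentences produced this way would be tokenized and used as transformer inputs in place of raw tuples; the exact template used by FeSTE may differ.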
| Original language | American English |
|---|---|
| Title of host publication | ACL 2022 - 60th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers) |
| Editors | Smaranda Muresan, Preslav Nakov, Aline Villavicencio |
| Publisher | Association for Computational Linguistics (ACL) |
| Pages | 1577-1591 |
| Number of pages | 15 |
| ISBN (Electronic) | 9781955917216 |
| State | Published - 1 Jan 2022 |
| Event | 60th Annual Meeting of the Association for Computational Linguistics, ACL 2022 - Dublin, Ireland Duration: 22 May 2022 → 27 May 2022 https://aclanthology.org/2022.acl-long.0/ |
Publication series
| Name | Proceedings of the Annual Meeting of the Association for Computational Linguistics |
|---|---|
| Volume | 1 |
Conference
| Conference | 60th Annual Meeting of the Association for Computational Linguistics, ACL 2022 |
|---|---|
| Country/Territory | Ireland |
| City | Dublin |
| Period | 22/05/22 → 27/05/22 |
| Internet address | https://aclanthology.org/2022.acl-long.0/ |
All Science Journal Classification (ASJC) codes
- Computer Science Applications
- Linguistics and Language
- Language and Linguistics
Fingerprint
Research topics: 'Few-Shot Tabular Data Enrichment Using Fine-Tuned Transformer Architectures'