Abstract
Recent work in natural language processing (NLP) has yielded appealing results from scaling model parameters and training data; however, using only scale to improve performance means that resource consumption also grows. Such resources include data, time, storage, or energy, all of which are naturally limited and unevenly distributed. This motivates research into efficient methods that require fewer resources to achieve similar results. This survey synthesizes and relates current methods and findings in efficient NLP. We aim both to provide guidance for conducting NLP under limited resources and to point toward promising research directions for developing more efficient methods.
| Original language | English |
|---|---|
| Pages (from-to) | 826-860 |
| Number of pages | 35 |
| Journal | Transactions of the Association for Computational Linguistics |
| Volume | 11 |
| DOIs | |
| State | Published - 2023 |
All Science Journal Classification (ASJC) codes
- Communication
- Human-Computer Interaction
- Linguistics and Language
- Computer Science Applications
- Artificial Intelligence