Abstract
A key feature of neural network architectures is their ability to support the simultaneous interaction among large numbers of units in the learning and processing of representations. However, how the richness of such interactions trades off against the ability of a network to simultaneously carry out multiple independent processes - a salient limitation in many domains of human cognition - remains largely unexplored. In this paper we use a graph-theoretic analysis of network architecture to address this question, where tasks are represented as edges in a bipartite graph G = (A ∪ B, E). We define a new measure of the multitasking capacity of such networks, based on the assumptions that tasks that need to be multitasked rely on independent resources, i.e., form a matching, and that tasks can be multitasked without interference if they form an induced matching. Our main result is an inherent tradeoff between the multitasking capacity and the average degree of the network that holds regardless of the network architecture. We also extend these results to networks of depth greater than 2. On the positive side, we demonstrate that networks that are random-like (e.g., locally sparse) can have desirable multitasking properties. Our results shed light on the parallel-processing limitations of neural systems and provide insights that may be useful for the analysis and design of parallel architectures.
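To make the central definition concrete, the following is a minimal sketch (not code from the paper) of the induced-matching condition the abstract describes: tasks are edges of a bipartite graph G = (A ∪ B, E), and a set of tasks can be multitasked without interference when it forms an induced matching in G. The graph, node names, and function names here are illustrative assumptions.

```python
# Illustrative sketch of the induced-matching condition; the graph and
# names are hypothetical, not taken from the paper.
from itertools import combinations

def is_matching(tasks):
    """A set of edges is a matching if no two edges share an endpoint."""
    endpoints = [v for edge in tasks for v in edge]
    return len(endpoints) == len(set(endpoints))

def is_induced_matching(edges, tasks):
    """A matching M is induced in G if G has no edge, other than those
    in M, between endpoints of two distinct edges of M."""
    if not is_matching(tasks):
        return False
    edge_set = {frozenset(e) for e in edges}
    for (a1, b1), (a2, b2) in combinations(tasks, 2):
        # Any "cross" edge between two task edges would cause interference.
        for cross in (frozenset((a1, b2)), frozenset((a2, b1))):
            if cross in edge_set:
                return False
    return True

# Hypothetical toy network: input units a0..a2, output units b0..b2.
E = [("a0", "b0"), ("a0", "b1"), ("a1", "b1"), ("a2", "b2")]
print(is_induced_matching(E, [("a0", "b0"), ("a1", "b1")]))  # False: edge (a0, b1) interferes
print(is_induced_matching(E, [("a0", "b0"), ("a2", "b2")]))  # True: no cross edges in G
```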
| Original language | English |
| --- | --- |
| Pages (from-to) | 2101-2110 |
| Number of pages | 10 |
| Journal | Advances in Neural Information Processing Systems |
| Volume | 2017-December |
| State | Published - 2017 |
| Event | 31st Annual Conference on Neural Information Processing Systems, NIPS 2017, Long Beach, United States, 4 Dec 2017 - 9 Dec 2017 |
All Science Journal Classification (ASJC) codes
- Computer Networks and Communications
- Information Systems
- Signal Processing