Abstract
Neural networks have long strived to emulate the learning capabilities of the human brain. While deep neural networks (DNNs) draw inspiration from the brain in their neuron design, their training methods diverge from biological foundations. Backpropagation, the primary training method for DNNs, requires substantial computational resources and fully labeled datasets, presenting major bottlenecks in development and application. This work demonstrates that by returning to biomimicry, specifically mimicking how the brain learns through pruning, we can solve various classical machine learning problems while using orders of magnitude fewer computational resources and no labels. Our experiments successfully personalized multiple speech recognition and image classification models, including ResNet50 on ImageNet, reaching approximately 70% sparsity while simultaneously improving model accuracy to around 90%, all without the limitations of backpropagation. This biologically inspired approach offers a promising avenue for efficient, personalized machine learning models in resource-constrained environments.
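The abstract describes learning by pruning alone, with no labels and no backpropagation, but does not spell out the pruning rule. Below is a minimal, hypothetical PyTorch sketch of one label-free, activity-driven pruning pass; the function name `prune_by_activity`, the synapse score (weight magnitude times mean presynaptic activation), and the restriction to `nn.Linear` layers are all assumptions for illustration, not the paper's published method.

```python
# Minimal sketch, NOT the paper's published algorithm: the scoring rule
# (|weight| x mean presynaptic activity) and all names are assumptions.
import torch
import torch.nn as nn

@torch.no_grad()
def prune_by_activity(model: nn.Module, unlabeled_loader, sparsity: float = 0.70):
    """Zero out the weakest synapses in every Linear layer using only
    unlabeled inputs -- no loss, no gradients, no backpropagation."""
    stats, hooks = {}, []
    for name, mod in model.named_modules():
        if isinstance(mod, nn.Linear):
            stats[name] = torch.zeros(mod.in_features)
            def hook(m, inp, out, key=name):
                # Accumulate mean |activation| at this layer's input;
                # assumes inputs of shape (batch, in_features).
                stats[key] += inp[0].abs().mean(dim=0).cpu()
            hooks.append(mod.register_forward_hook(hook))

    model.eval()
    for batch in unlabeled_loader:  # assumed to yield raw input tensors
        model(batch)
    for h in hooks:
        h.remove()

    for name, mod in model.named_modules():
        if isinstance(mod, nn.Linear):
            # Synapse score: weight magnitude weighted by how active its
            # presynaptic neuron was on the user's own (unlabeled) data.
            score = mod.weight.abs() * stats[name].to(mod.weight.device)
            k = max(1, int(sparsity * score.numel()))
            threshold = score.flatten().kthvalue(k).values
            mod.weight[score <= threshold] = 0.0  # hard prune
```

Applied to a pretrained model, a single pass of this kind over a user's own unlabeled data would both sparsify the network and bias it toward that user's input distribution, which is in the spirit of the personalization the abstract reports.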
| Original language | English |
| --- | --- |
| Article number | 101242 |
| Journal | Patterns |
| Volume | 6 |
| Issue number | 5 |
| DOIs | |
| State | Published - 9 May 2025 |
Keywords
- biologically feasible learning
- biomimicry
- embedded learning
- IoT machine learning
- machine learning
- neurosynaptic pruning
All Science Journal Classification (ASJC) codes
- General Decision Sciences