Abstract
Training biophysical neuron models provides insights into brain circuits’ organization and problem-solving capabilities. Traditional training methods like backpropagation face challenges with complex models due to instability and gradient issues. We explore evolutionary algorithms (EA) combined with heterosynaptic plasticity as a gradient-free alternative. Our EA models agents with distinct neuron information routes, evaluated via alternating gating, and guided by dopamine-driven plasticity. This model draws inspiration from various biological mechanisms, such as dopamine function, dendritic spine meta-plasticity, memory replay, and cooperative synaptic plasticity within dendritic neighborhoods. Neural networks trained with this model recapitulate brain-like dynamics during cognition. Our method effectively trains spiking and analog neural networks in both feedforward and recurrent architectures, and it achieves performance comparable to gradient-based methods on tasks like MNIST classification and Atari games. Overall, this research extends training approaches for biophysical neuron models, offering a robust alternative to traditional algorithms.
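To make the gradient-free idea concrete, here is a minimal sketch of an evolutionary training loop: a (1+λ) strategy that mutates a tiny network's weights and keeps the best offspring, with no gradients computed. This is an illustration of generic evolutionary training, not the paper's agent, gating, or dopamine-driven plasticity model; the network size, mutation scale, and XOR task are arbitrary choices for the example.

```python
import math
import random

# Toy dataset: XOR, a task a linear model cannot solve,
# so the 2-2-1 network below genuinely has to be trained.
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(w, x):
    """2-2-1 tanh network; w is a flat list of 9 parameters."""
    h0 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return math.tanh(w[6] * h0 + w[7] * h1 + w[8])

def loss(w):
    """Mean squared error over the dataset."""
    return sum((forward(w, x) - y) ** 2 for x, y in XOR) / len(XOR)

def evolve(generations=300, offspring=20, sigma=0.3, seed=0):
    """(1+lambda) evolution: mutate the parent, keep any improvement.

    No backpropagation is used anywhere -- fitness evaluation is the
    only feedback signal, which is what makes the method gradient-free.
    """
    rng = random.Random(seed)
    parent = [rng.gauss(0, 1) for _ in range(9)]
    best = loss(parent)
    for _ in range(generations):
        for _ in range(offspring):
            child = [w + rng.gauss(0, sigma) for w in parent]
            child_loss = loss(child)
            if child_loss < best:
                parent, best = child, child_loss
    return parent, best
```

A constant predictor of 0.5 scores an MSE of 0.25 on XOR, so any run that ends well below that has actually learned the task structure; the paper's method replaces this plain mutation-selection loop with biologically inspired plasticity rules.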
| Original language | English |
| --- | --- |
| Article number | 112340 |
| Number of pages | 25 |
| Journal | iScience |
| Volume | 28 |
| Issue number | 5 |
| Early online date | 3 Apr 2025 |
| DOIs | |
| Publication status | Published - 16 May 2025 |
User-Defined Keywords
- Biophysical Neuron Model
- Evolutionary Algorithms
- Gradient-free Method
- Heterosynaptic Plasticity
- Neuroscience
- Biological sciences
- Biophysics