“Backpropagation through time” (BPTT) is widely used in machine learning for training recurrent networks on sequential processing tasks. BPTT “unfolds” a recurrent network across multiple discrete time steps and then runs backpropagation on the unfolded network to assign credit to particular units at particular time steps.
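To make the unfolding concrete, here is a minimal NumPy sketch of BPTT for a tiny vanilla RNN; the tanh nonlinearity, the squared-error loss at the final step, and all function and variable names are illustrative assumptions, not details from the source.

```python
# Minimal BPTT sketch (illustrative, not the source's implementation):
# a vanilla tanh RNN unfolded over T steps, squared-error loss at step T.
import numpy as np

def bptt_step(Wx, Wh, xs, h0, target):
    """Unfold the RNN over len(xs) steps, then backpropagate through time."""
    T = len(xs)
    hs = [h0]
    # Forward pass: unfold the network across discrete time steps.
    for t in range(T):
        hs.append(np.tanh(Wx @ xs[t] + Wh @ hs[-1]))
    loss = 0.5 * np.sum((hs[-1] - target) ** 2)

    dWx, dWh = np.zeros_like(Wx), np.zeros_like(Wh)
    dh = hs[-1] - target                      # loss gradient at the final state
    # Backward pass: assign credit to units at particular time steps.
    for t in reversed(range(T)):
        dpre = dh * (1.0 - hs[t + 1] ** 2)    # back through the tanh
        dWx += np.outer(dpre, xs[t])          # same weights reused each step,
        dWh += np.outer(dpre, hs[t])          # so gradients accumulate over time
        dh = Wh.T @ dpre                      # propagate to the previous step
    return loss, dWx, dWh

rng = np.random.default_rng(0)
Wx = 0.1 * rng.standard_normal((4, 3))
Wh = 0.1 * rng.standard_normal((4, 4))
xs = [rng.standard_normal(3) for _ in range(5)]
loss, dWx, dWh = bptt_step(Wx, Wh, xs, np.zeros(4), np.ones(4))
```

Note how the backward loop mirrors the forward loop in reverse: because the same weight matrices appear at every unfolded step, their gradients are summed across time, which is exactly the credit assignment the unfolding makes possible.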
Alternatively, models based on versions of Hebbian plasticity may be useful. These can give rise to different forms of correlation and competition between neurons, leading to the self-organized formation of ocular dominance columns, self-organizing maps, and orientation columns… To generate complex temporal patterns, the brain may implement other forms of learning that do not require any equivalent of full backpropagation through a multilayer network… (Alternatively), the use of recurrent connections with multiple timescales can remove the need for backpropagation in the direct training of spiking recurrent networks. (A. H. Marblestone, 2016)
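As a concrete (if highly simplified) illustration of the Hebbian alternative, the sketch below implements Oja's normalized Hebbian rule for a single rate-coded unit; the learning rate, input statistics, and names are assumptions for illustration, not drawn from Marblestone's text.

```python
# Sketch of a Hebbian update with Oja's normalization (illustrative).
# Correlation in the inputs drives the weight vector, via purely local
# updates, toward the leading principal direction of the input statistics.
import numpy as np

def oja_update(w, x, lr=0.01):
    """One Hebbian step: strengthen weights where pre- and postsynaptic
    activity co-occur; the subtractive y*w term keeps |w| bounded."""
    y = w @ x                        # postsynaptic activity
    return w + lr * y * (x - y * w)

rng = np.random.default_rng(0)
w = rng.standard_normal(2)
cov = np.array([[3.0, 1.0],          # correlated input statistics
                [1.0, 1.0]])
for _ in range(5000):
    x = rng.multivariate_normal(np.zeros(2), cov)
    w = oja_update(w, x)
# w converges (up to sign) to the top eigenvector of the input covariance.
```

The point of the sketch is that the update uses only locally available quantities (presynaptic input, postsynaptic activity, and the current weight), with no backward pass through a multilayer network.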
Researchers have observed that recurrent neural networks, with the kind of reverberations necessary for short-term memory, may play a central role in consciousness, and that a different kind of recurrence and training is required for short-term memory than for the longer-term associative memory of networks that settle to a stable state, as in image processing. This is not to say that recurrence in the brain is only of the time-delayed kind, but clocks and backward passes turn out to be necessary for that kind, and for hybrid systems which include that kind of capability. (P. J. Werbos, 2016)
We are beginning to understand the connections between the temporal dynamics of biologically realistic networks and the mechanisms of temporal and spatial credit assignment.
Biological Timers and Clocks in the Brain