Consolidating memory in natural recurrent neuronal networks
Permanent Link
Abstract
Neuronal networks provide living organisms with the ability to process information. While artificial neural networks, widely used nowadays in machine learning, are mostly based on a feedforward architecture, natural neuronal networks are characterized by abundant recurrent connections. Through recurrence, neuronal signals that propagate through the network eventually come back and affect the neurons that emitted them, giving rise to strong feedback that dominates the dynamics of the system. Such feedback provides recurrent neuronal networks with fading memory, which enables them to process time-dependent information but decays quickly. The goal of this project is to study how these memory mechanisms emerge in a natural neuronal network such as that of C. elegans, and to determine how long-term memory is established within the recurrent computation paradigm. To find a suitable reservoir within the C. elegans connectome, a pruning process was applied. The resulting network was then tested under different external conditions: without any input, with a constant input, with pulses, and with a complex input. The study concluded that reservoir computing is a good approach to model natural neuronal networks and that the C. elegans neuronal network implements long-term memory through reliability.
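The thesis code itself is not reproduced in this record, but the reservoir computing paradigm it describes can be illustrated with a minimal sketch: a recurrent network with fixed weights is driven by different input conditions (no input, constant input, pulses) and its state is observed over time. All names and parameters below (reservoir size, spectral radius, leak rate) are illustrative assumptions; the thesis builds its reservoir from the pruned C. elegans connectome rather than a random matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical reservoir: random recurrent weights scaled to a spectral
# radius below 1, which gives the fading-memory property described above.
N = 100
spectral_radius = 0.9
W = rng.normal(0.0, 1.0, (N, N))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-1.0, 1.0, N)  # input weights (assumed, not from the thesis)

def run_reservoir(u, leak=0.3):
    """Drive the reservoir with input sequence u and return its state history."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in * u_t)
        states.append(x.copy())
    return np.array(states)

# The external conditions mentioned in the abstract, in simplified form.
T = 500
inputs = {
    "no input": np.zeros(T),
    "constant": np.ones(T),
    "pulses": (np.arange(T) % 100 < 5).astype(float),
}

for name, u in inputs.items():
    states = run_reservoir(u)
    print(f"{name:>9}: final mean activity = {states[-1].mean():+.3f}")
```

In such a sketch, the reservoir's response to pulses decays over time, which is the fading-memory behaviour the abstract contrasts with long-term memory.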
Bachelor's thesis in Biomedical Engineering
Tutors: Jordi García Ojalvo, Òscar Vilarroya