Institute of Neuroscience and Medicine

Publications of Theory of multi-scale neuronal networks

Selected publications of Theory of multi-scale neuronal networks


Optimal Sequence Memory in Driven Random Networks

To perform complex tasks, our brains transform inputs in a complicated, nonlinear manner. Such transformations are implemented by large recurrent networks. Corresponding neural network models exhibit a transition to chaotic activity if the overall coupling strength between neurons is increased. This transition is believed to coincide with optimal information-processing capabilities, such as short-term memory. However, we show that this coincidence is not valid for networks receiving time-varying inputs. We theoretically analyze the stochastic nonlinear dynamics of randomly coupled neural networks in the presence of fluctuating inputs. We derive the dynamic mean-field theory using systematic methods from statistical physics. This approach reveals that fluctuating inputs shape the network's activity and suppress the emergence of chaos. We discover an unreported dynamical regime that amplifies perturbations on short time scales, but is not chaotic for long times. In this regime, networks optimally memorize their past inputs. Our work opens the study of recurrent neural networks to the rich and powerful set of methods developed in statistical physics. This approach will foster progress in the understanding of biological information processing and impact the design, control, and understanding of artificial neural networks.

Schuecker J., Goedeke S., Helias M., Phys. Rev. X 8, 041029 (2018): Optimal Sequence Memory in Driven Random Networks
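The model class described here is a randomly coupled rate network driven by fluctuating input. A minimal simulation sketch of this class, with illustrative parameters and not the authors' implementation, is:

```python
import numpy as np

# Minimal sketch of a randomly coupled rate network driven by fluctuating input
# (illustrative parameters; not the implementation used in the paper).
rng = np.random.default_rng(0)
N = 1000                    # number of units
g = 1.5                     # coupling gain; g > 1 gives chaos in the autonomous network
sigma_in = 0.5              # amplitude of the fluctuating external input
dt, T = 0.1, 200.0          # integration step and total simulation time

J = g * rng.standard_normal((N, N)) / np.sqrt(N)   # quenched Gaussian couplings
x = 0.1 * rng.standard_normal(N)

for _ in range(int(T / dt)):
    xi = rng.standard_normal(N)                    # white-noise drive
    x += dt * (-x + J @ np.tanh(x)) + np.sqrt(dt) * sigma_in * xi

print("stationary variance of the activity:", x.var())
```

Varying g and sigma_in in such a sketch illustrates the abstract's point that whether the dynamics become chaotic depends not on the coupling strength alone but also on the amplitude of the fluctuating input.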


Correlated Fluctuations in Strongly Coupled Binary Networks Beyond Equilibrium

Randomly coupled Ising spins constitute the classical model of collective phenomena in disordered systems, with applications covering glassy magnetism and frustration, combinatorial optimization, protein folding, stock market dynamics, and social dynamics. The phase diagram of these systems is obtained in the thermodynamic limit by averaging over the quenched randomness of the couplings. However, many applications require the statistics of activity for a single realization of the possibly asymmetric couplings in finite-sized networks. Examples include reconstruction of couplings from the observed dynamics, representation of probability distributions for sampling-based inference, and learning in the central nervous system based on the dynamic and correlation-dependent modification of synaptic connections.

Dahmen D., Bos H., Helias M., Phys. Rev. X 6, 031024 (2016): Correlated Fluctuations in Strongly Coupled Binary Networks Beyond Equilibrium
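As a concrete illustration of "statistics of activity for a single realization of the possibly asymmetric couplings", the following sketch runs generic Glauber-type dynamics for one quenched coupling matrix and estimates pairwise covariances; all parameters are assumed for illustration and are not taken from the paper:

```python
import numpy as np

# Asynchronous Glauber-type dynamics of a binary network with asymmetric random
# couplings, for a single quenched realization (illustrative parameters only).
rng = np.random.default_rng(1)
N = 200
J = rng.standard_normal((N, N)) / np.sqrt(N)   # asymmetric: J[i, j] != J[j, i]
np.fill_diagonal(J, 0.0)
beta = 2.0                                     # inverse temperature (noise level)
s = rng.choice([-1, 1], size=N)

samples = []
for step in range(200_000):
    i = rng.integers(N)                        # pick a random unit to update
    h = J[i] @ s                               # local field from the current state
    s[i] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * beta * h)) else -1
    if step > 50_000 and step % 100 == 0:      # record after a burn-in period
        samples.append(s.copy())

S = np.asarray(samples, dtype=float)
C = np.cov(S.T)                                # pairwise covariances of this realization
off_diag = C[~np.eye(N, dtype=bool)]
print("mean absolute cross-covariance:", np.abs(off_diag).mean())
```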


Expansion of the effective action around non-Gaussian theories

The effective action or Gibbs free energy is of fundamental importance for statistical physics and field theory. So far, efficient methods for its computation have existed only for problems that can be decomposed into a Gaussian part and small perturbations around it. Kühn and Helias (2018) present a general diagrammatic method that allows the perturbative expansion of the effective action around non-Gaussian theories. Applied to the Ising model, the work unifies the Plefka expansion, the high-temperature expansion, and Thouless-Anderson-Palmer (TAP) mean-field theory, and provides the long-sought diagrammatic derivation of the latter.

Kühn T., Helias M., J Phys A 51(37) (2018): Expansion of the effective action around non-Gaussian theories
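For orientation, a standard form of the result that such expansions recover for the Ising model with couplings J_ij and fields h_i is the TAP mean-field equation, in which the second-order (Onsager reaction) term of the Plefka expansion appears; it is reproduced here from the general literature rather than transcribed from the paper:

```latex
m_i = \tanh\!\Big(\beta h_i + \beta \sum_{j} J_{ij}\, m_j
      - \beta^{2} m_i \sum_{j} J_{ij}^{2}\,\bigl(1 - m_j^{2}\bigr)\Big)
```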


Fundamental Activity Constraints Lead to Specific Interpretations of the Connectome

The continuous integration of experimental data into coherent models of the brain is an increasing challenge of modern neuroscience. Such models provide a bridge between structure and activity, and identify the mechanisms giving rise to experimental observations. Nevertheless, structurally realistic network models of spiking neurons are necessarily underconstrained even if experimental data on brain connectivity are incorporated to the best of our knowledge. Guided by physiological observations, any model must therefore explore the parameter ranges within the uncertainty of the data.

Schuecker J., Schmidt M., van Albada S.J., Diesmann M., Helias M., PLoS Comput Biol 13(2): e1005179 (2017): Fundamental Activity Constraints Lead to Specific Interpretations of the Connectome
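A schematic of this kind of constraint-driven parameter exploration, with a stand-in model function and placeholder numbers that are not taken from the paper, could look like this:

```python
import numpy as np

def mean_rate_of_model(conn_prob: float) -> float:
    """Stand-in for a full network simulation; returns a toy mean rate in spikes/s."""
    return 30.0 * conn_prob / (0.02 + conn_prob)

physiological_range = (0.5, 10.0)              # assumed admissible mean rate, spikes/s
candidate_probs = np.linspace(0.001, 0.2, 50)  # poorly constrained connection probability

admissible = [p for p in candidate_probs
              if physiological_range[0] <= mean_rate_of_model(p) <= physiological_range[1]]
print(f"{len(admissible)} of {len(candidate_probs)} candidate values remain admissible")
```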


Identifying Anatomical Origins of Coexisting Oscillations in the Cortical Microcircuit

Oscillations are omnipresent in neural population signals, like multi-unit recordings, EEG/MEG, and the local field potential. They have been linked to the population firing rate of neurons, with individual neurons firing in a close-to-irregular fashion at low rates. Using a combination of mean-field and linear response theory, we predict the spectra generated in a layered microcircuit model of V1, composed of leaky integrate-and-fire neurons and based on connectivity compiled from anatomical and electrophysiological studies. The model exhibits low- and high-γ oscillations visible in all populations. Since locally generated frequencies are imposed onto other populations, the origin of the oscillations cannot be deduced from the spectra. We develop a universally applicable systematic approach that identifies the anatomical circuits underlying the generation of oscillations in a given network.

Bos H., Diesmann M., Helias M., PLoS Comput Biol 12(10): e1005132 (2016): Identifying Anatomical Origins of Coexisting Oscillations in the Cortical Microcircuit
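To illustrate the mean-field plus linear-response approach in the simplest setting, the sketch below computes population power spectra for a generic delayed two-population excitatory-inhibitory rate model; the connectivity values and time constants are illustrative, not those of the published eight-population microcircuit:

```python
import numpy as np

# Linear-response power spectra of a small delayed rate model
# (generic excitatory-inhibitory circuit; illustrative parameters only).
tau, delay = 0.010, 0.0015            # time constant and synaptic delay in seconds
W = np.array([[ 2.0, -4.0],           # effective connectivity: E->E, I->E
              [ 3.5, -4.5]])          # E->I, I->I
noise = np.eye(2)                     # white-noise input to each population

freqs = np.linspace(1.0, 500.0, 500)  # Hz
spectra = np.empty((len(freqs), 2))
for k, f in enumerate(freqs):
    omega = 2.0 * np.pi * f
    H = np.exp(-1j * omega * delay) / (1.0 + 1j * omega * tau)  # low-pass filter + delay
    T = np.linalg.inv(np.eye(2) - W * H)                        # transfer matrix
    spectra[k] = np.real(np.diag(T @ noise @ T.conj().T))       # population spectra

peak = freqs[spectra[:, 0].argmax()]
print(f"excitatory-population spectrum peaks near {peak:.0f} Hz")
```

A sketch like this only computes the spectra; the published approach additionally decomposes them to identify the anatomical sub-circuits responsible for each peak.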

Complete list of peer-reviewed articles

Book Chapters

