The University of Southampton
Mathematical Sciences

Localised structures in neural networks: from interface methods to multi-scale computations Seminar

Time:
12:00
Date:
23 February 2016
Venue:
54/5027 5A

Event details

Applied Mathematics Seminar

 

I will discuss the formation of coherent structures in spatially extended neural networks posed on a ring or a torus. Phenomenological models of neural networks have been studied intensively in the past and are known to support a variety of coherent structures observed experimentally (localised bumps of activity, travelling fronts, travelling bumps, lurching waves, rotating waves). These models are typically written as integro-differential equations, in which the integral term is a convolution between a synaptic kernel, specifying the anatomical neural connectivity, and a sigmoidal firing rate. Successful strategies for the analysis of these systems include special choices of the synaptic kernel (leading to equivalent PDE formulations) and interface methods. The latter are obtained by approximating the firing rate with a Heaviside step function; in this case, the evolution of the system is described entirely by the loci of points where the neural activity attains the firing-rate threshold value.
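To fix ideas, here is a minimal numerical sketch of such an integro-differential model on a ring, assuming an Amari-type equation u_t = -u + w * f(u) with a Mexican-hat kernel and a steep sigmoidal firing rate; all names and parameter values are illustrative, not taken from the talk. The periodic convolution is evaluated spectrally with the FFT.

```python
import numpy as np

# Ring discretisation (illustrative sizes and parameters).
n = 256
L = 2 * np.pi
x = np.linspace(0, L, n, endpoint=False)
dx = L / n

def kernel(x):
    """Mexican-hat synaptic kernel: local excitation, lateral inhibition."""
    d = np.minimum(x, L - x)              # distance measured on the ring
    return np.exp(-d**2) - 0.5 * np.exp(-d**2 / 4)

def firing_rate(u, mu=20.0, theta=0.3):
    """Steep sigmoid; the Heaviside limit corresponds to mu -> infinity."""
    return 1.0 / (1.0 + np.exp(-mu * (u - theta)))

# Kernel transform, computed once; dx scales the quadrature.
w_hat = np.fft.fft(kernel(x)) * dx

def rhs(u):
    """Right-hand side -u + w * f(u), periodic convolution done via FFT."""
    conv = np.real(np.fft.ifft(w_hat * np.fft.fft(firing_rate(u))))
    return -u + conv
```

Any standard time stepper applied to `rhs` then evolves the activity field; the same spectral convolution is reused below for steady-state computations.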

In this talk, I will initially consider a 1D model with a heterogeneous synaptic kernel and a Heaviside firing rate, and show that interface methods allow for the explicit construction of a bifurcation equation for localised steady states, so that analytical expressions for a classical "snakes and ladders" bifurcation scenario can be derived.
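As a simplified illustration of the interface construction (in the homogeneous case, rather than the heterogeneous kernel treated in the talk), the classical Amari computation reduces a steady bump to a single scalar condition on its width: with w(x) = exp(-|x|)/2 and a Heaviside firing rate with threshold theta, a bump with interfaces at 0 and Delta satisfies u(x) = ∫_0^Delta w(x - y) dy, and the threshold condition u(0) = theta becomes (1 - exp(-Delta))/2 = theta.

```python
import numpy as np

# Textbook Amari bump-width computation for the homogeneous exponential
# kernel w(x) = exp(-|x|)/2; this is an illustration of the interface
# idea, not the heterogeneous construction presented in the talk.

def bump_width(theta):
    """Solve (1 - exp(-Delta))/2 = theta analytically."""
    assert 0 < theta < 0.5, "bumps exist only for 0 < theta < 1/2"
    return -np.log(1.0 - 2.0 * theta)

def interface_error(theta):
    """Check u(0) = theta by direct quadrature of the bump formula."""
    delta = bump_width(theta)
    y = np.linspace(0.0, delta, 10001)
    u_at_interface = np.trapz(0.5 * np.exp(-np.abs(y)), y)
    return abs(u_at_interface - theta)
```

The heterogeneous case replaces this scalar condition by a genuine bifurcation equation for the interface positions, which is what produces the snaking diagram.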

I will then consider a 2D model which does not admit an equivalent PDE formulation. For this model (and for other neural field models featuring a convolution structure), it is advantageous to combine FFT-based evaluation with Newton-Krylov solvers to perform numerical bifurcation analysis directly on the integral model. In particular, I will show that steep sigmoidal firing rates give rise to well-conditioned linear systems, thereby eliminating the need for a preconditioner in the Krylov step, even for large-dimensional discretisations of the network.
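The matrix-free character of this approach can be sketched with SciPy's `newton_krylov`: each Krylov matrix-vector product only requires evaluations of the residual, which in turn cost a few FFTs. The 1D setup, kernel, and parameter values below are illustrative assumptions (the talk's model is 2D), and no preconditioner is supplied, echoing the conditioning observation above.

```python
import numpy as np
from scipy.optimize import newton_krylov

# Illustrative 1D discretisation of the integral model (assumed
# parameters, not the talk's 2D setup).
n = 128
L = 2 * np.pi
x = np.linspace(0, L, n, endpoint=False)
dx = L / n
d = np.minimum(x, L - x)
w_hat = np.fft.fft(np.exp(-d**2) - 0.5 * np.exp(-d**2 / 4)) * dx

def f(u, mu=20.0, theta=0.3):
    """Steep sigmoidal firing rate."""
    return 1.0 / (1.0 + np.exp(-mu * (u - theta)))

def residual(u):
    """Steady-state residual F(u) = -u + w * f(u), evaluated with FFTs only."""
    return -u + np.real(np.fft.ifft(w_hat * np.fft.fft(f(u))))

# Newton-Krylov solve with no preconditioner; it converges to the
# equilibrium nearest the initial guess (here a small perturbation of
# the trivial state).
u0 = 0.05 * np.cos(x)
u_star = newton_krylov(residual, u0, f_tol=1e-9)
```

In a continuation setting, the converged state `u_star` would be used as the predictor for the next point along the solution branch.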

Finally, I will discuss how the insight gained in the Heaviside limit may be used to perform coarse-grained bifurcation analysis on neural networks, even in cases where the network does not evolve according to an integro-differential equation. As an illustrative example, a heterogeneous neural network in the form of a Markov chain with a discrete ternary state space, posed on a lattice, will be considered. The model supports coarse bumps, multi-bumps and travelling waves, but the derivation of a coarse evolution equation is nontrivial. Hence, the emerging states are followed in parameter space using numerical coarse bifurcation analysis. At the core of the coarse bifurcation analysis algorithm is a lifting step, which provides an ensemble of microscopic realisations compatible with a given macroscopic observable of the system. I will show that, by choosing the interface as the coarse variable, it is possible to construct efficient lifting steps for this system. Numerical results indicate that refractoriness in this network is responsible for the transition from stationary states to travelling states.
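A hypothetical sketch of such a lifting step is given below: a coarse interface pair (a, b) is mapped to an ensemble of microscopic ternary lattice states, together with the matching restriction back to the interfaces. The state coding (0 quiescent, 1 active, 2 refractory) and the perturbation mechanism are illustrative assumptions, not the talk's model.

```python
import numpy as np

def lift(a, b, n_sites=400, n_copies=16, flip_prob=0.02, rng=None):
    """Map coarse interfaces (a, b) to n_copies ternary microstates
    whose active region is [a, b); a hypothetical lifting operator."""
    rng = np.random.default_rng(rng)
    sites = np.arange(n_sites)
    base = np.where((sites >= a) & (sites < b), 1, 0)
    ensemble = np.tile(base, (n_copies, 1))
    # Perturb a small fraction of cells into the refractory state, so the
    # ensemble samples distinct microstates sharing the same interfaces.
    flips = rng.random(ensemble.shape) < flip_prob
    ensemble[flips] = 2
    return ensemble

def restrict(ensemble):
    """Coarse observable: ensemble-averaged left/right interface positions."""
    active = ensemble == 1
    left = np.array([np.argmax(row) for row in active])
    right = np.array([len(row) - np.argmax(row[::-1]) for row in active])
    return left.mean(), right.mean()
```

In an equation-free loop, `lift` seeds short bursts of microscopic simulation, `restrict` recovers the coarse interfaces afterwards, and the resulting coarse time stepper is fed to a Newton-Krylov continuation routine as above.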

 

Speaker information

Daniele Avitabile, Lecturer in Applied Mathematics, University of Nottingham
