Connecting Network Architecture and Network Computation (15w5158)

Arriving in Banff, Alberta on Sunday, December 6, and departing Friday, December 11, 2015


(Southern Methodist University)

(McGill University)

Brent Doiron (University of Pittsburgh)

(University of Waterloo)

(University of Houston)

(University of Washington)


Timeliness of the workshop: recent advances in three fields

Field 1: Network dynamics

Empirical studies have shown that biological neural networks have graph structures that depart from standard random networks in interesting ways. For example, a given cell type produces only excitatory or only inhibitory synapses, which corresponds to restrictions on the "labels" of outgoing edges in connectivity graphs. Moreover, specific subgraphs (motifs) are overexpressed. These biological findings motivate the study of dynamics on networks with novel statistical features that have not previously been addressed in the mathematical physics and dynamical systems literature. Spectral analysis of random networks has focused on networks with symmetric, independent connections, whereas neural networks show clusters, hubs, and other statistical dependencies. These statistical features --- distinct from those found in other areas of network analysis, such as social or biochemical networks --- have been shown to generate dynamical features specific to neural activity, such as transient amplification and spontaneous activity. Moving beyond linear analysis, new methods are needed to study dynamics on high-dimensional heterogeneous networks, which may be underspecified because of a lack of data or because of ongoing neuromodulation that transiently modifies network architecture.

Field 2: Stochastic analysis of network activity

An accumulating body of evidence shows that the timescale of correlations between pairs of neural spike trains is shaped by stimulus structure and behavioral context. Theoretical work has begun to focus on the cellular and circuit mechanisms that both determine and modulate correlation. However, the general applicability of these theories to real sensory systems is unclear. An additional complication is that spike train correlations may be measured at different timescales, ranging from a few milliseconds (synchrony) to hundreds of milliseconds (co-variation of firing rates).
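The dependence of measured correlation on the counting window can be illustrated with a minimal simulation: two Poisson-like spike trains driven by a shared, slowly fluctuating rate show weak spike-count correlation in short bins and stronger correlation in long bins. All rates, timescales, and the shared-input model below are illustrative assumptions, not results from the studies discussed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two spike trains driven by a shared, slowly varying rate.
# (All parameters here are illustrative choices, not fitted to data.)
dt = 0.001            # 1 ms resolution
T = 200.0             # 200 s of simulated activity
n = int(T / dt)

# Shared rate fluctuation: Gaussian noise smoothed on a ~50 ms timescale
common = np.convolve(rng.standard_normal(n), np.ones(50) / 50, mode="same")
rate = np.clip(20 + 100 * common, 0, None)   # Hz, rectified

# Each neuron spikes as an inhomogeneous Poisson process on the shared rate
spikes1 = rng.random(n) < rate * dt
spikes2 = rng.random(n) < rate * dt

def count_corr(s1, s2, bin_ms):
    """Pearson correlation of spike counts at a given bin width (ms)."""
    w = bin_ms                      # samples per bin at 1 ms resolution
    m = (len(s1) // w) * w
    c1 = s1[:m].reshape(-1, w).sum(axis=1)
    c2 = s2[:m].reshape(-1, w).sum(axis=1)
    return np.corrcoef(c1, c2)[0, 1]

r_fast = count_corr(spikes1, spikes2, 5)     # ~synchrony timescale
r_slow = count_corr(spikes1, spikes2, 200)   # rate co-variation timescale
print(f"5 ms bins:   r = {r_fast:.2f}")
print(f"200 ms bins: r = {r_slow:.2f}")
```

In this toy model the same pair of spike trains yields a much larger correlation coefficient at the 200 ms timescale than at the 5 ms timescale, because Poisson variability dominates short bins while the shared rate fluctuations dominate long ones.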
Several recent studies report that a modulation of spike time synchrony does not imply a simultaneous and identical modulation of rate co-variation. Such shaping is thought to underlie important changes in the neural code, but the neural circuitry responsible is largely unknown. Recent results have shown that recurrent inhibition can give rise to neural activity with arbitrarily low correlations; however, recurrent inhibition can also give rise to synchrony and network oscillations. The timescale over which neurons in a population coordinate their activity is a critical dimension of any population code, and hence the mechanisms that shape this timescale are a central component of versatile population responses. Two of the organizers (Chacron and Doiron) are actively collaborating to develop a general theory of how interactions between feedforward and feedback circuitry shape correlations in realistic neural circuits.

Field 3: Network computation

A general computational framework --- termed Neural Engineering --- has been developed by one of the organizers (Eliasmith), by which arbitrary algebraic manipulations can be embedded in networks under a number of simplifying assumptions. Other groups have focused on theoretical analyses of network architectures that optimally implement another fundamental computation: the encoding of information over long timescales in the presence of noise and uncertainty. The role of heterogeneity in enabling context-dependent input-output transformations is another recent, surprising development in network computation. Regarding the goal-directed output of neural systems, theorists are now focusing on the implementation of optimal control algorithms by idealized, recurrent network dynamics.
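The core idea of embedding algebraic manipulations in a population can be sketched in a few lines: a heterogeneous set of tuning curves encodes a scalar, and a regularized least-squares readout of the population rates is fit to a nonlinear function of the encoded variable. This is only a toy sketch of the principle under rectified-linear tuning; the gains, biases, regularization, and target function below are arbitrary choices, not the full framework.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy population: N heterogeneous rectified-linear tuning curves that
# encode a scalar x. (Gains, biases, and N are illustrative choices.)
N = 100
gains = rng.uniform(0.5, 2.0, N)
biases = rng.uniform(-1.0, 1.0, N)
signs = rng.choice([-1.0, 1.0], N)           # preferred direction of each cell

def rates(x):
    """Population firing rates as a function of the encoded variable x."""
    return np.maximum(0.0, gains * (signs * x[:, None]) + biases)

x = np.linspace(-1, 1, 201)
A = rates(x)                                  # activity matrix: points x neurons
target = x ** 2                               # nonlinear function to embed

# Regularized least squares for the linear readout weights
reg = 0.1 * N
d = np.linalg.solve(A.T @ A + reg * np.eye(N), A.T @ target)

xhat = A @ d
rmse = np.sqrt(np.mean((xhat - target) ** 2))
print(f"RMSE decoding x^2 from {N} neurons: {rmse:.3f}")
```

The point of the sketch is that a fixed, heterogeneous population supports many different computations: changing only the target vector (and hence the readout weights) embeds a different algebraic transformation in the same network.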
Another set of findings is that many neural computations are consistent with Bayesian inference --- that is, a neural population may maintain a probabilistic representation of a percept, with several competing interpretations whose relative likelihoods are updated based on evidence and history. These models have been shown to be consistent with psychophysical data, and such computations can be implemented with realistic neural network models.

Structure and aims of the workshop

As with our previous workshop at BIRS, we will ground this meeting in a single, unifying question, and structure talks and research interactions to connect different fields of applied mathematics and theoretical biology toward new answers. We will focus on: "What features of network architecture subserve specific steps in network-level storage of sensory information, statistical inference, and controlled dynamical output?"

The first three days will feature talks on central results in each of the three mathematical areas highlighted above: network dynamics (day 1), stochastic analysis (day 2), and circuit computation (day 3). The morning and evening of the fourth day will feature talks on the most recent attempts to unite the fields, with a short hike in the afternoon to reenergize and facilitate the exchange of ideas. As in our previous workshop, the final day will be dedicated to small-group brainstorming and the formation of new collaborations.
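The Bayesian-inference view described under Field 3 can be illustrated with a minimal sketch: a posterior over two competing interpretations of a noisy stimulus is updated observation by observation, with the relative likelihoods accumulating evidence over time. The Gaussian observation model and all parameters below are illustrative assumptions, not any particular published model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two competing interpretations of a stimulus, with a flat prior.
# (Hypothesis values, noise level, and trial count are illustrative.)
hypotheses = np.array([-1.0, 1.0])   # candidate stimulus values
posterior = np.array([0.5, 0.5])     # prior over interpretations
sigma = 2.0                          # assumed observation noise

true_stim = 1.0
for _ in range(50):
    obs = true_stim + sigma * rng.standard_normal()
    # Gaussian likelihood of the observation under each hypothesis
    likelihood = np.exp(-0.5 * ((obs - hypotheses) / sigma) ** 2)
    posterior = posterior * likelihood
    posterior /= posterior.sum()     # renormalize: only relative evidence matters

print(f"posterior after 50 observations: {posterior.round(3)}")
```

Even though each single observation is ambiguous (the noise is twice the separation of the hypotheses from zero), the accumulated posterior comes to favor the correct interpretation --- the kind of evidence integration that the psychophysical models discussed above attribute to neural populations.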