# CNS*2014 Workshop on Methods of Information Theory in Computational Neuroscience

## Methods of Information Theory in Computational Neuroscience

### Wednesday and Thursday, July 30 and 31, 2014

### Québec City Convention Centre, Room 2101

### Québec City, Canada

## Overview

Methods originally developed in Information Theory have found wide applicability in computational neuroscience. Beyond these original methods there is a need to develop novel tools and approaches that are driven by problems arising in neuroscience.

A number of researchers in computational/systems neuroscience and in information/communication theory are investigating problems of information representation and processing. While the goals are often the same, these researchers bring different perspectives and points of view to a common set of neuroscience problems. Often they participate in different fora and their interaction is limited.

The goal of the workshop is to bring some of these researchers together to discuss challenges posed by neuroscience and to exchange ideas and present their latest work.

The workshop is targeted towards computational and systems neuroscientists with interest in methods of information theory as well as information/communication theorists with interest in neuroscience.

## References

- C.E. Shannon, A Mathematical Theory of Communication, Bell System Technical Journal, vol. 27, pp. 379-423 and 623-656, 1948.
- Milenkovic, O., Alterovitz, G., Battail, G., Coleman, T. P., et al., Eds., Special Issue on Molecular Biology and Neuroscience, IEEE Transactions on Information Theory, Volume 56, Number 2, February, 2010.
- Dimitrov, A.G., Lazar, A.A. and Victor, J.D., Information Theory in Neuroscience, Journal of Computational Neuroscience, Vol. 30, No. 1, February 2011, pp. 1-5, Special Issue on Methods of Information Theory.

## Standing Committee

- Alex G. Dimitrov, Department of Mathematics, Washington State University - Vancouver.
- Aurel A. Lazar, Department of Electrical Engineering, Columbia University.

## Program Committee

- Michael C. Gastpar, Laboratory for Information in Networked Systems, EPFL and UC Berkeley.
- Conor Houghton, Department of Mathematics, Trinity College Dublin.
- Simon R. Schultz, Department of Bioengineering, Imperial College.
- Tatyana O. Sharpee, The Computational Neurobiology Laboratory, Salk Institute.

## Program Overview

**Wednesday, 9:00 AM - 5:00 PM, July 30, 2014**

**Morning Session I (9:00 AM - 10:30 AM)**

**Chair: Alexander G. Dimitrov**

9:00 AM - 9:45 AM

**Information-theoretic stimulus design for neurophysiology and psychophysics**

Chris DiMattina, Department of Psychology, Florida Gulf Coast University

As neuroscientists and psychologists consider increasingly complex models relating stimulus parameters to neural and behavioral responses, it is becoming ever more important to choose good stimulus sets that enable fast and accurate estimation of model parameters. One approach, known as information-theoretic stimulus design, chooses on each trial the stimulus expected to maximize the mutual information between the observed responses and the model parameters. In this talk, I will briefly summarize the growing body of work in this area and demonstrate one recent implementation of information-theoretic stimulus design that can be applied to efficiently estimate high-dimensional nonlinear models which cannot be easily identified using white noise stimuli. I will then describe a recent auditory neurophysiology experiment which applied this methodology to the problem of estimating and comparing nonlinear network models of spectral integration in the primate inferior colliculus. In the final part of the talk, I will discuss recent work in my laboratory aimed at finding efficient implementations of information-theoretic stimulus design for high-dimensional psychophysical experiments. We show that the standard grid-based implementation of the popular Psi method is impractical in high dimensions, and we demonstrate several novel implementations of the Psi method that are both fast and effective. Lastly, I will discuss future applications of these novel methods to visual psychophysics experiments in my laboratory.
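
The trial-by-trial infomax rule can be sketched in a few lines. The toy example below is an illustration only, not Dr. DiMattina's implementation: it estimates a one-dimensional psychometric threshold, and on each trial picks the stimulus minimizing the expected posterior entropy over the threshold, which is equivalent to maximizing the mutual information between the next response and the parameter. The sigmoid model and the grids are assumptions made for the sketch.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def expected_posterior_entropy(prior, thetas, x):
    """Expected entropy (nats) of the posterior over theta after showing x."""
    p1 = sigmoid(x - thetas)              # p(response = 1 | x, theta)
    h = 0.0
    for like in (p1, 1.0 - p1):           # the two possible responses
        marg = np.sum(prior * like)       # p(response) under the prior
        post = prior * like / marg        # Bayes update for this response
        h += marg * -np.sum(post * np.log(post + 1e-12))
    return h

def choose_stimulus(prior, thetas, candidates):
    """Infomax rule: minimize expected posterior entropy, i.e. maximize
    the mutual information between the next response and theta."""
    scores = [expected_posterior_entropy(prior, thetas, x) for x in candidates]
    return candidates[int(np.argmin(scores))]

# Uniform prior over a threshold parameter; candidate stimuli on the same grid
thetas = np.linspace(-3.0, 3.0, 61)
prior = np.ones_like(thetas) / thetas.size
x_star = choose_stimulus(prior, thetas, thetas)
```

With a symmetric prior, the most informative stimulus sits at the prior's center; in an adaptive experiment the posterior replaces the prior after each response and the loop repeats.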

9:45 AM - 10:30 AM

**Efficient optimization of neural population codes**

Kechen Zhang, Department of Biomedical Engineering, Johns Hopkins University

To evaluate Shannon mutual information between sensory stimuli and the responses of a population of neurons, one needs to average over all possible response patterns of the population, which quickly leads to combinatorial explosion as the population size increases. Although one may use Monte Carlo simulations for a particular population configuration, this method is generally not suitable for studying optimality issues that require comparison of many hypothetical configurations because of the noise inherent in the approach. We recently found an efficient asymptotic formula for mutual information in the large population limit, which allows efficient numerical optimization of population distributions of tuning functions (Huang and Zhang, unpublished). We have applied this method to optimize several population models and obtained a range of interesting solutions, including clustering behaviors in the population distributions. The emergent optimal population distributions may account for several types of sensory neuron distributions known to exist in the sensory systems, providing further support for the optimal coding hypothesis in large neural populations.
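
The combinatorial explosion mentioned above is easy to see in code. The sketch below is an illustration, not the Huang-Zhang asymptotic formula: it computes the exact mutual information for a small population of conditionally independent binary neurons by enumerating all 2^N response patterns; the `tuning` matrix and stimulus prior are made-up inputs.

```python
import itertools
import numpy as np

def population_mi_bits(tuning, p_stim):
    """Exact mutual information (bits) between a discrete stimulus and N
    conditionally independent binary neurons. Enumerates all 2^N response
    patterns, which is what makes large populations intractable."""
    tuning = np.asarray(tuning, float)    # tuning[s, i] = p(neuron i fires | s)
    p_stim = np.asarray(p_stim, float)
    n_neurons = tuning.shape[1]
    mi = 0.0
    for pattern in itertools.product((0, 1), repeat=n_neurons):
        r = np.array(pattern)
        # p(r | s) for every stimulus, under conditional independence
        p_r_given_s = np.prod(np.where(r, tuning, 1.0 - tuning), axis=1)
        p_r = np.dot(p_stim, p_r_given_s)
        joint = p_stim * p_r_given_s
        nz = joint > 0
        mi += np.sum(joint[nz] * np.log2(p_r_given_s[nz] / p_r))
    return mi
```

One deterministic neuron distinguishing two equiprobable stimuli carries exactly 1 bit, while a neuron whose firing probability is stimulus-independent carries 0 bits; the loop over patterns is the cost that grows as 2^N.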

**Morning Break 10:30 AM - 11:00 AM**

**Morning Session II (11:00 AM - 12:15 PM)**

**Chair: Alexander G. Dimitrov**

11:00 AM - 11:45 AM

**Coding of second order stimulus features in the electrosensory system**

Maurice Chacron, Faculty of Medicine, McGill University

Understanding how the brain processes natural sensory stimuli remains an important question in systems neuroscience. This understanding is complicated by the fact that natural input often displays rich spatiotemporal structure characterized by first-order (e.g., luminance) and second-order (e.g., contrast) attributes that vary independently of one another. Here I will describe recent progress made towards understanding the nature of the neural circuits that are dedicated to processing the first- and second-order attributes of natural electrosensory stimuli experienced by weakly electric fish. I will first focus on how heterogeneous peripheral and first-order hindbrain sensory neuron populations differentially process both first- and second-order attributes. I will then describe how segregation of information about first- and second-order attributes is achieved in the midbrain through balanced input from both ON- and OFF-type hindbrain neurons. Finally, I will present evidence that weakly electric fish display behavioral responses to second-order electrosensory stimulus attributes and that these responses are matched to natural statistics, giving rise to the interesting hypothesis that these attributes are optimally processed by the electrosensory brain.

11:45 AM - 12:15 PM

**DISCUSSION**

**Lunch 12:15 PM - 2:00 PM**

**Afternoon Session I (2:00 PM - 3:00 PM)**

**Chair: Conor Houghton**

2:00 PM - 2:45 PM

**Noise-enhanced associative memory recall and other problems in faulty information processing**

Lav R. Varshney, Department of Electrical and Computer Engineering, UIUC

Recent advances in associative memory design through structured pattern sets and graph-based inference algorithms have allowed reliable learning and recall of an exponential number of patterns. Although these designs correct external errors in recall, they assume neurons that compute noiselessly, in contrast to the highly variable neurons in brain regions thought to operate associatively such as hippocampus and olfactory cortex. Here we consider associative memories with noisy internal computations and analytically characterize performance. As long as the internal noise level is below a specified threshold, the error probability in the recall phase can be made exceedingly small. More surprisingly, we show that internal noise actually improves the performance of the recall phase while the pattern retrieval capacity remains intact, i.e., the number of stored patterns does not reduce with noise (up to a threshold). Computational experiments lend additional support to our theoretical analysis. This work suggests a functional benefit to noisy neurons in biological neuronal networks. In closing we discuss related faulty graph-based inference algorithms for decoding error-correcting codes and for reconstructing visual receptive fields and cortical connectomes.
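
A minimal sketch of the setting, assuming a classical Hopfield-style associative memory rather than the structured-pattern, graph-based designs the talk analyzes: Hebbian storage, asynchronous recall from a corrupted probe, and a `noise_std` knob standing in for the internal (computation) noise discussed above. All sizes and the noise model are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_weights(patterns):
    """Outer-product (Hebbian) storage; self-connections removed."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, probe, sweeps=10, noise_std=0.0):
    """Asynchronous recall dynamics; noise_std > 0 injects internal
    (computation) noise into every threshold operation."""
    s = probe.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            h = W[i] @ s + noise_std * rng.standard_normal()
            s[i] = 1.0 if h >= 0 else -1.0
    return s

# Store two random ±1 patterns in a 64-unit network, then recall the
# first pattern from a probe with 8 of its 64 bits flipped
patterns = rng.choice([-1.0, 1.0], size=(2, 64))
W = hebbian_weights(patterns)
probe = patterns[0].copy()
flipped = rng.choice(64, size=8, replace=False)
probe[flipped] *= -1
recovered = recall(W, probe)
overlap = np.mean(recovered == patterns[0])
```

In this toy network the external errors (the flipped probe bits) are corrected by the recall dynamics; the talk's result concerns what happens when `noise_std` is nonzero in designs with exponential pattern capacity.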

2:45 PM - 3:00 PM

**Weak stimulus detection and estimation through correlated adaptation dynamics**

Andre Longtin, Physics Department, University of Ottawa

Temporal spike patterns exhibiting interspike interval (ISI) correlations are a common feature of many neural systems, and reflect intrinsic adaptation processes. We establish that an intracellular adaptation model can efficiently represent the spike pattern of the electroreceptor afferents of the weakly electric fish Apteronotus. This adaptation-based representation is formed by an invertible transformation of the correlated ISI sequence, thereby containing all the information of the spike pattern. The transformation is based on a sequence of independent molecular-like dynamic variables. The representation forms a statistically efficient encoding of the spike train, whereby the probability of any spike pattern can be readily computed and utilized for sensory decision making. A critical issue in sensory spike coding is how much useful information a spike pattern, including its ISI correlations, contains beyond seemingly more reductive firing-rate-based representations. The adaptation-based representation shows that stimulus estimation can be performed more effectively with spike patterns than with firing rate, whereas stimulus detection performance is effectively equivalent.

Nesse W, Maler L, Longtin A, Biophysical information representation in correlated spike trains. Proc. Natl. Acad. Sci. USA, 107(51), 1973-1978, 2010.

Nesse W, Marsat G, Longtin A, Maler L (to be submitted)
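
For readers unfamiliar with ISI correlations, here is a toy illustration (not the Nesse et al. model): a scalar adaptation variable that is incremented at each spike and decays between spikes produces the negative interspike-interval correlations the talk builds on, quantified by the serial correlation coefficient. All parameter values are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_isis(n, mu=0.5, alpha=1.0, tau=1.0, noise=0.2):
    """Toy adapting neuron: a spike-triggered adaptation variable w
    lengthens the next interval and decays between spikes."""
    w, isis = 0.0, []
    for _ in range(n):
        T = max(0.05, mu + alpha * w + noise * rng.standard_normal())
        w = w * np.exp(-T / tau) + 1.0    # decay during the ISI, then a kick
        isis.append(T)
    return np.array(isis)

def serial_corr(isis, lag=1):
    """Serial correlation coefficient of the ISI sequence at a given lag."""
    return np.corrcoef(isis[:-lag], isis[lag:])[0, 1]

isis = simulate_isis(5000)
scc1 = serial_corr(isis, lag=1)   # adaptation makes this negative:
                                  # a long ISI lets w decay, shortening the next
```

The negative lag-1 coefficient is the signature of intrinsic adaptation; the talk's point is that an invertible adaptation-based transformation of this correlated sequence yields independent variables suitable for likelihood computation.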

**Afternoon Break 3:00 PM - 3:30 PM**

**Afternoon Session II (3:30 PM - 5:00 PM)**

**Chair: Conor Houghton**

3:30 PM - 4:15 PM

**Minimum and maximum entropy distributions for binary systems with known means and pairwise correlations**

Mike DeWeese, Physics Department and the Helen Wills Neuroscience Institute, UC Berkeley

Maximum entropy models are increasingly being used to describe the collective activity of neural populations with measured mean neural activities and pairwise correlations, but the full space of probability distributions consistent with these constraints has not been explored. We provide lower and upper bounds on the entropy for both the minimum and maximum entropy distributions over binary units with any fixed set of mean values and pairwise correlations, and we construct distributions for several relevant cases. Surprisingly, the minimum entropy solution has entropy scaling logarithmically with system size, unlike the possible linear behavior of the maximum entropy solution, for any set of first- and second-order statistics consistent with arbitrarily large systems. We also demonstrate with a simple example that some sets of low-order statistics can only be realized by small systems, and I will discuss how this affects our interpretation of neural population activity.
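
A small worked example of the pairwise maximum-entropy construction (the talk's analytical bounds are not reproduced here): for three binary units with hypothetical means and pairwise moments, gradient ascent on the concave dual of the entropy-maximization problem matches the constraints by moment matching. The target moments are invented but mutually consistent.

```python
import itertools
import numpy as np

def fit_pairwise_maxent(means, pair_moments, n, iters=5000, lr=0.5):
    """Fit the pairwise maximum-entropy model over n binary units:
    p(x) ∝ exp(sum_i h_i x_i + sum_{i<j} J_ij x_i x_j), with parameters
    chosen so that E[x_i] and E[x_i x_j] match the targets."""
    states = np.array(list(itertools.product((0, 1), repeat=n)), float)
    pairs = list(itertools.combinations(range(n), 2))
    feats = np.concatenate(
        [states] + [states[:, [i]] * states[:, [j]] for i, j in pairs], axis=1)
    target = np.concatenate([means, pair_moments])
    params = np.zeros(feats.shape[1])          # (h, J) stacked together
    for _ in range(iters):
        logp = feats @ params
        p = np.exp(logp - logp.max())
        p /= p.sum()
        params += lr * (target - p @ feats)    # moment-matching gradient
    return p, states

# Hypothetical targets for 3 units: E[x_i], then E[x_i x_j] for pairs
# (0,1), (0,2), (1,2)
p, states = fit_pairwise_maxent([0.4, 0.5, 0.6], [0.2, 0.25, 0.3], n=3)
```

Enumerating all 2^n states is exact but only feasible for small n; the minimum-entropy counterpart discussed in the talk is a different, non-exponential-family optimization.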

4:15 PM - 5:00 PM

**DISCUSSION**

**Thursday, 9:00 AM - 12:15 PM, July 31, 2014**

**Morning Session I (9:00 AM - 10:30 AM)**

**Chair: Aurel A. Lazar**

9:00 AM - 9:45 AM

**Emergence of perceptual invariances in auditory processes**

Alexander G. Dimitrov, Department of Mathematics, Washington State University - Vancouver.

The sense of hearing is an elaborate perceptual process. Sounds reaching our ears vary in multiple features: pitch, intensity, rate. Yet when we parse speech, our comprehension is little affected by the vast variety of ways in which a single phrase can be uttered. This amazing ability to extract relevant information from wildly varying sensory signals is also ubiquitous in other sensory modalities, and is by no means restricted only to human speech.

Even though the effect itself is well characterized, we do not understand the approaches used by different neural systems to achieve such performance. In this project, we test the hypothesis that broadly invariant signal processing is achieved through various combinations of locally invariant elements. We have studied locally invariant sensory processing in several biological systems and noted its widespread presence. The main questions we would like to address here are:

1. What are the characteristics of locally invariant units in auditory pathways?
2. How can biological locally invariant units be combined to form globally invariant representations?
3. What information is being represented at different levels of the auditory hierarchy?

9:45 AM - 10:30 AM

**Characterization of local invariances in the ascending ferret auditory system**

Jean F. Lienard, Oregon Hearing Research Center, Oregon Health & Science University

Local probabilistic invariances, defined by the range of physical transformations that can be applied to a sensory stimulus without changing the corresponding neural response, are largely unstudied in auditory cortex. We propose to assess these invariances using existing and new experimental neurophysiological data recorded from multiple stages of the ferret auditory processing hierarchy.

We hypothesize that neurons in the auditory pathway will show increasing degrees of local invariance at successively more central stages of the processing hierarchy. To test this hypothesis, we analyzed spiking activity recorded from single neurons in the primary auditory cortex (A1) and in the secondary auditory cortex (PEG) of awake, passively listening animals during presentation of two types of stimuli commonly known to evoke activity in the auditory system. The first set of stimuli was a sequence of isolated pure tones with randomly varying frequency, spanning 5-6 octaves and encompassing the best frequencies of the recorded neurons. The second was a sequence of bandpass noise bursts of varied durations, consisting of 20-30 bursts that logarithmically tile 5-7 octaves, each with a bandwidth of approximately 0.25 octaves. Using these data, we have analyzed local invariance to frequency shifts by estimating the width of the tuning curve for the best responding neurons. We have also characterized local invariance to time shifts using the recordings obtained in response to bandpass noise, modeling the temporal jitter of individual neurons as independent processes; preliminary results indicate that invariance to time shift is higher in PEG than in A1.

**Morning Break 10:30 AM - 11:00 AM**

**Morning Session II (11:00 AM - 12:15 PM)**

**Chair: Aurel A. Lazar**

11:00 AM - 11:45 AM

**Learning to generate population activity patterns**

Byron Yu, Electrical & Computer Engineering and Biomedical Engineering, Carnegie Mellon University

The information that a population of neurons can convey is related to the richness of the activity patterns that they exhibit. Recent studies have demonstrated in different brain areas that population activity tends to explore a low-dimensional space within the higher-dimensional space of firing rates, where each axis represents the activity of one neuron. Here, we ask whether the low-dimensional space is solely a statistical description of the population activity, or whether it might also reflect causal constraints imposed by the underlying neural circuitry on the diversity of activity patterns that can be exhibited. We studied this question using a brain-computer interface (BCI) paradigm, which allowed us to specify which patterns of population activity we would like the subject to show. We found that non-human primate subjects had difficulty learning to generate activity patterns outside of the low-dimensional space, even if they were necessary for task success.
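
The low-dimensional structure described above can be illustrated with a simulated population (hypothetical numbers, not the BCI data): when firing rates are driven by a low-dimensional latent state, the eigen-spectrum of the population covariance concentrates in as many directions as there are latent dimensions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical population: 30 neurons whose trial-by-trial rates are
# driven by a 3-dimensional latent state plus weak private noise
n_neurons, n_latent, n_trials = 30, 3, 2000
loading = rng.standard_normal((n_neurons, n_latent))
latent = rng.standard_normal((n_latent, n_trials))
rates = loading @ latent + 0.1 * rng.standard_normal((n_neurons, n_trials))

# PCA via the covariance eigen-spectrum: variance concentrates in as
# many directions as there are latent dimensions
evals = np.sort(np.linalg.eigvalsh(np.cov(rates)))[::-1]
explained = np.cumsum(evals) / np.sum(evals)
dim = int(np.searchsorted(explained, 0.95)) + 1   # dims for 95% variance
```

In the BCI paradigm described above, the question is whether activity patterns outside this recovered subspace can be produced at all; studies in this area often use factor analysis rather than plain PCA to separate shared from private variance.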

11:45 AM - 12:15 PM

**DISCUSSION**