Columbia Workshop on Brain Circuits, Memory and Computation 2020


BCMC 2020

Monday and Tuesday, March 16-17, 2020

Davis Auditorium, CEPSR

Center for Neural Engineering and Computation

Columbia University, New York, NY 10027


Overview

The goal of the workshop is to bring together researchers interested in developing executable models of neural computation/processing in the brains of model organisms. Of interest are models of computation that consist of elementary units of processing using brain circuits and memory elements. Elementary units of computation/processing include population encoding/decoding circuits with biophysically-grounded neuron models, non-linear dendritic processors for motion detection/direction selectivity, spike processing and pattern recognition neural circuits, movement control and decision-making circuits, etc. Memory units include models of spatio-temporal memory circuits, circuit models for memory access and storage, etc. A major aim of the workshop is to explore the integration of various sensory and control circuits in higher brain centers.

A Fruit Fly Brain Hackathon is being conducted in conjunction with the workshop. Workshop participants are welcome to attend the hackathon.

Organizer and Program Chair

Aurel A. Lazar, Department of Electrical Engineering, Columbia University.

Registration

Registration is free, but all participants have to register. Thank you!

Lodging and Directions to Venue

Please follow this link for lodging details and directions to the hotel and venue.

Sponsorship

The 2020 Columbia Workshop on Brain Circuits, Memory and Computation is supported by the

Department of Electrical Engineering, Columbia University

Center for Computing Systems for Data-Driven Science, Data Science Institute, Columbia University

School of Engineering and Applied Science, Columbia University




Confirmed Invited Speakers

Sophie Caron, Department of Biology, University of Utah, Salt Lake City, UT.


A Similarity-preserving Neural Network Trained on Transformed Images Recapitulates Salient Features of the Fly Motion Detection Circuit

Dmitri ‘Mitya’ Chklovskii, Flatiron Institute, Simons Foundation.

Learning to detect content-independent transformations from data is one of the central problems in biological and artificial intelligence. An example of such a problem is unsupervised learning of a visual motion detector from pairs of consecutive video frames. Rao and Ruderman formulated this problem in terms of learning infinitesimal transformation operators (Lie group generators) via minimizing image reconstruction error. Unfortunately, it is difficult to map their model onto a biologically plausible neural network (NN) with local learning rules. Here we propose a biologically plausible model of motion detection. We also adopt the transformation-operator approach but, instead of reconstruction-error minimization, start with a similarity-preserving objective function. An online algorithm that optimizes such an objective function naturally maps onto an NN with biologically plausible learning rules. The trained NN recapitulates major features of the well-studied motion detector in the fly. In particular, it is consistent with the experimental observation that local motion detectors combine information from at least three adjacent pixels, something that contradicts the celebrated Hassenstein-Reichardt model.

Joint work with Yanis Bahroun and Anirvan Sengupta.
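
To make the similarity-preserving approach concrete, here is a minimal sketch (not the speakers' code) of a generic online similarity-matching network with Hebbian feedforward and anti-Hebbian lateral updates, trained on pairs of consecutive one-dimensional "frames" related by a fixed shift. All sizes, learning rates and the synthetic data are illustrative assumptions.

# Minimal sketch of an online similarity-matching network (illustrative
# assumptions throughout; not the speakers' implementation).
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_out, eta = 6, 3, 0.005         # assumed sizes and learning rate

def frame_pair():
    """Two consecutive 1-D 'frames': white noise and a rightward-shifted copy."""
    f = rng.standard_normal(n_pixels)
    return np.concatenate([f, np.roll(f, 1)])

d = 2 * n_pixels
W = 0.3 * rng.standard_normal((n_out, d))  # Hebbian feedforward weights
M = np.eye(n_out)                          # anti-Hebbian lateral weights

for _ in range(40000):
    x = frame_pair()
    y = np.linalg.solve(M, W @ x)          # steady state of the lateral dynamics
    W += eta * (np.outer(y, x) - W)        # Hebbian update
    M += eta * (np.outer(y, y) - M)        # anti-Hebbian update

# After training, each output unit's frame-2 weights approximate a shifted
# copy of its frame-1 weights, i.e. the unit is matched to the displacement
# between consecutive frames.
w1, w2 = W[:, :n_pixels], W[:, n_pixels:]
print("shift-match error:", np.linalg.norm(w2 - np.roll(w1, 1, axis=1)) / np.linalg.norm(w2))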


Ronald L. Davis, Scripps Research Florida, Jupiter, FL.

Monica Dus, Dept. of Molecular, Cellular, and Developmental Biology, University of Michigan, Ann Arbor, MI.

Nathan W. Gouwens, Allen Institute for Brain Science, Seattle, WA.


How States and Needs Shape Odor Perception and Behavior

Ilona Grunwald Kadow, School of Life Sciences, Technical University of Munich.

Neuromodulation permits flexibility of synapses, neural circuits and ultimately behavior. One neuromodulator, dopamine, has been studied extensively in its role as reward signal during learning and memory across animal species. Newer evidence suggests that dopaminergic neurons (DANs) can modulate sensory perception acutely, thereby allowing an animal to adapt its behavior and decision-making to its internal and behavioral state. In addition, some data indicate that DANs are heterogeneous and convey different types of information as a population. We have investigated DAN population activity and how it could encode relevant information about sensory stimuli and state by taking advantage of the confined anatomy of DANs innervating the mushroom body (MB) of the fly Drosophila melanogaster. Using in vivo calcium imaging and a custom 3D image registration method, we find that the activity of the population of MB DANs is predictive of the innate valence of an odor as well as the metabolic and behavioral state of the animal, suggesting that distinct DAN population activities encode innate odor valence, movement and physiological state in an MB compartment-specific manner. This information could influence perception and state-dependent decision-making, as suggested by behavioral analysis. We propose that dopamine shapes innate odor perception through combinatorial population coding of sensory valence, physiological and behavioral context.


Frank Hirth, Basic and Clinical Neuroscience, King's College, London.

Viren Jain, Google AI, Mountain View, CA.


Navigational Insights from the Fly Central Complex Connectome

Vivek Jayaraman, HHMI Janelia Research Campus, Ashburn, VA.

The topology of a network is often informative about its function. Extracting detailed neural network structure has recently become possible for the fly brain through advances in electron microscopy (EM) and automatic reconstruction. The Janelia FlyEM team, working in collaboration with multiple Janelia labs and Google, has put together the first EM-based connectome of a highly conserved insect brain region called the central complex (CX), including all its neurons and most of their synaptic connections. This highly recurrent central brain region, which is composed of ~3000 identified neurons, enables flies to maintain an arbitrary heading over kilometers, form visually-guided spatial memories, use internal models of their body size when performing motor tasks, and consolidate memories during sleep. CX neurons show clear signatures of ring attractor dynamics during navigation in visual environments and in darkness and are modulated by past experience, satiety, circadian rhythm and sleep state. Many conceptual and computational models have explored how the CX might perform some of these functions. However, these models have largely relied on anatomical overlap and functional connectivity to construct putative CX circuits. Our data shed new light on the region, revealing several exquisitely structured functional motifs for navigation, and demonstrate the power of analyzing anatomical structure to generate hypotheses about function in the brain.
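
For readers unfamiliar with ring attractors, the sketch below simulates a generic one (an illustration, not a model derived from the CX connectome): units on a ring with cosine-shaped local excitation and global inhibition sustain an activity bump that stores a heading even after the orienting cue is removed. All parameters are assumptions.

# Generic ring-attractor sketch; parameters are illustrative assumptions.
import numpy as np

N = 60
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)
J_exc, J_inh = 12.0, 5.0
W = (J_exc * np.cos(theta[:, None] - theta[None, :]) - J_inh) / N

dt, tau = 0.02, 0.2
r = np.zeros(N)                            # firing rates, clipped to [0, 1]

def step(r, ext):
    """One Euler step of the rate dynamics."""
    return r + dt * (-r + np.clip(W @ r + ext, 0.0, 1.0)) / tau

# A transient "visual cue" at 90 degrees seeds the bump...
cue = 2.0 * np.exp(4 * (np.cos(theta - np.pi / 2) - 1))
for _ in range(300):
    r = step(r, cue)
# ...and recurrent excitation keeps the bump alive after the cue is removed,
# which is the heading-memory property of a ring attractor.
for _ in range(1500):
    r = step(r, 0.0)

heading = np.degrees(np.angle(r @ np.exp(1j * theta))) % 360
print(f"stored heading ~ {heading:.0f} degrees")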


Binocular Photoreceptor Microsaccades Give Fruit Fly Hyperacute 3D-Vision

Mikko I. Juusola, Centre for Cognition in Small Brains, The University of Sheffield.

Neural mechanisms behind stereovision, which requires simultaneous disparity inputs from two eyes, have remained mysterious. Here we show how ultrafast synchronous mirror-symmetric photomechanical contractions in the frontal forward-facing left and right eye photoreceptors give Drosophila super-resolution 3D-vision. By combining in vivo 100-nm-resolution x-ray imaging with electrophysiology and fly genetics, in vivo high-speed optical imaging, mathematical modelling and behavioural paradigms, we reveal how these photoreceptor microsaccades - by verging and narrowing the eyes’ overlapping receptive fields - channel depth information, as phasic binocular image motion disparity signals in time, to hyperacute stereovision and learning. We further show how peripherally, outside the stereoscopic sampling, photoreceptor microsaccades match a forward flying fly’s optic flow field to better resolve the world in motion. These results change our understanding of how insect compound eyes work, highlight the importance of fast photoreceptor vergence for enhancing 3D perception, and suggest coding strategies to improve man-made sensors.


What Does a Cognitive Goal Look Like in the Brain?

Gaby Maimon, Laboratory of Integrative Brain Function, The Rockefeller University.

I will discuss a neural circuit that begins to explain how flies compare the angle at which they are currently oriented in the world with the angle at which they wish to be oriented - a goal heading angle that they can flexibly change - to determine which way to turn, how hard to turn, and how fast to walk forward. This detailed circuit in a small brain should inspire one to think more clearly about how larger mammalian brains, like our own, set goals and then compel behaviors to achieve those goals.
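
One simple way to formalize the comparison described above - offered here as an assumption for illustration, not as the circuit's actual computation - is to let the turn command grow with the sine of the error between the goal and current heading, and the forward speed with its cosine:

# Toy heading-versus-goal comparison (an illustrative assumption, not the
# circuit's implementation).
import math

def steering(heading_deg: float, goal_deg: float,
             k_turn: float = 1.0, v_max: float = 1.0):
    """Return (turn_command, forward_speed) from the heading/goal comparison."""
    err = math.radians(goal_deg - heading_deg)
    turn = k_turn * math.sin(err)            # sign says which way, magnitude how hard
    speed = v_max * max(math.cos(err), 0.0)  # walk forward only when roughly on goal
    return turn, speed

print(steering(heading_deg=40, goal_deg=100))   # large error: strong turn, slower walking
print(steering(heading_deg=95, goal_deg=100))   # small error: gentle turn, fast walking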


Alexey A. Polilov, Department of Entomology, Faculty of Biology, Lomonosov Moscow State University.

Michael B. Reiser, Janelia Research Campus, Ashburn, VA.


Universality of Information Encoding in Brain Regions Using a Specific Combinatorial Code

Charles F. Stevens, The Salk Institute and Kavli Institute for Brain and Mind, UCSD.

Information in the brain is usually believed to be encoded by which neurons a stimulus activates. For example, in the primary visual cortex, the slopes of lines or edges at a particular location in the visual scene are encoded by which orientation-selective neurons are active in response to the line or edge. In other brain areas, however, many of the same neurons are activated by every stimulus, so information is encoded not by which neurons are active but by the pattern of activity across a population of neurons that respond to most stimuli. Such brain regions use a combinatorial code to distinguish between alternative stimuli.

An example of such a brain region is the population of projection neurons in the fruit fly antennal lobe. About a dozen copies of each of about 50 genetically distinct types of odorant receptor neurons (ORNs) are present in the fly’s nose, and all of the neurons of the same type project to one of about fifty glomeruli in the antennal lobe. The output of each antennal lobe glomerulus is one of about 50 types of projection neurons that send olfactory information about odor type to Kenyon Cells in the mushroom body. Most of the 50 types of antennal lobe projection neurons fire in response to most odors.

The distribution of firing rates across the antennal lobe responsive projection neurons is the same for almost all odors: it is an exponential distribution with a mean that is the same across all odor types. What differs from one odor to the next is not the distribution of firing rates, but rather is which neurons have which rates in the distribution, so that odor type is encoded by the pattern of firing rates across projection neurons.

This same combinatorial code is known to be used in two additional brain areas. One is the mouse olfactory system and the other is the code used for human faces in the monkey inferotemporal cortex. I will discuss these additional regions and describe the universality of the code they use.
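
The short numerical illustration below (synthetic numbers chosen for the demo, not measurements) captures the code described above: every odor evokes projection-neuron firing rates drawn from the same exponential distribution, and odor identity is carried by which neuron has which rate.

# Synthetic illustration of the combinatorial code described above; the mean
# rate and noise level are assumptions, not measurements.
import numpy as np

rng = np.random.default_rng(1)
n_pn, n_odors, mean_rate = 50, 8, 20.0      # ~50 projection neurons

# Each odor: a rate for every projection neuron, drawn from one exponential.
odor_patterns = rng.exponential(mean_rate, size=(n_odors, n_pn))

# Re-present odor 3 with trial-to-trial noise and decode it by pattern match.
trial = odor_patterns[3] + rng.normal(0.0, 3.0, n_pn)
similarity = [np.corrcoef(trial, p)[0, 1] for p in odor_patterns]
print("decoded odor:", int(np.argmax(similarity)))        # -> 3

# The rate distribution itself is the same for every odor (same mean),
# so identity lies in the pattern, not in the distribution.
print("mean rate across all odors:", round(float(odor_patterns.mean()), 1), "Hz")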


John C. Tuthill, Department of Physiology and Biophysics, University of Washington, Seattle.


Neural Circuit Computations in Zebrafish Habenula

Emre Yaksi, Kavli Institute for Systems Neuroscience, NTNU, Norway.

We investigate how sensory information interacts with the internally generated dynamics of the brain, which represent the animals' behavioral states. To achieve this, we focus on the habenula, a conserved brain region associated with predicting potential outcomes, learning and mood disorders. We revealed that the habenula encodes both olfactory and visual cues and acts as a major hub integrating sensory information with the ongoing activity of the ancestral cortico-limbic structures of the zebrafish brain. We showed that different subnetworks of this circuitry are born during distinct developmental stages and serve different functions. Finally, we showed that perturbation of these circuits interferes with the animals' ability to integrate new information during learning.



More information about BCMC 2020 can be found here.





