CUTE Workshop 2010


Columbia University - Technion

Workshop on Neuroengineering of Biological Networks

Tuesday and Wednesday, March 16-17, 2010

Interschool Lab, Shapiro Research Building (Room 750 CEPSR), Columbia University


Overview

This joint workshop, hosted by the School of Engineering and Applied Sciences (SEAS) of Columbia University (CU) and the Lorry I. Lokey Interdisciplinary Center for Life Sciences and Engineering (CLSE) at the Technion – Israel Institute of Technology (Technion), will focus on the contributions of engineering to understanding neuronal networks. Whether at the micro, meso, or macro scale, the challenge of understanding and characterizing the information processing capabilities of such networks is inherently a multi-disciplinary enterprise requiring principles from engineering (e.g., system identification, information theory, control theory, signal processing, reverse engineering, simulation, fabrication) as well as neurobiology. This workshop is a unique opportunity to bring together engineers and neurobiologists who are eager to share their ideas and collaborate in order to build strategic international partnerships between SEAS-CU and CLSE-Technion.

In a larger context, the CU-Technion joint workshop will be the catalyst for a broader collaborative effort to redefine how engineering interacts with and participates in the life sciences. One view of engineering is that it provides the tools needed to facilitate basic life science discoveries. However, it is becoming increasingly clear that engineering provides more than just tools; many life science questions are best approached from an engineering perspective. This is a new paradigm for studying biology, one that is being embraced at CLSE-Technion and at SEAS-CU. The long-term goal, starting with this first joint workshop, is to exploit the strengths of SEAS-CU and CLSE-Technion in building a Global Collaborative Center for Engineering and Life Sciences.


Organizing (Standing) Committee

Aurel A. Lazar, Columbia University
Shimon Marom, Technion
Paul Sajda, Columbia University
Yehoshua Y. (Josh) Zeevi, Technion


Program Overview

Tuesday (8:45 AM - 5:10 PM), March 16, 2010

8:45 AM - 9:00 AM

Welcoming Remarks, Organizing Committee

Morning Session (9:00 AM - 12:10 PM)
Neural Encoding and Processing
Chair: Miriam Zacksenhouse, Technion


9:00 AM - 9:40 AM

How the Olfactory Bulb got its Glomeruli: A Just So Story?

Stuart Firestein, Department of Biological Sciences, Columbia University

The nearly 2,000 glomeruli that cover the surface of the olfactory bulb are so distinctive that they were noted specifically in the earliest of Cajal’s catalogues. They have variously been considered a functional unit, an organizational unit and a crucial component of the olfactory coding circuit. Despite their central position in olfactory processing, the development of the glomeruli has only recently begun to be investigated with new and powerful genetic tools. Some unexpected findings have been made that may lead to a new understanding of the processes involved in wiring sensory regions of the brain. It may no longer be sufficient to simply invoke genes, spikes and their interplay in the construction of brain circuits. The story of ‘how the olfactory bulb got its glomeruli’ may be more complex, and more revealing, than has been supposed.


9:40 AM - 10:20 AM

Nonlinear dendritic processing in cortical pyramidal neurons

Jackie Schiller, Faculty of Medicine, Technion.

Neurons in the central nervous system typically possess an elaborate dendritic tree, which serves to receive and integrate the vast amount of input information arriving at the neuron. Understanding the way information is processed in dendrites is crucial for comprehending the input/output transformation functions of individual CNS neurons, and in turn learning how cortical networks code and store information. Cortical pyramidal neurons, which are the major excitatory neurons in the cortical tissue, have a typical dendritic tree consisting of a large apical trunk, which branches to form the oblique and tuft branches, and a basal tree branching directly from the soma. In the present work we concentrated on understanding how tuft dendrites process their incoming information. Tuft dendrites are the main target for feedback inputs innervating neocortical layer-5 pyramidal neurons, but their properties remain obscure. We report the existence of NMDA-spikes in the fine distal tuft dendrites that otherwise did not support the initiation of calcium spikes. Both direct measurements and computer simulations showed that NMDA-spikes are the dominant mechanism by which distal synaptic input leads to firing of the neuron and provide the substrate for complex parallel processing of top-down input arriving at the tuft. These data lead to a new unifying view of integration in pyramidal neurons in which all fine dendrites, basal and tuft, integrate inputs locally through the recruitment of NMDA receptor channels, relative to the fixed apical calcium and axo-somatic sodium integration points.


10:20 AM - 10:50 AM

Morning Break

10:50 AM - 11:30 AM

System Identification of Drosophila Olfactory Sensory Neurons

Aurel A. Lazar*, Department of Electrical Engineering, Columbia University

The lack of a deeper understanding of how olfactory receptor neurons (ORNs) encode odors has hindered progress in understanding olfactory signal processing in higher brain centers. We investigate the encoding of time-varying odor stimuli by Drosophila ORNs and their spike domain representation for further processing by the network of glomeruli. A wide range of time-varying odor waveforms were used in in-vivo recordings of ORNs expressing the same receptor. Spiking activity of single ORNs activated by essentially the same odor waveforms could be evaluated from repeated experiments for a wide range of concentration and concentration gradient value pairs. In order to evaluate the spike-timing precision, we simultaneously recorded from a single fly the activity of two neurons expressing the same receptor. Overall, we recorded the spiking activity of (i) neurons expressing different receptors in response to the same odorant, and (ii) neurons expressing the same receptor in response to different odorants. Our work demonstrates an adaptive two-dimensional encoding mechanism for Drosophila ORNs. At very low odorant concentrations, ORNs encode positive concentration gradients. Conversely, at high concentrations ORNs encode the odorant concentration. The 2D encoding manifold clearly shows that Drosophila ORNs encode both odor concentration and concentration gradient and provides a quantitative description of the neural response with a predictive power not seen before.

*Joint work with Anmo J. Kim and Yevgeniy Slutskiy.


11:30 AM - 12:10 PM

Neural Mechanisms of Stimulus Selection: Lessons from the Barn Owl

Yoram Gutfreund, Faculty of Medicine, Technion.

To cope with the immense amounts of sensory information provided to the brain and to ensure that behavior is controlled by important information, mechanisms have evolved to select the most behaviorally significant stimulus. This focusing on a particular part of the scene is known as selective attention. The barn owl, whose highly accurate visual and auditory systems evolved to detect small prey in acoustically complex and dimly lit environments, provides an interesting case for the study of neural mechanisms for stimulus selection. In recent years my lab has focused on studying stimulus-specific adaptation and visual-auditory integration in the gaze control system of the barn owl. Our findings highlight the role of these two neural phenomena in stimulus selection and suggest the tectofugal pathway as a key player in this process.


12:10 PM - 2:00 PM

Lunch

Afternoon Session (2:00 PM - 5:10 PM)
Neural Encoding and Decoding
Chair: Stefano Fusi, Department of Neuroscience, Columbia University


2:00 PM - 2:40 PM

Analyzing High Temporal Resolution fMRI Data

Martin A. Lindquist, Department of Statistics, Columbia University.

Understanding the neural basis of human brain function requires a detailed knowledge of both the spatial and temporal aspects of information processing. Functional magnetic resonance imaging (fMRI) provides the capability of visualizing changes in neuronal activity with high spatial resolution, but is lacking in temporal resolution. Most statistical techniques for analyzing fMRI data are based on studying oxygenation patterns that are far removed from the underlying event we wish to base our conclusions on (i.e. the neural activity). Therefore, one faces the statistically intractable task of sorting out possibly unknown confounding factors influencing the timing of these oxygenation patterns across different regions of the brain, in comparison to the actual ordering of their neuronal activity. Recently, new techniques have been introduced that can dramatically increase the temporal resolution of fMRI studies, and provide temporal activation profiles with significantly more detail than has previously been available in fMRI studies. Proper analysis of this high temporal resolution data necessitates the development of statistical methods for scoring observed activation responses that can be used for determining the absolute order of activation between different brain regions. We will discuss such techniques and compare them with standard metrics for determining temporal ordering in fMRI.


2:40 PM - 3:20 PM

AGC in Neural Networks and Cliques in Small Ensembles of Neurons

Yehoshua Y. (Josh) Zeevi*, Department of Electrical Engineering, Technion.

Automatic Gain Control (AGC) in neural networks and "Cliques" in relatively small ensembles of neurons are considered as examples of principles of organization that emerge from combining concepts based on engineering and neurobiology. These are considered in the context of representation and enhancement of sensory data and of complex patterns of relationships therein. The AGC is shown to result from simple nonlinear synaptic interactions within a retina-type network feedback loop. It is argued that this is a principle of organization that repeats across modalities and along the organizational hierarchy and permits enhancement of a wide range of sensory signal attributes. Cliques extend and generalize the concept of Synfire Chains of Abeles, in that they constitute complex spatio-temporal spiking activity within and across several (even disjoined) synfire chains that coheres to signal (highlight) the recognition of complex patterns of sensory stimuli within or across modalities, or the arousal of complex percepts. Cliques are highly compressed representations of complex patterns. Yet, like Hopfield-type attractors, they preserve structural relationships in the compressed space. Sensory signal representation by cliques constitutes a many-to-one mapping that preserves the complex structural relationships of the uncompressed patterns. To highlight the parsimonious characteristics of mapping and storage by cliques, we embed this mapping in a computational framework of Liquid State Machines (LSM), a computational paradigm proposed by Maass, Natschlager and Markram as a model of generic cortical microcircuits. We extend the power of LSM by closing the loop. This synthetic approach to the analysis of neural computational paradigms yields surprisingly important results. For example, a small ensemble of the order of one hundred neurons can store a class of visual stimuli (i.e., images) that are generated by a wide range of rigid-body and elastic distortions of a given natural image. Such invariant perceptrons of highly-compressed images (or other sensory signals) can be further generalized to higher levels of processing and mapping of concepts by "Conceptrons". Conclusions regarding possible execution of higher brain functions by relatively simple and small ensembles of cortical cells will be addressed.

*Joint research with Igal Raichelgauz and Karina Odinaev, Cortica Ltd.
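
For readers unfamiliar with the Liquid State Machine framework referenced above, the sketch below gives an echo-state-style caricature: a fixed random recurrent "liquid" of about one hundred units driven by an input stream, with a trained linear readout and an optional feedback path that closes the loop. It illustrates the general paradigm only; it is not the Cortica model, nor the LSM of Maass et al., and every parameter is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100                                             # ~one hundred units
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))       # fixed recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # keep the dynamics stable
w_in = rng.normal(0.0, 1.0, N)                      # fixed input weights

def run_liquid(u, feedback=None):
    """Collect liquid states while driving the network with input u
    (and, optionally, a feedback signal that 'closes the loop')."""
    x = np.zeros(N)
    states = np.zeros((len(u), N))
    for t in range(len(u)):
        drive = w_in * u[t]
        if feedback is not None:
            drive = drive + w_in * feedback[t]
        x = np.tanh(W @ x + drive)
        states[t] = x
    return states

# Train a linear readout (ridge regression) to reproduce an arbitrary target.
T = 500
u = rng.normal(size=T)
target = np.convolve(u, np.ones(10) / 10.0, mode="same")   # demo target: smoothed input
X = run_liquid(u)
w_out = np.linalg.solve(X.T @ X + 1e-3 * np.eye(N), X.T @ target)
prediction = X @ w_out
# Closing the loop, as in the abstract's extension, would feed `prediction`
# back through the `feedback` argument on a subsequent run.
```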


3:20 PM - 3:50 PM

Afternoon Break

3:50 PM - 4:30 PM

Representation in Large Scale Random Networks of Cortical Neurons

Shimon Marom, Faculty of Medicine, Technion.

In this lecture I describe some interrelations between neural dynamics, structure and functional performance. Specifically, I experimentally address the biophysical constraints that emerge from cellular dynamics and network topology, and measure the entailed impacts on ensemble representation of spatial and temporal input features.


4:30 PM - 5:10 PM

Neural rate modulations during experiments with Brain Machine Interfaces: implications for neural encoding and motor control

Miriam Zacksenhouse, Department of Mechanical Engineering, Technion.

Neural encoding is usually investigated with respect to specific stimuli or actions, which are either measured or under experimental control. However, neurons may encode multiple signals, including hidden or internal signals, which are not under experimental control. To address this, we have derived a method for estimating the variance of the neural activity associated with rate modulations, without any assumption about the encoded signals. The method is based on the sole assumption that the spike trains are realizations of either a doubly stochastic Poisson process (DSPP) or a dead-time modified DSPP. The analysis can be quantified in terms of the percent variance attributed to neural modulations, or equivalently, as the signal-to-noise ratio. Applied to the neural activity recorded during experiments with Brain Machine Interfaces (BMIs), our method reveals: (i) how neural modulations are affected by the operation of the BMI, and (ii) what the relevant timescales are during BMI operation.

Effect of BMI operation on neural modulations: We have previously shown that the extent of neural modulations increases abruptly upon starting to operate the interface, especially after stopping hand movements. In contrast, neural modulations that are correlated with the trajectories of the controlled cursor and target remain relatively unchanged.

Implications for motor control: The above results may be interpreted within the frameworks of either optimal control or reinforcement learning. Here we concentrate on the framework of optimal control and investigate the conditions under which similar changes can be generated.

Relevant timescales: In agreement with theoretical predictions, the signal-to-noise ratio (SNR) increases with the bin-width, almost linearly at short bin-widths and at a lower rate at longer bin-widths. While the SNR increases with the bin-width, the update rate decreases, suggesting that the ratio between the SNR and the bin-width should be considered as a criterion for determining the relevant timescales. The importance of this ratio is also motivated by its relation to the capacity of the neural channel. Evaluating the mean SNR across all the recorded neurons, we show that the ratio between the SNR and the bin-width peaks around 100 msec, in agreement with the bin-width selected by trial-and-error for operating the BMI. Nevertheless, our analysis demonstrates that the relevant timescales vary across neurons and brain regions, potentially due to the different bandwidths of the encoded signals, and suggests that this may be exploited to enhance BMIs.
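
For readers who want a concrete feel for the variance decomposition and the SNR/bin-width criterion described above, the sketch below works through the plain DSPP case, in which the variance of the spike counts in a bin equals their mean (the Poisson noise) plus the variance contributed by rate modulations. The function names and parameter handling are illustrative, the dead-time correction mentioned in the abstract is omitted, and this is not the speaker's actual code.

```python
import numpy as np

def modulation_snr(spike_times, t_start, t_end, bin_width):
    """SNR of rate modulations for one neuron, under the plain DSPP assumption.

    For a doubly stochastic Poisson process, Var(N) = E[N] + Var(modulation),
    so the modulation variance can be estimated as Var(N) - E[N] and the
    signal-to-noise ratio as (Var(N) - E[N]) / E[N].
    """
    edges = np.arange(t_start, t_end + bin_width, bin_width)
    counts, _ = np.histogram(spike_times, bins=edges)
    mean_count = counts.mean()
    if mean_count == 0:
        return 0.0
    modulation_var = max(counts.var(ddof=1) - mean_count, 0.0)  # subtract Poisson noise
    return modulation_var / mean_count

def best_timescale(spike_times, t_start, t_end, bin_widths):
    """Pick the bin-width maximizing SNR / bin-width, the criterion suggested above."""
    ratios = [modulation_snr(spike_times, t_start, t_end, bw) / bw for bw in bin_widths]
    return bin_widths[int(np.argmax(ratios))]
```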


Wednesday (9:00 AM - 5:10 PM), March 17, 2010

Morning Session (9:00 AM - 12:10 PM)
Neural Coding and Networks
Chair: Ron Meir, Technion


9:00 AM - 9:40 AM

Cortical Networks Underlying Perceptual Decision Making in the Human Brain

Paul Sajda, Department of Biomedical Engineering, Columbia University.

Single and multiunit recordings in primates have already established that decision making involves at least two general stages of neural processing: representation of evidence from early sensory areas and accumulation of evidence to a decision threshold from decision-related regions. However, the relay of information from early sensory to decision areas, such that the accumulation process is instigated, is not well understood. We first describe a method whereby we use information derived from our previous EEG recordings to inform the analysis of fMRI data collected for the same behavioral task, in order to ascertain the cortical origins of each of these EEG components. We demonstrate that a cascade of events associated with perceptual decision making takes place in a highly distributed neural network. Of particular importance is an activation in the lateral occipital complex, implicating perceptual persistence as a mechanism by which object decision making in the human brain is instigated. We also describe our methodology and results for single-trial analysis of simultaneous EEG and fMRI and discuss these within the context of perceptual decision making and shifts of attention.
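
A common recipe for this kind of EEG-informed fMRI analysis is to add a parametric regressor to the fMRI design matrix whose trial-by-trial height is set by the single-trial EEG component amplitude, and then map which voxels load on it. The sketch below illustrates that general idea with a crude canonical HRF and invented names; it is not necessarily the speakers' exact pipeline.

```python
import numpy as np

TR = 2.0  # fMRI repetition time in seconds (illustrative)

def canonical_hrf(tr=TR, duration=32.0):
    """Very rough double-gamma hemodynamic response function (illustrative)."""
    t = np.arange(0.0, duration, tr)
    peak = (t ** 5) * np.exp(-t)
    undershoot = (t ** 15) * np.exp(-t)
    hrf = peak / peak.max() - 0.35 * undershoot / undershoot.max()
    return hrf / hrf.sum()

def eeg_informed_design(onsets, eeg_amplitudes, n_scans, tr=TR):
    """Task regressor plus a parametric regressor scaled by the single-trial
    EEG component amplitude, both convolved with the HRF."""
    task = np.zeros(n_scans)
    parametric = np.zeros(n_scans)
    amps = np.asarray(eeg_amplitudes, dtype=float)
    amps = amps - amps.mean()              # de-mean so the two regressors separate
    for onset, amp in zip(onsets, amps):
        scan = int(round(onset / tr))
        if scan < n_scans:
            task[scan] += 1.0
            parametric[scan] += amp
    hrf = canonical_hrf(tr)
    return np.column_stack([
        np.convolve(task, hrf)[:n_scans],
        np.convolve(parametric, hrf)[:n_scans],
        np.ones(n_scans),                  # constant term
    ])

def fit_voxel(X, bold_timeseries):
    """Ordinary least squares fit; the weight on column 1 localizes regions
    whose BOLD signal co-varies with the EEG component."""
    beta, *_ = np.linalg.lstsq(X, bold_timeseries, rcond=None)
    return beta
```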


9:40 AM - 10:20 AM

Neural coding of communication vocalizations in the songbird brain

Sarah Woolley, Department of Psychology, Columbia University.

A primary task of the social brain is to create neural representations of communication signals that result in the perception of social messages. Songbirds are particularly interesting because they learn to recognize, respond to and produce the complex songs that they use to communicate. I will describe some aspects of how songbird auditory neurons are tuned to the acoustic features of songs, how auditory coding differs when the brain processes song versus other sounds, and how individual songs are encoded by single neurons and groups of neurons.


10:20 AM - 10:50 AM

Morning Break

10:50 AM - 11:30 AM

Spatio-Temporal mapping of speech processing in human subjects

Hillel Pratt, Faculty of Medicine and Department of Biomedical Engineering, Technion.

Event-Related Potentials allow non-invasive recording of human electrical brain activity with a temporal resolution on the order of milliseconds. Coupled with novel signal processing methods, the sources of the scalp-recorded electrical activity can be estimated with a spatial resolution of a few mm. In a series of studies, human brain activity during speech processing was studied in normal subjects as well as in patients with a variety of impairments affecting auditory function and behavior. The spatio-temporal distribution of activity was mapped and compared between patients and normals, as well as among normals with different levels of proficiency in different languages. Spatio-temporal mapping of human brain activity was found to be sensitive to specific impairments, as well as to different levels of speech processing. Hemispheric lateralization of activity was found to be affected not only by the type of stimulus presented (verbal/non-verbal), but also by the stage of processing (early acoustic vs. late semantic) and by linguistic proficiency (first language vs. second language). The added temporal dimension holds promise of enhancing functional brain imaging as a research and diagnostic tool.


11:30 AM - 12:10 PM

Coding and computation by neural ensembles in the primate retina

Liam Paninski, Department of Statistics, Columbia University.


12:10 PM - 2:00 PM

Lunch

Afternoon Session (2:00 PM - 5:10 PM)
Multi-Cell Analysis
Chair: Ning Qian, Department of Neuroscience, Physiology & Cellular Biophysics, Columbia University


2:00 PM - 2:40 PM

History Dependent Dynamics in a Generic Model of Ion Channels

Ron Meir, Department of Electrical Engineering, Technion.

Recent experiments have demonstrated that the timescale of adaptation of single neurons and ion channel populations to stimuli slows down as the length of stimulation increases; in fact, no upper bound on the timescales seems to exist in such systems. Patch clamp experiments on single ion channels have hinted at the existence of large, mostly unobservable, inactivation state spaces within a single ion channel. These results raise the challenge of constructing mathematical models which, on the one hand, explain the observed experimental phenomena and, on the other hand, retain sufficient physiological relevance and mathematical tractability to be completely analyzed. Such models should ideally be based on a minimal number of experimentally measurable parameters, while avoiding, as much as possible, quantities which cannot be experimentally determined. In this work we propose a model for ion channel dynamics which addresses these desiderata, and present a detailed mathematical analysis of its properties. The model leads to a clear and concise explanation of observed phenomena pertaining to the lack of characteristic time scales and to history-dependent responses, while making specific predictions about novel experimental situations. More specifically, we reproduce experimentally observed exponential history-dependent relaxation in sodium channels in a voltage clamp setting, and show that their recovery rate from slow inactivation must be voltage dependent. We predict that history-dependence is significantly reduced for sparse spiking inputs. While the model was created with ion channel populations in mind, its simplicity and generality render it a good starting point for modeling similar effects in other systems, and for scaling up to higher levels.
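
The abstract does not give the model's structure, but a toy chain of progressively deeper inactivation states illustrates how a large, mostly hidden state space can produce recovery that slows with the duration of the preceding stimulation. The sketch below is an invented caricature for intuition only, not the proposed model; every rate and state count is made up.

```python
import numpy as np

# During "stimulation" the channel drifts one state deeper at rate `alpha`;
# during "recovery" it climbs back one state at a time, each step slower than
# the last.  The expected recovery time therefore grows with the duration of
# the preceding stimulation: history dependence without a single timescale.
K = 30                 # number of inactivation depths
alpha = 1.0            # drift rate into deeper states during stimulation (1/s)
beta0, q = 5.0, 0.85   # recovery rate from depth 1, and its decay per depth

def depth_distribution(stim_duration):
    """Occupancy over depths 0..K after drifting at rate alpha for stim_duration."""
    lam = alpha * stim_duration
    k = np.arange(K)
    log_fact = np.array([np.sum(np.log(np.arange(1, kk + 1))) for kk in k])
    pmf = np.exp(k * np.log(lam) - lam - log_fact)      # Poisson(lam) jump counts
    return np.append(pmf, max(1.0 - pmf.sum(), 0.0))    # lump the tail at depth K

def mean_recovery_time(stim_duration):
    """Expected time to climb back to depth 0, averaged over the depth distribution."""
    step_times = 1.0 / (beta0 * q ** np.arange(1, K + 1))   # time per backward step
    time_from_depth = np.concatenate(([0.0], np.cumsum(step_times)))
    return float(depth_distribution(stim_duration) @ time_from_depth)

for T in (1, 5, 20, 80):
    print(f"stimulation {T:3d} s  ->  mean recovery time {mean_recovery_time(T):7.2f} s")
```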


2:40 PM - 3:20 PM

Detecting and Predicting Epileptic Seizures: Progress and Challenges

David L. Waltz, Center for Computational Learning Systems, Columbia University.

Around 35% of epileptic patients are not helped by available drug treatments. Their only option today is surgical removal of the epileptic focus, which helps another 10% of patients. The goal of the CCLS/Columbia Neurological Institute Epilepsy Project is to help the rest of this population by developing systems that can reliably predict seizures in advance via an implanted device, providing a warning or delivering electrical or drug intervention to prevent a seizure. This goal is very challenging: it is not known whether there are advance indicators for all seizures; each patient differs in the location of the focus and onset pattern(s); and the signals are very small relative to artifacts (from chewing, eye blinks, stray electrical interference, etc.). Even the detection of seizures has proved challenging, especially the detection of very short seizures. Our project team has been collecting and analyzing data from electrode arrays that are implanted for about two weeks in patients with intractable epilepsy. Each patient produces about 8 Tbytes of data, hopefully including at least three large seizures, along with unknown numbers of small seizures and a very large quantity of non-seizure data. Our initial work aims for systems that can detect all seizures, using machine learning applied to a human-labeled "gold standard" training corpus. With this capability in hand, we will then be in a position to seek predictive patterns for seizures.

This is joint work with Cathy Schevon and Ron Emerson (Columbia Neurological Institute), and Haimonti Dutta, Ansaf Salleb-Aouissi, Phil Gross, and Hatim Diab (CCLS).
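
As a rough sketch of what a supervised seizure-detection stage trained on such a labeled corpus could look like, the snippet below extracts spectral band-power features from EEG windows and fits an off-the-shelf classifier. The feature set, classifier choice, and sampling rate are assumptions for illustration only and are not the project's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

FS = 500  # sampling rate in Hz (illustrative)

def band_power_features(window, fs=FS):
    """Log power in a few common EEG bands for one single-channel window."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window)) ** 2
    bands = [(1, 4), (4, 8), (8, 13), (13, 30), (30, 70)]
    return np.log([psd[(freqs >= lo) & (freqs < hi)].sum() + 1e-12 for lo, hi in bands])

def detector_from_labeled_windows(windows, labels):
    """Train a seizure / non-seizure classifier on human-labeled windows."""
    X = np.vstack([band_power_features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=200, class_weight="balanced")
    scores = cross_val_score(clf, X, labels, cv=5, scoring="roc_auc")
    clf.fit(X, labels)
    return clf, scores.mean()
```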


3:20 PM - 3:50 PM

Afternoon Break

3:50 PM - 4:30 PM

Network dynamics during development of pharmacologically induced epileptic seizures in rats in-vivo

Yitzhak Schiller, Faculty of Medicine, Technion.

In epilepsy the cortical network fluctuates between the asymptomatic inter-ictal state and the symptomatic ictal state of seizures. Despite its importance, the network dynamics responsible for the transition between the inter-ictal and ictal states are largely unknown. Here we used multi-electrode single-unit recordings from the hippocampus to investigate the network dynamics during the development of seizures evoked by various chemoconvulsants in-vivo. In these experiments we detected a typical network dynamics signature that preceded seizure initiation. The pre-ictal state preceding pilocarpine-, kainate- and picrotoxin-induced seizures was characterized by biphasic network dynamics composed of an early desynchronization phase, in which the tendency of neurons to fire correlated action potentials decreased, followed by a late resynchronization phase, in which the activity and synchronization of the network gradually increased. This biphasic network dynamics preceded both the initial seizure and the recurrent spontaneous seizures that followed. During seizures, the firing of individual neurons and inter-neuronal synchronization increased further. These findings advance our understanding of the network dynamics leading to seizure initiation, and may in the future help in the development of novel seizure prediction algorithms.
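
One simple way to track the kind of synchronization changes described above is a sliding-window average of pairwise spike-count correlations across the recorded units; a drop in this measure would correspond to the desynchronization phase and a rise to the resynchronization phase. The sketch below is illustrative only, and the bin and window sizes are arbitrary choices rather than those used in the study.

```python
import numpy as np
from itertools import combinations

def sliding_synchrony(spike_trains, t_start, t_end, bin_width=0.01, window=10.0, step=1.0):
    """Mean pairwise spike-count correlation in sliding windows (times in seconds)."""
    edges = np.arange(t_start, t_end, bin_width)
    counts = np.vstack([np.histogram(st, bins=edges)[0] for st in spike_trains])
    bins_per_win = int(window / bin_width)
    bins_per_step = int(step / bin_width)
    times, sync = [], []
    for start in range(0, counts.shape[1] - bins_per_win, bins_per_step):
        seg = counts[:, start:start + bins_per_win]
        corrs = [np.corrcoef(seg[i], seg[j])[0, 1]
                 for i, j in combinations(range(len(spike_trains)), 2)
                 if seg[i].std() > 0 and seg[j].std() > 0]
        times.append(t_start + (start + bins_per_win / 2) * bin_width)
        sync.append(np.mean(corrs) if corrs else np.nan)
    return np.array(times), np.array(sync)
```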


4:30 PM - 5:10 PM

Disruption of hippocampal network electrophysiology by mechanical injury

Barclay Morrison, Department of Biomedical Engineering, Columbia University.

Traumatic brain injury (TBI) is a major health concern in the US, with over 1.5 million injuries occurring annually, or about one TBI every 20 seconds. TBI is the result of mechanical deformation of the brain due to physical forces such as a fall, assault, or motor vehicle accident. The mechanical stimulus initiates an extended cascade of molecular and cellular events which culminates in disruption of the brain’s function. At the tissue level, this dysfunction can be quantified electrophysiologically.

We have developed an in vitro model of TBI which allows for precise control over the initiating mechanical stimulus as well as a high degree of access to the injured tissue. We have combined brain slice cultures with microelectrode arrays (MEAs) to quantify the post-traumatic dysfunction of the biological neuronal network that comprises the hippocampus. At low injury severities, the maximal evoked field potential response (Rmax) of the hippocampus was reduced, and the stimulus needed to evoke a half-maximal response (I50) was increased compared to uninjured controls. Paradoxically, at higher injury severities, Rmax recovered and I50 was reduced, suggesting increased excitability due to a preferential loss of inhibitory circuits within the neuronal network.

The microelectrode array (60 electrodes) provides a wealth of simultaneous electrophysiological data with high spatial density. These recordings can be used to determine the evolution of evoked current source density patterns. The current flow follows the hippocampal circuitry and is qualitatively altered after injury; however, our current analysis does not leverage this wealth of spatio-temporal information. We hope to discuss potential strategies to quantify this activity to support statistical comparisons between injured and control cultures by taking full advantage of the MEA recording platform.
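
Rmax and I50 are typically read off a sigmoidal fit to a stimulus-response series; the sketch below shows one plausible way to extract them with a logistic curve, assuming response amplitudes are already available per stimulus level. The functional form and starting values are illustrative and not necessarily those used by the speakers.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(I, Rmax, I50, slope):
    """Evoked response amplitude as a logistic function of stimulus intensity."""
    return Rmax / (1.0 + np.exp(-(I - I50) / slope))

def fit_stimulus_response(stim_amplitudes, responses):
    """Fit Rmax and I50 to an evoked-response series from one recording site."""
    I = np.asarray(stim_amplitudes, dtype=float)
    R = np.asarray(responses, dtype=float)
    p0 = [R.max(), np.median(I), max(np.ptp(I) / 10.0, 1e-3)]  # rough starting guesses
    (Rmax, I50, slope), _ = curve_fit(sigmoid, I, R, p0=p0, maxfev=10000)
    return Rmax, I50, slope
```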