Volume 17 Supplement 1

25th Annual Computational Neuroscience Meeting: CNS-2016

Open Access

BMC Neuroscience 2016, 17(Suppl 1):54

DOI: 10.1186/s12868-016-0283-6

Published: 18 August 2016

Table of contents

A1 Functional advantages of cell-type heterogeneity in neural circuits

Tatyana O. Sharpee

A2 Mesoscopic modeling of propagating waves in visual cortex

Alain Destexhe

A3 Dynamics and biomarkers of mental disorders

Mitsuo Kawato

F1 Precise recruitment of spiking output at theta frequencies requires dendritic h-channels in multi-compartment models of oriens-lacunosum/moleculare hippocampal interneurons

Vladislav Sekulić, Frances K. Skinner

F2 Kernel methods in reconstruction of current sources from extracellular potentials for single cells and whole brains

Daniel K. Wójcik, Chaitanya Chintaluri, Dorottya Cserpán, Zoltán Somogyvári

F3 The synchronized periods depend on intracellular transcriptional repression mechanisms in circadian clocks

Jae Kyoung Kim, Zachary P. Kilpatrick, Matthew R. Bennett, Kresimir Josić

O1 Assessing irregularity and coordination of spiking-bursting rhythms in central pattern generators

Irene Elices, David Arroyo, Rafael Levi, Francisco B. Rodriguez, Pablo Varona

O2 Regulation of top-down processing by cortically-projecting parvalbumin positive neurons in basal forebrain

Eunjin Hwang, Bowon Kim, Hio-Been Han, Tae Kim, James T. McKenna, Ritchie E. Brown, Robert W. McCarley, Jee Hyun Choi

O3 Modeling auditory stream segregation, build-up and bistability

James Rankin, Pamela Osborn Popp, John Rinzel

O4 Strong competition between tonotopic neural ensembles explains pitch-related dynamics of auditory cortex evoked fields

Alejandro Tabas, André Rupp, Emili Balaguer-Ballester

O5 A simple model of retinal response to multi-electrode stimulation

Matias I. Maturana, David B. Grayden, Shaun L. Cloherty, Tatiana Kameneva, Michael R. Ibbotson, Hamish Meffin

O6 Noise correlations in area V4 correlate with behavioral performance in a visual discrimination task

Veronika Koren, Timm Lochmann, Valentin Dragoi, Klaus Obermayer

O7 Input-location dependent gain modulation in cerebellar nucleus neurons

Maria Psarrou, Maria Schilstra, Neil Davey, Benjamin Torben-Nielsen, Volker Steuber

O8 Analytic solution of cable energy function for cortical axons and dendrites

Huiwen Ju, Jiao Yu, Michael L. Hines, Liang Chen, Yuguo Yu

O9 C. elegans interactome: interactive visualization of Caenorhabditis elegans worm neuronal network

Jimin Kim, Will Leahy, Eli Shlizerman

O10 Is the model any good? Objective criteria for computational neuroscience model selection

Justas Birgiolas, Richard C. Gerkin, Sharon M. Crook

O11 Cooperation and competition of gamma oscillation mechanisms

Atthaphon Viriyopase, Raoul-Martin Memmesheimer, Stan Gielen

O12 A discrete structure of the brain waves

Yuri Dabaghian, Justin DeVito, Luca Perotti

O13 Direction-specific silencing of the Drosophila gaze stabilization system

Anmo J. Kim, Lisa M. Fenk, Cheng Lyu, Gaby Maimon

O14 What does the fruit fly think about values? A model of olfactory associative learning

Chang Zhao, Yves Widmer, Simon Sprecher, Walter Senn

O15 Effects of ionic diffusion on power spectra of local field potentials (LFP)

Geir Halnes, Tuomo Mäki-Marttunen, Daniel Keller, Klas H. Pettersen, Ole A. Andreassen, Gaute T. Einevoll

O16 Large-scale cortical models towards understanding relationship between brain structure abnormalities and cognitive deficits

Yasunori Yamada

O17 Spatial coarse-graining the brain: origin of minicolumns

Moira L. Steyn-Ross, D. Alistair Steyn-Ross

O18 Modeling large-scale cortical networks with laminar structure

Jorge F. Mejias, John D. Murray, Henry Kennedy, Xiao-Jing Wang

O19 Information filtering by partial synchronous spikes in a neural population

Alexandra Kruscha, Jan Grewe, Jan Benda, Benjamin Lindner

O20 Decoding context-dependent olfactory valence in Drosophila

Laurent Badel, Kazumi Ohta, Yoshiko Tsuchimoto, Hokto Kazama

P1 Neural network as a scale-free network: the role of a hub

B. Kahng

P2 Hemodynamic responses to emotions and decisions using near-infrared spectroscopy optical imaging

Nicoladie D. Tam

P3 Phase space analysis of hemodynamic responses to intentional movement directions using functional near-infrared spectroscopy (fNIRS) optical imaging technique

Nicoladie D. Tam, Luca Pollonini, George Zouridakis

P4 Modeling jamming avoidance of weakly electric fish

Jaehyun Soh, DaeEun Kim

P5 Synergy and redundancy of retinal ganglion cells in prediction

Minsu Yoo, S. E. Palmer

P6 A neural field model with a third dimension representing cortical depth

Viviana Culmone, Ingo Bojak

P7 Network analysis of a probabilistic connectivity model of the Xenopus tadpole spinal cord

Andrea Ferrario, Robert Merrison-Hort, Roman Borisyuk

P8 The recognition dynamics in the brain

Chang Sub Kim

P9 Multivariate spike train analysis using a positive definite kernel

Taro Tezuka

P10 Synchronization of burst periods may govern slow brain dynamics during general anesthesia

Pangyu Joo

P11 The ionic basis of heterogeneity affects stochastic synchrony

Young-Ah Rho, Shawn D. Burton, G. Bard Ermentrout, Jaeseung Jeong, Nathaniel N. Urban

P12 Circular statistics of noise in spike trains with a periodic component

Petr Marsalek

P14 Representations of directions in EEG-BCI using Gaussian readouts

Hoon-Hee Kim, Seok-hyun Moon, Do-won Lee, Sung-beom Lee, Ji-yong Lee, Jaeseung Jeong

P15 Action selection and reinforcement learning in basal ganglia during reaching movements

Yaroslav I. Molkov, Khaldoun Hamade, Wondimu Teka, William H. Barnett, Taegyo Kim, Sergey Markin, Ilya A. Rybak

P17 Axon guidance: modeling axonal growth in T-Junction assay

Csaba Forro, Harald Dermutz, László Demkó, János Vörös

P19 Transient cell assembly networks encode persistent spatial memories

Yuri Dabaghian, Andrey Babichev

P20 Theory of population coupling and applications to describe high order correlations in large populations of interacting neurons

Haiping Huang

P21 Design of biologically-realistic simulations for motor control

Sergio Verduzco-Flores

P22 Towards understanding the functional impact of the behavioural variability of neurons

Filipa Dos Santos, Peter Andras

P23 Different oscillatory dynamics underlying gamma entrainment deficits in schizophrenia

Christoph Metzner, Achim Schweikard, Bartosz Zurowski

P24 Memory recall and spike frequency adaptation

James P. Roach, Leonard M. Sander, Michal R. Zochowski

P25 Stability of neural networks and memory consolidation preferentially occur near criticality

Quinton M. Skilling, Nicolette Ognjanovski, Sara J. Aton, Michal Zochowski

P26 Stochastic Oscillation in Self-Organized Critical States of Small Systems: Sensitive Resting State in Neural Systems

Sheng-Jun Wang, Guang Ouyang, Jing Guang, Mingsha Zhang, K. Y. Michael Wong, Changsong Zhou

P27 NeuroField: a C++ library for fast simulation of 2D neural field models

Peter A. Robinson, Paula Sanz-Leon, Peter M. Drysdale, Felix Fung, Romesh G. Abeysuriya, Chris J. Rennie, Xuelong Zhao

P28 Action-based grounding: Beyond encoding/decoding in neural code

Yoonsuck Choe, Huei-Fang Yang

P29 Neural computation in a dynamical system with multiple time scales

Yuanyuan Mi, Xiaohan Lin, Si Wu

P30 Maximum entropy models for 3D layouts of orientation selectivity

Joscha Liedtke, Manuel Schottdorf, Fred Wolf

P31 A behavioral assay for probing computations underlying curiosity in rodents

Yoriko Yamamura, Jeffery R. Wickens

P32 Using statistical sampling to balance error function contributions to optimization of conductance-based models

Timothy Rumbell, Julia Ramsey, Amy Reyes, Danel Draguljić, Patrick R. Hof, Jennifer Luebke, Christina M. Weaver

P33 Exploration and implementation of a self-growing and self-organizing neuron network building algorithm

Hu He, Xu Yang, Hailin Ma, Zhiheng Xu, Yuzhe Wang

P34 Disrupted resting state brain network in obese subjects: a data-driven graph theory analysis

Kwangyeol Baek, Laurel S. Morris, Prantik Kundu, Valerie Voon

P35 Dynamics of cooperative excitatory and inhibitory plasticity

Everton J. Agnes, Tim P. Vogels

P36 Frequency-dependent oscillatory signal gating in feed-forward networks of integrate-and-fire neurons

William F. Podlaski, Tim P. Vogels

P37 Phenomenological neural model for adaptation of neurons in area IT

Martin Giese, Pradeep Kuravi, Rufin Vogels

P38 ICGenealogy: towards a common topology of neuronal ion channel function and genealogy in model and experiment

Alexander Seeholzer, William Podlaski, Rajnish Ranjan, Tim Vogels

P39 Temporal input discrimination from the interaction between dynamic synapses and neural subthreshold oscillations

Joaquin J. Torres, Fabiano Baroni, Roberto Latorre, Pablo Varona

P40 Different roles for transient and sustained activity during active visual processing

Bart Gips, Eric Lowet, Mark J. Roberts, Peter de Weerd, Ole Jensen, Jan van der Eerden

P41 Scale-free functional networks of 2D Ising model are highly robust against structural defects: neuroscience implications

Abdorreza Goodarzinick, Mohammad D. Niry, Alireza Valizadeh

P42 High-frequency neurons can facilitate signal propagation in neural networks

Aref Pariz, Shervin S. Parsi, Alireza Valizadeh

P43 Investigating the effect of Alzheimer’s disease related amyloidopathy on gamma oscillations in the CA1 region of the hippocampus

Julia M. Warburton, Lucia Marucci, Francesco Tamagnini, Jon Brown, Krasimira Tsaneva-Atanasova

P44 Long-tailed distributions of inhibitory and excitatory weights in a balanced network with eSTDP and iSTDP

Florence I. Kleberg, Jochen Triesch

P45 Simulation of EMG recording from hand muscle due to TMS of motor cortex

Bahar Moezzi, Nicolangelo Iannella, Natalie Schaworonkow, Lukas Plogmacher, Mitchell R. Goldsworthy, Brenton Hordacre, Mark D. McDonnell, Michael C. Ridding, Jochen Triesch

P46 Structure and dynamics of axon network formed in primary cell culture

Martin Zapotocky, Daniel Smit, Coralie Fouquet, Alain Trembleau

P47 Efficient signal processing and sampling in random networks that generate variability

Sakyasingha Dasgupta, Isao Nishikawa, Kazuyuki Aihara, Taro Toyoizumi

P48 Modeling the effect of riluzole on bursting in respiratory neural networks

Daniel T. Robb, Nick Mellen, Natalia Toporikova

P49 Mapping relaxation training using effective connectivity analysis

Rongxiang Tang, Yi-Yuan Tang

P50 Modeling neuron oscillation of implicit sequence learning

Guangsheng Liang, Seth A. Kiser, James H. Howard, Jr., Yi-Yuan Tang

P51 The role of cerebellar short-term synaptic plasticity in the pathology and medication of downbeat nystagmus

Julia Goncharenko, Neil Davey, Maria Schilstra, Volker Steuber

P52 Nonlinear response of noisy neurons

Sergej O. Voronenko, Benjamin Lindner

P53 Behavioral embedding suggests multiple chaotic dimensions underlie C. elegans locomotion

Tosif Ahamed, Greg Stephens

P54 Fast and scalable spike sorting for large and dense multi-electrode recordings

Pierre Yger, Baptiste Lefebvre, Giulia Lia Beatrice Spampinato, Elric Esposito, Marcel Stimberg and Olivier Marre

P55 Sufficient sampling rates for fast hand motion tracking

Hansol Choi, Min-Ho Song

P56 Linear readout of object manifolds

SueYeon Chung, Dan D. Lee, Haim Sompolinsky

P57 Differentiating models of intrinsic bursting and rhythm generation of the respiratory pre-Bötzinger complex using phase response curves

Ryan S. Phillips, Jeffrey Smith

P58 The effect of inhibitory cell network interactions during theta rhythms on extracellular field potentials in CA1 hippocampus

Alexandra Pierri Chatzikalymniou, Katie Ferguson, Frances K. Skinner

P59 Expansion recoding through sparse sampling in the cerebellar input layer speeds learning

N. Alex Cayco Gajic, Claudia Clopath, R. Angus Silver

P60 A set of curated cortical models at multiple scales on Open Source Brain

Padraig Gleeson, Boris Marin, Sadra Sadeh, Adrian Quintana, Matteo Cantarelli, Salvador Dura-Bernal, William W. Lytton, Andrew Davison, R. Angus Silver

P61 A synaptic story of dynamical information encoding in neural adaptation

Luozheng Li, Wenhao Zhang, Yuanyuan Mi, Dahui Wang, Si Wu

P62 Physical modeling of rule-observant rodent behavior

Youngjo Song, Sol Park, Ilhwan Choi, Jaeseung Jeong, Hee-sup Shin

P64 Predictive coding in area V4 and prefrontal cortex explains dynamic discrimination of partially occluded shapes

Hannah Choi, Anitha Pasupathy, Eric Shea-Brown

P65 Stability of FORCE learning on spiking and rate-based networks

Dongsung Huh, Terrence J. Sejnowski

P66 Stabilising STDP in striatal neurons for reliable fast state recognition in noisy environments

Simon M. Vogt, Arvind Kumar, Robert Schmidt

P67 Electrodiffusion in one- and two-compartment neuron models for characterizing cellular effects of electrical stimulation

Stephen Van Wert, Steven J. Schiff

P68 STDP improves speech recognition capabilities in spiking recurrent circuits parameterized via differential evolution Markov Chain Monte Carlo

Richard Veale, Matthias Scheutz

P69 Bidirectional transformation between dominant cortical neural activities and phase difference distributions

Sang Wan Lee

P70 Maturation of sensory networks through homeostatic structural plasticity

Júlia Gallinaro, Stefan Rotter

P71 Corticothalamic dynamics: structure, number of solutions and stability of steady-state solutions in the space of synaptic couplings

Paula Sanz-Leon, Peter A. Robinson

P72 Optogenetic versus electrical stimulation of the parkinsonian basal ganglia: a computational study

Leonid L. Rubchinsky, Chung Ching Cheung, Shivakeshavan Ratnadurai-Giridharan

P73 Exact spike-timing distribution reveals higher-order interactions of neurons

Safura Rashid Shomali, Majid Nili Ahmadabadi, Hideaki Shimazaki, S. Nader Rasuli

P74 Neural mechanism of visual perceptual learning using a multi-layered neural network

Xiaochen Zhao, Malte J. Rasch

P75 Inferring collective spiking dynamics from mostly unobserved systems

Jens Wilting, Viola Priesemann

P76 How to infer distributions in the brain from subsampled observations

Anna Levina, Viola Priesemann

P77 Influences of embedding and estimation strategies on the inferred memory of single spiking neurons

Lucas Rudelt, Joseph T. Lizier, Viola Priesemann

P78 A nearest-neighbours based estimator for transfer entropy between spike trains

Joseph T. Lizier, Richard E. Spinney, Mikail Rubinov, Michael Wibral, Viola Priesemann

P79 Active learning of psychometric functions with multinomial logistic models

Ji Hyun Bak, Jonathan Pillow

P81 Inferring low-dimensional network dynamics with variational latent Gaussian process

Yuan Zhao, Il Memming Park

P82 Computational investigation of energy landscapes in the resting state subcortical brain network

Jiyoung Kang, Hae-Jeong Park

P83 Local repulsive interaction between retinal ganglion cells can generate a consistent spatial periodicity of orientation map

Jaeson Jang, Se-Bum Paik

P84 Phase duration of bistable perception reveals intrinsic time scale of perceptual decision under noisy condition

Woochul Choi, Se-Bum Paik

P85 Feedforward convergence between retina and primary visual cortex can determine the structure of orientation map

Changju Lee, Jaeson Jang, Se-Bum Paik

P86 Computational method classifying neural network activity patterns for imaging data

Min Song, Hyeonsu Lee, Se-Bum Paik

P87 Symmetry of spike-timing-dependent plasticity kernels regulates volatility of memory

Youngjin Park, Woochul Choi, Se-Bum Paik

P88 Effects of time-periodic coupling strength on the first-spike latency dynamics of a scale-free network of stochastic Hodgkin-Huxley neurons

Ergin Yilmaz, Veli Baysal, Mahmut Ozer

P89 Spectral properties of spiking responses in V1 and V4 change within the trial and are highly relevant for behavioral performance

Veronika Koren, Klaus Obermayer

P90 Methods for building accurate models of individual neurons

Daniel Saska, Thomas Nowotny

P91 A full size mathematical model of the early olfactory system of honeybees

Ho Ka Chan, Alan Diamond, Thomas Nowotny

P92 Stimulation-induced tuning of ongoing oscillations in spiking neural networks

Christoph S. Herrmann, Micah M. Murray, Silvio Ionta, Axel Hutt, Jérémie Lefebvre

P93 Decision-specific sequences of neural activity in balanced random networks driven by structured sensory input

Philipp Weidel, Renato Duarte, Abigail Morrison

P94 Modulation of tuning induced by abrupt reduction of SST cell activity

Jung H. Lee, Ramakrishnan Iyer, Stefan Mihalas

P95 The functional role of VIP cell activation during locomotion

Jung H. Lee, Ramakrishnan Iyer, Christof Koch, Stefan Mihalas

P96 Stochastic inference with spiking neural networks

Mihai A. Petrovici, Luziwei Leng, Oliver Breitwieser, David Stöckel, Ilja Bytschok, Roman Martel, Johannes Bill, Johannes Schemmel, Karlheinz Meier

P97 Modeling orientation-selective electrical stimulation with retinal prostheses

Timothy B. Esler, Anthony N. Burkitt, David B. Grayden, Robert R. Kerr, Bahman Tahayori, Hamish Meffin

P98 Ion channel noise can explain firing correlation in auditory nerves

Bahar Moezzi, Nicolangelo Iannella, Mark D. McDonnell

P99 Limits of temporal encoding of thalamocortical inputs in a neocortical microcircuit

Max Nolte, Michael W. Reimann, Eilif Muller, Henry Markram

P100 On the representation of arm reaching movements: a computational model

Antonio Parziale, Rosa Senatore, Angelo Marcelli

P101 A computational model for investigating the role of cerebellum in acquisition and retention of motor behavior

Rosa Senatore, Antonio Parziale, Angelo Marcelli

P102 The emergence of semantic categories from a large-scale brain network of semantic knowledge

K. Skiker, M. Maouene

P103 Multiscale modeling of M1 multitarget pharmacotherapy for dystonia

Samuel A. Neymotin, Salvador Dura-Bernal, Alexandra Seidenstein, Peter Lakatos, Terence D. Sanger, William W. Lytton

P104 Effect of network size on computational capacity

Salvador Dura-Bernal, Rosemary J. Menzies, Campbell McLauchlan, Sacha J. van Albada, David J. Kedziora, Samuel Neymotin, William W. Lytton, Cliff C. Kerr

P105 NetPyNE: a Python package for NEURON to facilitate development and parallel simulation of biological neuronal networks

Salvador Dura-Bernal, Benjamin A. Suter, Samuel A. Neymotin, Cliff C. Kerr, Adrian Quintana, Padraig Gleeson, Gordon M. G. Shepherd, William W. Lytton

P107 Inter-areal and inter-regional inhomogeneity in co-axial anisotropy of Cortical Point Spread in human visual areas

Juhyoung Ryu, Sang-Hun Lee

P108 Two Bayesian quanta of uncertainty explain the temporal dynamics of cortical activity in the non-sensory areas during bistable perception

Joonwon Lee, Sang-Hun Lee

P109 Optimal and suboptimal integration of sensory and value information in perceptual decision making

Hyang Jung Lee, Sang-Hun Lee

P110 A Bayesian algorithm for phoneme perception and its neural implementation

Daeseob Lim, Sang-Hun Lee

P111 Complexity of EEG signals is reduced during unconsciousness induced by ketamine and propofol

Jisung Wang, Heonsoo Lee

P112 Self-organized criticality of neural avalanche in a neural model on complex networks

Nam Jung, Le Anh Quang, Seung Eun Maeng, Tae Ho Lee, Jae Woo Lee

P113 Dynamic alterations in connection topology of the hippocampal network during ictal-like epileptiform activity in an in vitro rat model

Chang-hyun Park, Sora Ahn, Jangsup Moon, Yun Seo Choi, Juhee Kim, Sang Beom Jun, Seungjun Lee, Hyang Woon Lee

P114 Computational model to replicate seizure suppression effect by electrical stimulation

Sora Ahn, Sumin Jo, Eunji Jun, Suin Yu, Hyang Woon Lee, Sang Beom Jun, Seungjun Lee

P115 Identifying excitatory and inhibitory synapses in neuronal networks from spike trains using sorted local transfer entropy

Felix Goetze, Pik-Yin Lai

P116 Neural network model for obstacle avoidance based on neuromorphic computational model of boundary vector cell and head direction cell

Seonghyun Kim, Jeehyun Kwag

P117 Dynamic gating of spike pattern propagation by Hebbian and anti-Hebbian spike timing-dependent plasticity in excitatory feedforward network model

Hyun Jae Jang, Jeehyun Kwag

P118 Inferring characteristics of input correlations of cells exhibiting up-down state transitions in the rat striatum

Marko Filipović, Ramon Reig, Ad Aertsen, Gilad Silberberg, Arvind Kumar

P119 Graph properties of the functional connected brain under the influence of Alzheimer’s disease

Claudia Bachmann, Simone Buttler, Heidi Jacobs, Kim Dillen, Gereon R. Fink, Juraj Kukolja, Abigail Morrison

P120 Learning sparse representations in the olfactory bulb

Daniel Kepple, Hamza Giaffar, Dima Rinberg, Steven Shea, Alex Koulakov

P121 Functional classification of homologous basal-ganglia networks

Jyotika Bahuguna, Tom Tetzlaff, Abigail Morrison, Arvind Kumar, Jeanette Hellgren Kotaleski

P122 Short term memory based on multistability

Tim Kunze, Andre Peterson, Thomas Knösche

P123 A physiologically plausible, computationally efficient model and simulation software for mammalian motor units

Minjung Kim, Hojeong Kim

P125 Decoding laser-induced somatosensory information from EEG

Ji Sung Park, Ji Won Yeon, Sung-Phil Kim

P126 Phase synchronization of alpha activity for EEG-based personal authentication

Jae-Hwan Kang, Chungho Lee, Sung-Phil Kim

P129 Investigating phase-lags in sEEG data using spatially distributed time delays in a large-scale brain network model

Andreas Spiegler, Spase Petkoski, Matias J. Palva, Viktor K. Jirsa

P130 Epileptic seizures in the unfolding of a codimension-3 singularity

Maria L. Saggio, Silvan F. Siep, Andreas Spiegler, William C. Stacey, Christophe Bernard, Viktor K. Jirsa

P131 Incremental dimensional exploratory reasoning in a multi-dimensional environment

Oh-hyeon Choung, Yong Jeong

P132 A low-cost model of eye movements and memory in personal visual cognition

Yong-il Lee, Jaeseung Jeong

P133 Complex network analysis of structural connectome of autism spectrum disorder patients

Su Hyun Kim, Mir Jeong, Jaeseung Jeong

P134 Cognitive motives and the neural correlates underlying human social information transmission, gossip

Jeungmin Lee, Jaehyung Kwon, Jerald D. Kralik, Jaeseung Jeong

P135 EEG hyperscanning detects neural oscillation for the social interaction during the economic decision-making

Jaehwan Jahng, Dong-Uk Hwang, Jaeseung Jeong

P136 Detecting purchase decision based on hyperfrontality of the EEG

Jae-Hyung Kwon, Sang-Min Park, Jaeseung Jeong

P137 Vulnerability-based critical neurons, synapses, and pathways in the Caenorhabditis elegans connectome

Seongkyun Kim, Hyoungkyu Kim, Jerald D. Kralik, Jaeseung Jeong

P138 Motif analysis reveals functionally asymmetrical neurons in C. elegans

Pyeong Soo Kim, Seongkyun Kim, Hyoungkyu Kim, Jaeseung Jeong

P139 Computational approach to preference-based serial decision dynamics: do temporal discounting and working memory affect it?

Sangsup Yoon, Jaehyung Kwon, Sewoong Lim, Jaeseung Jeong

P141 Social stress induced neural network reconfiguration affects decision making and learning in zebrafish

Choongseok Park, Thomas Miller, Katie Clements, Sungwoo Ahn, Eoon Hye Ji, Fadi A. Issa

P142 Descriptive, generative, and hybrid approaches for neural connectivity inference from neural activity data

JeongHun Baek, Shigeyuki Oba, Junichiro Yoshimoto, Kenji Doya, Shin Ishii

P145 Divergent-convergent synaptic connectivities accelerate coding in multilayered sensory systems

Thiago S. Mosqueiro, Martin F. Strube-Bloss, Brian Smith, Ramon Huerta

P146 Swinging networks

Michal Hadrava, Jaroslav Hlinka

P147 Inferring dynamically relevant motifs from oscillatory stimuli: challenges, pitfalls, and solutions

Hannah Bos, Moritz Helias

P148 Spatiotemporal mapping of brain network dynamics during cognitive tasks using magnetoencephalography and deep learning

Charles M. Welzig, Zachary J. Harper

P149 Multiscale complexity analysis for the segmentation of MRI images

Won Sup Kim, In-Seob Shin, Hyeon-Man Baek, Seung Kee Han

P150 A neuro-computational model of emotional attention

René Richter, Julien Vitay, Frederick Beuth, Fred H. Hamker

P151 Multi-site delayed feedback stimulation in parkinsonian networks

Kelly Toppin, Yixin Guo

P152 Bistability in Hodgkin–Huxley-type equations

Tatiana Kameneva, Hamish Meffin, Anthony N. Burkitt, David B. Grayden

P153 Phase changes in postsynaptic spiking due to synaptic connectivity and short term plasticity: mathematical analysis of frequency dependency

Mark D. McDonnell, Bruce P. Graham

P154 Quantifying resilience patterns in brain networks: the importance of directionality

Penelope J. Kale, Leonardo L. Gollo

P155 Dynamics of rate-model networks with separate excitatory and inhibitory populations

Merav Stern, L. F. Abbott

P156 A model for multi-stable dynamics in action recognition modulated by integration of silhouette and shading cues

Leonid A. Fedorov, Martin A. Giese

P157 Spiking model for the interaction between action recognition and action execution

Mohammad Hovaidi Ardestani, Martin Giese

P158 Surprise-modulated belief update: how to learn within changing environments?

Mohammad Javad Faraji, Kerstin Preuschoff, Wulfram Gerstner

P159 A fast, stochastic and adaptive model of auditory nerve responses to cochlear implant stimulation

Margriet J. van Gendt, Jeroen J. Briaire, Randy K. Kalkman, Johan H. M. Frijns

P160 Quantitative comparison of graph theoretical measures of simulated and empirical functional brain networks

Won Hee Lee, Sophia Frangou

P161 Determining discriminative properties of fMRI signals in schizophrenia using highly comparative time-series analysis

Ben D. Fulcher, Patricia H. P. Tran, Alex Fornito

P162 Emergence of narrowband LFP oscillations from completely asynchronous activity during seizures and high-frequency oscillations

Stephen V. Gliske, William C. Stacey, Eugene Lim, Katherine A. Holman, Christian G. Fink

P163 Neuronal diversity in structure and function: cross-validation of anatomical and physiological classification of retinal ganglion cells in the mouse

Jinseop S. Kim, Shang Mu, Kevin L. Briggman, H. Sebastian Seung, the EyeWirers

P164 Analysis and modelling of transient firing rate changes in area MT in response to rapid stimulus feature changes

Detlef Wegener, Lisa Bohnenkamp, Udo A. Ernst

P165 Step-wise model fitting accounting for high-resolution spatial measurements: construction of a layer V pyramidal cell model with reduced morphology

Tuomo Mäki-Marttunen, Geir Halnes, Anna Devor, Christoph Metzner, Anders M. Dale, Ole A. Andreassen, Gaute T. Einevoll

P166 Contributions of schizophrenia-associated genes to neuron firing and cardiac pacemaking: a polygenic modeling approach

Tuomo Mäki-Marttunen, Glenn T. Lines, Andy Edwards, Aslak Tveito, Anders M. Dale, Gaute T. Einevoll, Ole A. Andreassen

P167 Local field potentials in a 4 × 4 mm² multi-layered network model

Espen Hagen, Johanna Senk, Sacha J. van Albada, Markus Diesmann

P168 A spiking network model explains multi-scale properties of cortical dynamics

Maximilian Schmidt, Rembrandt Bakker, Kelly Shen, Gleb Bezgin, Claus-Christian Hilgetag, Markus Diesmann, Sacha Jennifer van Albada

P169 Using joint weight-delay spike-timing dependent plasticity to find polychronous neuronal groups

Haoqi Sun, Olga Sourina, Guang-Bin Huang, Felix Klanner, Cornelia Denk

P170 Tensor decomposition reveals RSNs in simulated resting state fMRI

Katharina Glomb, Adrián Ponce-Alvarez, Matthieu Gilson, Petra Ritter, Gustavo Deco

P171 Getting in the groove: testing a new model-based method for comparing task-evoked vs resting-state activity in fMRI data on music listening

Matthieu Gilson, Maria AG Witek, Eric F. Clarke, Mads Hansen, Mikkel Wallentin, Gustavo Deco, Morten L. Kringelbach, Peter Vuust

P172 STochastic engine for pathway simulation (STEPS) on massively parallel processors

Guido Klingbeil, Erik De Schutter

P173 Toolkit support for complex parallel spatial stochastic reaction–diffusion simulation in STEPS

Weiliang Chen, Erik De Schutter

P174 Modeling the generation and propagation of Purkinje cell dendritic spikes caused by parallel fiber synaptic input

Yunliang Zang, Erik De Schutter

P175 Dendritic morphology determines how dendrites are organized into functional subunits

Sungho Hong, Akira Takashima, Erik De Schutter

P176 A model of Ca2+/calmodulin-dependent protein kinase II activity in long term depression at Purkinje cells

Criseida Zamora, Andrew R. Gallimore, Erik De Schutter

P177 Reward-modulated learning of population-encoded vectors for insect-like navigation in embodied agents

Dennis Goldschmidt, Poramate Manoonpong, Sakyasingha Dasgupta

P178 Data-driven neural models part II: connectivity patterns of human seizures

Philippa J. Karoly, Dean R. Freestone, Daniel Soudry, Levin Kuhlmann, Liam Paninski, Mark Cook

P179 Data-driven neural models part I: state and parameter estimation

Dean R. Freestone, Philippa J. Karoly, Daniel Soudry, Levin Kuhlmann, Mark Cook

P180 Spectral and spatial information processing in human auditory streaming

Jaejin Lee, Yonatan I. Fishman, Yale E. Cohen

P181 A tuning curve for the global effects of local perturbations in neural activity: Mapping the systems-level susceptibility of the brain

Leonardo L. Gollo, James A. Roberts, Luca Cocchi

P182 Diverse homeostatic responses to visual deprivation mediated by neural ensembles

Yann Sweeney, Claudia Clopath

P183 Opto-EEG: a novel method for investigating functional connectome in mouse brain based on optogenetics and high density electroencephalography

Soohyun Lee, Woo-Sung Jung, Jee Hyun Choi

P184 Biphasic responses of frontal gamma network to repetitive sleep deprivation during REM sleep

Bowon Kim, Youngsoo Kim, Eunjin Hwang, Jee Hyun Choi

P185 Brain-state correlate and cortical connectivity for frontal gamma oscillations in top-down fashion assessed by auditory steady-state response

Younginha Jung, Eunjin Hwang, Yoon-Kyu Song, Jee Hyun Choi

P186 Neural field model of localized orientation selective activation in V1

James Rankin, Frédéric Chavane

P187 An oscillatory network model of Head direction and Grid cells using locomotor inputs

Karthik Soman, Vignesh Muralidharan, V. Srinivasa Chakravarthy

P188 A computational model of hippocampus inspired by the functional architecture of basal ganglia

Karthik Soman, Vignesh Muralidharan, V. Srinivasa Chakravarthy

P189 A computational architecture to model the microanatomy of the striatum and its functional properties

Sabyasachi Shivkumar, Vignesh Muralidharan, V. Srinivasa Chakravarthy

P190 A scalable cortico-basal ganglia model to understand the neural dynamics of targeted reaching

Vignesh Muralidharan, Alekhya Mandali, B. Pragathi Priyadharsini, Hima Mehta, V. Srinivasa Chakravarthy

P191 Emergence of radial orientation selectivity from synaptic plasticity

Catherine E. Davey, David B. Grayden, Anthony N. Burkitt

P192 How do hidden units shape effective connections between neurons?

Braden A. W. Brinkman, Tyler Kekona, Fred Rieke, Eric Shea-Brown, Michael Buice

P193 Characterization of neural firing in the presence of astrocyte-synapse signaling

Maurizio De Pittà, Hugues Berry, Nicolas Brunel

P194 Metastability of spatiotemporal patterns in a large-scale network model of brain dynamics

James A. Roberts, Leonardo L. Gollo, Michael Breakspear

P195 Comparison of three methods to quantify detection and discrimination capacity estimated from neural population recordings

Gary Marsat, Jordan Drew, Phillip D. Chapman, Kevin C. Daly, Samual P. Bradley

P196 Quantifying the constraints for independent evoked and spontaneous NMDA receptor mediated synaptic transmission at individual synapses

Sat Byul Seo, Jianzhong Su, Ege T. Kavalali, Justin Blackwell

P199 Gamma oscillation via adaptive exponential integrate-and-fire neurons

LieJune Shiau, Laure Buhry, Kanishka Basnayake

P200 Visual face representations during memory retrieval compared to perception

Sue-Hyun Lee, Brandon A. Levy, Chris I. Baker

P201 Top-down modulation of sequential activity within packets modeled using avalanche dynamics

Timothée Leleu, Kazuyuki Aihara

Q28 An auto-encoder network realizes sparse features under the influence of desynchronized vascular dynamics

Ryan T. Philips, Karishma Chhabria, V. Srinivasa Chakravarthy

A1 Functional advantages of cell-type heterogeneity in neural circuits

Tatyana O. Sharpee1

1Computational Neurobiology Laboratory, The Salk Institute for Biological Studies, San Diego, CA, USA

Correspondence: Tatyana O. Sharpee - sharpee@snl.salk.edu

BMC Neuroscience 2016, 17(Suppl 1):A1

Neural circuits are notorious for the complexity of their organization. Part of this complexity is related to the number of different cell types that work together to encode stimuli. I will discuss theoretical results that point to functional advantages of splitting neural populations into subtypes, both in feedforward and recurrent networks. These results outline a framework for categorizing neuronal types based on their functional properties. Such a classification scheme could augment existing schemes based on molecular, anatomical, and electrophysiological properties.

A2 Mesoscopic modeling of propagating waves in visual cortex

Alain Destexhe1,2

1UNIC, CNRS, Gif sur Yvette, France; 2The European Institute for Theoretical Neuroscience (EITN), Paris, France

Correspondence: Alain Destexhe - destexhe@unic.cnrs-gif.fr

BMC Neuroscience 2016, 17(Suppl 1):A2

Propagating waves are large-scale phenomena widely seen in the nervous system, in both anesthetized and awake or sleeping states. Recently, the presence of propagating waves at the scale of microns to millimeters was demonstrated in the primary visual cortex (V1) of the macaque monkey. Using a combination of voltage-sensitive dye (VSD) imaging in awake monkey V1 and model-based analysis, we showed that virtually every visual input is followed by a propagating wave (Muller et al., Nat Comm 2014). The wave was confined within V1, and was consistent and repeatable for a given input. Interestingly, two propagating waves always interact in a suppressive fashion and sum sublinearly, in agreement with the general suppressive effects seen in other circumstances in V1 (Bair et al., J Neurosci 2003; Reynaud et al., J Neurosci 2012).

To investigate possible mechanisms for this suppression, we have designed mean-field models that directly integrate the VSD experiments. Because the VSD signal is primarily caused by the summed voltage of all membranes, it represents an ideal case for mean-field models. However, the usual mean-field models are based on neuronal transfer functions such as the well-known sigmoid, or on functions estimated from very simple models. Any error in the transfer function may produce erroneous predictions by the corresponding mean-field model. To remedy this caveat, we have obtained semi-analytic forms of the transfer function of more realistic neuron models. We found that the same mathematical template can capture the transfer function of models ranging from the integrate-and-fire (IF) and adaptive exponential (AdEx) models up to Hodgkin–Huxley (HH) type models, all with conductance-based inputs.
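One common semi-analytic choice for such a template expresses the output rate as an error function of the subthreshold voltage statistics; whether this is the exact template used in this work is not stated in the abstract, and all parameter values below are illustrative placeholders:

```python
import math

def transfer_rate(mu_v, sigma_v, tau_v, v_thr_eff):
    """Output rate (Hz) for a neuron whose membrane potential fluctuates
    with mean mu_v and s.d. sigma_v (V) on an autocorrelation time tau_v (s),
    crossing an effective threshold v_thr_eff (V). The model-specific fits
    enter through the effective threshold."""
    return math.erfc((v_thr_eff - mu_v) / (math.sqrt(2.0) * sigma_v)) / (2.0 * tau_v)

# Illustrative numbers only: depolarizing the mean voltage toward the
# effective threshold increases the predicted rate.
low = transfer_rate(mu_v=-0.065, sigma_v=0.004, tau_v=0.02, v_thr_eff=-0.050)
high = transfer_rate(mu_v=-0.055, sigma_v=0.004, tau_v=0.02, v_thr_eff=-0.050)
print(low < high)  # True
```

The same functional form can be reused across neuron models by refitting only the effective threshold, which is what makes a single template practical for mean-field construction.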

Using these transfer functions we have built “realistic” mean-field models for networks with two populations of neurons, the regular-spiking (RS) excitatory neurons, showing spike frequency adaptation, and the fast-spiking (FS) inhibitory neurons. This mean-field model can reproduce the propagating waves in V1, due to horizontal interactions, as shown previously using IF networks. This mean-field model also reproduced the suppressive interactions between propagating waves. The mechanism of suppression was based on the preferential recruitment of inhibitory cells over excitatory cells by afferent activity, which acted through the conductance-based shunting effect of the two waves onto one another. The suppression was negligible in networks with identical models for excitatory and inhibitory cells (such as IF networks). This suggests that the suppressive effect is a general phenomenon due to the higher excitability of inhibitory neurons in cortex, in line with previous models (Ozeki et al., Neuron 2009).

Work done in collaboration with Yann Zerlaut (UNIC) for modeling, Sandrine Chemla and Frederic Chavane (CNRS, Marseille) for in vivo experiments. Supported by CNRS and the European Commission (Human Brain Project).

A3 Dynamics and biomarkers of mental disorders

Mitsuo Kawato1

1ATR Computational Neuroscience Laboratories, 2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan

Correspondence: Mitsuo Kawato - kawato@hip.atr.co.jp

BMC Neuroscience 2016, 17(Suppl 1):A3

Current diagnoses of mental disorders are made in a categorical way, as exemplified by DSM-5, but such categorical regimes face many difficulties: the high percentage of comorbidities, the use of the same drug for multiple disorders, the lack of any validated animal model, and the fact that no breakthrough drug has been developed in the past 30 years. NIMH started RDoC (Research Domain Criteria) to overcome these problems [1], and some successful results have been obtained, including common genetic risk loci [2] and common neuroanatomical changes for multiple disorders [3], as well as psychosis biotypes [4].

In contrast to the currently dominant molecular biology approach, which basically assumes a one-to-one mapping between genes and disorders, I postulate the following dynamics-based view of psychiatric disorders. Our brain is a nonlinear dynamical system that can generate spontaneous spatiotemporal activities. The dynamical system is characterized by multiple stable attractors, only one of which corresponds to a healthy or typically developed state. The others are pathological states.

The most promising research approach within the above dynamical view is to combine resting-state functional magnetic resonance imaging, machine learning, big data, and sophisticated neurofeedback. Yahata et al. developed an autism spectrum disorder (ASD) biomarker using only 16 of 9730 functional connections; it generalized moderately to schizophrenia but not to MDD or ADHD [5]. Yamashita's regression model, which predicts working-memory ability from functional connections [6], generalized to schizophrenia and reproduced the severity of working-memory deficits across four psychiatric disorders (in preparation).

With the further development of machine learning algorithms and accumulation of reliable datasets, we hope to obtain a comprehensive landscape of many psychiatric and neurodevelopmental disorders. Guided by this full-spectrum structure, a tailor-made neurofeedback therapy should be optimized for each patient [7].

  1. Insel T, Cuthbert B, Garvey M, et al. Research domain criteria (RDoC): toward a new classification framework for research on mental disorders. Am J Psychiatry. 2010;167:748–51.

  2. Cross-Disorder Group of the Psychiatric Genomics Consortium. Identification of risk loci with shared effects on five major psychiatric disorders: a genome-wide analysis. Lancet. 2013;381:1371–9.

  3. Goodkind M, et al. Identification of a common neurobiological substrate for mental illness. JAMA Psychiatry. 2015;72:305–15.

  4. Clementz BA, et al. Identification of distinct psychosis biotypes using brain-based biomarkers. Am J Psychiatry. 2016;173:373–84.

  5. Yahata N, Morimoto J, Hashimoto R, Lisi G, Shibata K, Kawakubo Y, Kuwabara H, Kuroda M, Yamada T, Megumi F, Imamizu H, Nanez JE, Takahashi H, Okamoto Y, Kasai K, Kato N, Sasaki Y, Watanabe T, Kawato M. A small number of abnormal brain connections predicts adult autism spectrum disorder. Nat Commun. 2016;7:11254. doi:10.1038/ncomms11254.

  6. Yamashita M, Kawato M, Imamizu H. Predicting learning plateau of working memory from whole-brain intrinsic network connectivity patterns. Sci Rep. 2015;5:7622. doi:10.1038/srep07622.

  7. ATR Brain Information Communication Research Laboratory Group. DecNef Project. Available at http://www.cns.atr.jp/decnefpro/ (2016).


F1 Precise recruitment of spiking output at theta frequencies requires dendritic h-channels in multi-compartment models of oriens-lacunosum/moleculare hippocampal interneurons

Vladislav Sekulić1,2, Frances K. Skinner1,2,3

1Krembil Research Institute, University Health Network, Toronto, Ontario, Canada, M5T 2S8; 2Department of Physiology, University of Toronto, Toronto, Ontario, Canada, M5S 1A8; 3 Department of Medicine (Neurology), University of Toronto, Toronto, Ontario, Canada, M5T 2S8

Correspondence: Vladislav Sekulić - vlad.sekulic@utoronto.ca

BMC Neuroscience 2016, 17(Suppl 1):F1

The theta rhythm (4–12 Hz) is a prominent network oscillation observed in the mammalian hippocampus and is correlated with spatial navigation and mnemonic processing. Inhibitory interneurons of the hippocampus fire action potentials at specific phases of the theta rhythm, pointing to distinct functional roles of interneurons in shaping this rhythmic activity. One hippocampal interneuron type, the oriens-lacunosum/moleculare (O-LM) cell, provides direct feedback inhibition and regulation of pyramidal cell activity in the CA1 region. O-LM cells express the hyperpolarization-activated, mixed-cation current (Ih) and, in vitro, demonstrate spontaneous firing at theta frequencies that is impaired upon blockade of Ih. Work using dynamic clamp has shown that in the presence of frequency-modulated artificial synaptic inputs, O-LM cells exhibit a spiking resonance at theta frequencies that does not depend on Ih [1]. However, due to the somatic injection limitation of dynamic clamp, that study could not examine the potential contributions of putative dendritic Ih or the integration of dendritically located synaptic inputs. To overcome this, we have used a database of previously developed multi-compartment computational models of O-LM cells [2].

We situated our O-LM cell models in an in vivo-like context by injecting Poisson-based synaptic background activities throughout their dendritic arbors. Excitatory and inhibitory synaptic weights were tuned to produce similar baseline activity prior to modulation of the inhibitory synaptic process at various frequencies (2–30 Hz). We found that models with dendritic inputs expressed enhanced resonant firing at theta frequencies compared to models with somatic inputs. We then performed detailed analyses on the outputs of the models with dendritic inputs to further elucidate these results with respect to Ih distributions. The ability of the models to be recruited at the modulated input frequencies was quantified using the rotation number, or average number of spikes across all input cycles. Models with somatodendritic Ih were recruited at >50 % of the input cycles for a wider range of theta frequencies (3–9 Hz) compared to models with somatic Ih only (3–4 Hz). Models with somatodendritic Ih also exhibited a wider range of theta frequencies for which phase-locked output (vector strength >0.75) was observed (4–12 Hz), compared to models with somatic Ih (3–5 Hz). Finally, the phase of firing of models with somatodendritic Ih given 8–10 Hz modulated input was delayed 180–230° relative to the time of release from inhibitory synaptic input.
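The two recruitment measures used above, the rotation number and the vector strength, can be sketched in a few lines. This is a minimal illustration assuming spike times in seconds, not the study's analysis code:

```python
import numpy as np

def vector_strength(spike_times, f):
    """Phase locking of spikes to a periodic input of frequency f (Hz):
    1 = all spikes at one phase of the cycle, 0 = phases uniformly spread."""
    phases = 2.0 * np.pi * f * np.asarray(spike_times)
    return np.abs(np.mean(np.exp(1j * phases)))

def rotation_number(spike_times, f, duration):
    """Average number of spikes per input cycle over the recording."""
    return len(spike_times) / (duration * f)

# Toy example: one spike locked to the same phase of every 8 Hz cycle
f, duration = 8.0, 2.0
spikes = np.arange(0.0, duration, 1.0 / f)
print(vector_strength(spikes, f))            # ~1.0 (perfectly phase-locked)
print(rotation_number(spikes, f, duration))  # 1.0 (recruited on every cycle)
```

A rotation number above 0.5 then corresponds to the ">50 % of input cycles" criterion, and a vector strength above 0.75 to the phase-locking criterion reported above.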

O-LM cells receive phasic inhibitory inputs at theta frequencies from a subpopulation of parvalbumin-positive GABAergic interneurons in the medial septum (MS) timed to the peak of hippocampal theta, as measured in the stratum pyramidale layer [3]. Furthermore, O-LM cells fire at the trough of hippocampal pyramidal-layer theta in vivo [4], an approximate 180° phase delay from the MS inputs, corresponding to the phase delay in our models with somatodendritic Ih. Our results suggest that, given dendritic synaptic inputs, O-LM cells require somatodendritic Ih channel expression to be precisely recruited during the trough of hippocampal theta activity. Our strategy of leveraging model databases that encompass experimental cell-type specificity and variability allowed us to reveal critical biophysical factors that contribute to neuronal function within in vivo-like contexts.

Acknowledgements: Supported by NSERC of Canada, an Ontario Graduate Scholarship, and the SciNet HPC Consortium.

  1. Kispersky TJ, Fernandez FR, Economo MN, White JA. Spike resonance properties in hippocampal O-LM cells are dependent on refractory dynamics. J Neurosci. 2012;32(11):3637–51.

  2. Sekulić V, Lawrence JJ, Skinner FK. Using multi-compartment ensemble modeling as an investigative tool of spatially distributed biophysical balances: application to hippocampal oriens-lacunosum/moleculare (O-LM) cells. PLoS One. 2014;9(10):e106567.

  3. Borhegyi Z, Varga V, Szilágyi, Fabo D, Freund TF. Phase segregation of medial septal GABAergic neurons during hippocampal theta activity. J Neurosci. 2004;24(39):8470–9.

  4. Varga C, Golshani P, Soltesz I. Frequency-invariant temporal ordering of interneuronal discharges during hippocampal oscillations in awake mice. Proc Natl Acad Sci USA. 2012;109(40):E2726–34.


F2 Kernel methods in reconstruction of current sources from extracellular potentials for single cells and the whole brains

Daniel K. Wójcik1, Chaitanya Chintaluri1, Dorottya Cserpán2, Zoltán Somogyvári2

1Department of Neurophysiology, Nencki Institute of Experimental Biology, Warsaw, Poland; 2Department of Theory, Wigner Research Centre for Physics of the Hungarian Academy of Sciences, Budapest, H-1121, Hungary

Correspondence: Daniel K. Wójcik - d.wojcik@nencki.gov.pl

BMC Neuroscience 2016, 17(Suppl 1):F2

Extracellular recordings of electric potential, with a century-old history, remain a popular tool for investigations of brain activity on all scales, from single neurons, through populations, to whole brains, in animals and humans, in vitro and in vivo [1]. The specific information available in a recording depends on the physical settings of the system (brain + electrode). Smaller electrodes are usually more selective and are used to capture local information (spikes from single cells or LFP from populations), while larger electrodes are used for subdural recordings (on the cortex, ECoG), on the scalp (EEG), but also as depth electrodes in humans (called SEEG). The advantages of the extracellular electric potential are the ease of recording and its stability. Its problem is interpretation: since the electric field is long-range, one can observe neural activity several millimeters from its source [2–4]. As a consequence, every recording reflects the activity of many cells, populations and regions, depending on the level on which we focus. One way to overcome this problem is to reconstruct the distribution of current sources (CSD) underlying the measurement [5]; this is typically done to identify activity at the systems level from multiple LFP recordings on regular grids [6].

We recently proposed a kernel-based method of CSD estimation from multiple LFP recordings made with arbitrarily placed probes (i.e. not necessarily on a grid), which we called the kernel Current Source Density method (kCSD) [7]. In this overview we present the original proposition as well as two recent developments, skCSD (single-cell kCSD) and kESI (kernel Electrophysiological Source Imaging). skCSD assumes that we know which part of the recorded signal comes from a given cell and that we have access to the morphology of the cell. This could be achieved by patching a cell, driving it externally while recording the potential on a multielectrode array, injecting a dye, and reconstructing the morphology. In this case we know that the sources must be located on the cell, and this information can be used to constrain the estimation. In kESI we consider simultaneous recordings with subdural ECoG (strip and grid electrodes) and with depth electrodes (SEEG). Such recordings are taken in some epileptic patients prepared for surgical removal of the epileptogenic zone. When an MR scan of the patient's head is taken, and the positions of the electrodes as well as the brain's shape are known, the idea of kCSD can be used to constrain the possible distribution of sources, facilitating localization of the foci.
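The kernel step at the core of kCSD can be sketched in one spatial dimension. The basis widths, the smoothed point-source forward model, and the regularization below are illustrative choices for exposition, not the published implementation:

```python
import numpy as np

def gauss(x, c, w=0.1):
    """Gaussian basis source of width w centred at c."""
    return np.exp(-((x - c) ** 2) / (2.0 * w ** 2))

def forward(x_elec, c, w=0.1, eps=0.05):
    """Potential at x_elec from a basis source at c (crude quadrature of a
    smoothed 1/|x| point-source kernel)."""
    xs = np.linspace(0.0, 1.0, 200)
    dx = xs[1] - xs[0]
    return np.sum(gauss(xs, c, w) / (np.abs(x_elec - xs) + eps)) * dx

centers = np.linspace(0.0, 1.0, 30)   # basis-source centres
x_elec = np.linspace(0.1, 0.9, 8)     # arbitrary (non-grid) electrode sites
x_est = np.linspace(0.0, 1.0, 50)     # positions where the CSD is wanted

B_pot = np.array([[forward(x, c) for c in centers] for x in x_elec])
B_csd = np.array([[gauss(x, c) for c in centers] for x in x_est])

K = B_pot @ B_pot.T        # potential-space kernel (electrode x electrode)
K_cross = B_csd @ B_pot.T  # CSD-to-potential cross-kernel

def kcsd_estimate(V, lam=1e-3):
    """Ridge-regularized kernel estimate of the CSD from potentials V."""
    return K_cross @ np.linalg.solve(K + lam * np.eye(len(V)), V)

# Example: potentials generated by a single source at x = 0.5; the
# estimated CSD should peak near the true source location.
V = np.array([forward(x, 0.5) for x in x_elec])
csd = kcsd_estimate(V)
print(x_est[np.argmax(csd)])
```

The same structure carries over to skCSD (basis sources restricted to the cell morphology) and kESI (a realistic head-geometry forward model in place of the 1/|x| kernel).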

Acknowledgements: Polish Ministry for Science and Higher Education (grant 2948/7.PR/2013/2), Hungarian Scientific Research Fund (Grant OTKA K113147), National Science Centre, Poland (Grant 2015/17/B/ST7/04123).

  1. Buzsáki G, Anastassiou CA, Koch C. The origin of extracellular fields and currents—EEG, ECoG, LFP and spikes. Nat Rev Neurosci. 2012;13:407–20.

  2. Hunt MJ, Falinska M, Łęski S, Wójcik DK, Kasicki S. Differential effects produced by ketamine on oscillatory activity recorded in the rat hippocampus, dorsal striatum and nucleus accumbens. J Psychopharmacol. 2011;25:808–21.

  3. Lindén H, Tetzlaff T, Potjans TC, Pettersen KH, Gruen S, Diesmann M, Einevoll GT. Modeling the spatial reach of the LFP. Neuron. 2011;72:859–72.

  4. Łęski S, Lindén H, Tetzlaff T, Pettersen KH, Einevoll GT. Frequency dependence of signal power and spatial reach of the local field potential. PLoS Comput Biol. 2013;9:e1003137.

  5. Wójcik DK. Current source density (CSD) analysis. In: Jaeger D, Jung R, editors. Encyclopedia of computational neuroscience. SpringerReference. Berlin: Springer; 2013.

  6. Mitzdorf U. Current source-density method and application in cat cerebral cortex: investigation of evoked potentials and EEG phenomena. Physiol Rev. 1985;65:37–100.

  7. Potworowski J, Jakuczun W, Łęski S, Wójcik DK. Kernel current source density method. Neural Comput. 2012;24:541–75.


F3 The synchronized periods depend on intracellular transcriptional repression mechanisms in circadian clocks

Jae Kyoung Kim1, Zachary P. Kilpatrick2, Matthew R. Bennett3, Kresimir Josić2,4

1Department of Mathematical Sciences, KAIST, Daejeon 34141, Republic of Korea; 2Department of Mathematics, University of Houston, Houston, TX 77004, USA; 3Department of Biochemistry and Cell Biology and Institute of Biosciences and Bioengineering, Rice University, Houston, TX 77005, USA; 4Department of Biology and Biochemistry, University of Houston, Houston, TX 77004, USA

Correspondence: Jae Kyoung Kim - jaekkim@kaist.ac.kr

BMC Neuroscience 2016, 17(Suppl 1):F3

In mammals, circadian (~24 h) rhythms are mainly regulated by a master circadian clock located in the suprachiasmatic nucleus (SCN) [1]. The SCN consists of ~20,000 neurons, each of which generates its own rhythms via an intracellular transcriptional negative feedback loop involving PER-CRY and BMAL1-CLOCK. The individual rhythms of these neurons are synchronized through intercellular coupling via neurotransmitters, including VIP [2]. In this talk, I will discuss how the period of the rhythm synchronized via the coupling signal depends strongly on the mechanism of intracellular transcriptional repression [3–4]. Specifically, using mathematical modeling and phase response curve analysis, we find that the synchronized period of the SCN stays close to the population mean of the cells' intrinsic periods (~24 h) if transcriptional repression occurs via protein sequestration. However, the synchronized period is far from the population mean when repression occurs via Hill-type regulation (e.g. phosphorylation-based repression). These results reveal a novel relationship between the two major functions of the SCN: intracellular rhythm generation and intercellular synchronization of rhythms. Furthermore, this relationship provides an explanation for why protein sequestration is commonly used in the circadian clocks of multicellular organisms, which have a coupled master clock, but not in those of unicellular organisms [4].
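The contrast between the two repression mechanisms can be made concrete. Under tight 1:1 sequestration the free activator is given exactly by the binding quadratic and behaves like a sharp threshold, max(A − R, 0), whereas Hill-type repression is a graded function of repressor level. The parameter values below are illustrative, not fitted clock parameters:

```python
import math

def hill_repression(R, K=0.1, n=2):
    """Fraction of transcription remaining under Hill-type repression."""
    return 1.0 / (1.0 + (R / K) ** n)

def sequestration_free_activator(A, R, Kd=1e-4):
    """Free activator when repressor R sequesters activator A into an
    inactive 1:1 complex with dissociation constant Kd (exact solution
    of the binding quadratic)."""
    b = A - R - Kd
    return 0.5 * (b + math.sqrt(b * b + 4.0 * A * Kd))

# With tight binding (small Kd), free activator ~ max(A - R, 0):
# transcription shuts off sharply as repressor crosses the activator
# level, the switch-like threshold relevant to the results above.
A = 1.0
print(round(sequestration_free_activator(A, 0.5), 3))  # ~0.5
print(sequestration_free_activator(A, 1.5) < 0.01)     # True
```

Swapping one of these nonlinearities for the other in the feedback loop is what changes how the coupled network's synchronized period relates to the mean of the intrinsic periods.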

Acknowledgements: This work was funded by the National Institutes of Health, through the joint National Science Foundation/National Institute of General Medical Sciences Mathematical Biology Program grant No. R01GM104974 (to M.R.B. and K.J.), National Science Foundation grants Nos. DMS-1311755 (to Z.P.K.) and DMS-1122094 (to K.J.), the Robert A. Welch Foundation grant No. C-1729 (to M.R.B.), National Science Foundation grant No. DMS-0931642 to the Mathematical Biosciences Institute (to J.K.K.), KAIST Research Allowance Grant G04150020 (to J.K.K) and the TJ Park Science Fellowship of POSCO TJ Park Foundation G01160001 (to J.K.K).

  1. Dibner C, Schibler U, Albrecht U. The mammalian circadian timing system: organization and coordination of central and peripheral clocks. Annu Rev Physiol. 2010;72:517–49.

  2. Welsh DK, Takahashi JS, Kay SA. Suprachiasmatic nucleus: cell autonomy and network properties. Annu Rev Physiol. 2010;72:551.

  3. Kim JK, Kilpatrick ZP, Bennett MR, Josić K. Molecular mechanisms that regulate the coupled period of the mammalian circadian clock. Biophys J. 2014;106(9):2071–81.

  4. Kim JK. Protein sequestration vs Hill-type repression in circadian clock models (in revision).


O1 Assessing irregularity and coordination of spiking-bursting rhythms in central pattern generators

Irene Elices1, David Arroyo1, Rafael Levi1,2, Francisco B. Rodriguez1, Pablo Varona1

1Grupo de Neurocomputación Biológica, Dpto. de Ingeniería Informática, Escuela Politécnica Superior, Universidad Autónoma de Madrid, Spain; 2Department of Biological Sciences, University of Southern California, CA, USA

Correspondence: Irene Elices - irene.elices@uam.es

BMC Neuroscience 2016, 17(Suppl 1):O1

Found in all nervous systems, central pattern generators (CPGs) are neural circuits that produce flexible rhythmic motor patterns. Their robust and highly coordinated spatio-temporal activity is generated in the absence of rhythmic input. Several invertebrate CPGs are among the best known neural circuits, as their neurons and connections have been identified and mapped. The crustacean pyloric CPG is one of these flagship neural networks [1, 2]. Experimental and computational studies of CPGs typically examine their rhythmic output in periodic spiking-bursting regimes. Aiming to understand the fast rhythm negotiation of CPG neurons, here we present experimental and theoretical analyses of pyloric CPG activity in situations where irregular yet coordinated rhythms are produced. In particular, we focus on two sources of rhythm irregularity: intrinsic damage in the preparation, and irregularity induced by ethanol. The analysis of non-periodic regimes can unveil important properties of the robust dynamics controlling rhythm coordination in this system.

Adult male and female shore crabs (Carcinus maenas) were used for the experimental recordings. The isolated stomatogastric ganglion was kept in Carcinus maenas saline. Membrane potentials were recorded intracellularly from the LP and PD cells, two mutually inhibitory neurons that form a half-center oscillator in the pyloric CPG. Extracellular electrodes allowed monitoring of the overall CPG rhythm. Conductance-based models of the pyloric CPG neurons and their associated graded synapses, as described in [3, 4], were also used in this dual experimental and theoretical study.

Irregularity and coordination of the CPG rhythms were analyzed using measures characterizing the cells’ instantaneous waveform, period, duty cycle, plateau, hyperpolarization and temporal structure of the spiking activity, as well as measures describing instantaneous phases among neurons in the irregular rhythms and their variability. Our results illustrate the strong robustness of the circuit to keep LP/PD phase relationships in intrinsic and induced irregularity conditions while allowing a large variety of burst waveforms, durations and hyperpolarization periods in these neurons. In spite of being electrically coupled to the pacemaker cell of the circuit, the PD neurons showed a wide flexibility to participate with larger burst durations in the CPG rhythm (and larger increase in variability), while the LP neuron was more restricted in sustaining long bursts in the conditions analyzed. The conductance-based models were used to explain the role of asymmetry in the dynamics of the neurons and synapses to shape the irregular activity observed experimentally. Taking into account the overall experimental and model analyses, we discuss the presence of preserved relationships in the non-periodic but coordinated bursting activity of the pyloric CPG, and their role in the fast rhythm negotiating properties of this circuit.

Acknowledgements: We acknowledge support from MINECO DPI2015-65833-P, TIN2014-54580-R, TIN-2012-30883 and ONRG grant N62909-14-1-N279.

  1. Marder E, Calabrese RL. Principles of rhythmic motor pattern generation. Physiol Rev. 1996;76:687–717.

  2. Selverston AI, Rabinovich MI, Abarbanel HDI, Elson R, Szücs A, Pinto RD, Huerta R, Varona P. Reliable circuits from irregular neurons: a dynamical approach to understanding central pattern generators. J Physiol. 2000;94:357–74.

  3. Latorre R, Rodríguez FB, Varona P. Neural signatures: multiple coding in spiking-bursting cells. Biol Cybern. 2006;95:169–83.

  4. Elices I, Varona P. Closed-loop control of a minimal central pattern generator network. Neurocomputing. 2015;170:55–62.


O2 Regulation of top-down processing by cortically-projecting parvalbumin positive neurons in basal forebrain

Eunjin Hwang1, Bowon Kim1,2, Hio-Been Han1,3, Tae Kim4, James T. McKenna5, Ritchie E. Brown5, Robert W. McCarley5, Jee Hyun Choi1,2

1Center for Neuroscience, Korea Institute of Science and Technology, Hwarang-ro 14-gil 5, Seongbuk-gu, Seoul 02792, South Korea; 2Department of Neuroscience, University of Science and Technology, 217 Gajeong-ro, Yuseong-gu, Daejeon 34113, South Korea; 3Department of Psychology, Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 03722, South Korea; 4Department of Psychiatry, Kyung Hee University Hospital at Gangdong, 892, Dongnam-ro, Gangdong-gu, Seoul 05278, South Korea; 5Department of Psychiatry, Veterans Administration Boston Healthcare System and Harvard Medical School, Brockton, MA 02301, USA

Correspondence: Jee Hyun Choi - jeechoi@kist.re.kr

BMC Neuroscience 2016, 17(Suppl 1):O2

Particular behaviors are associated with different spatio-temporal patterns of cortical EEG oscillations. A recent study suggests that the cortically-projecting, parvalbumin-positive (PV+) inhibitory neurons in the basal forebrain (BF) play an important role in the state-dependent control of cortical oscillations, especially ~40 Hz gamma oscillations [1]. However, the cortical topography of the gamma oscillations which are controlled by BF PV+ neurons and their relationship to behavior are unknown. Thus, in this study, we investigated the spatio-temporal patterns and the functional role of the cortical oscillations induced or entrained by BF PV+ neurons by combining optogenetic stimulation of BF PV+ neurons with high-density EEG [2, 3] in channelrhodopsin-2 (ChR2) transduced PV-cre mice. First, we recorded the spatio-temporal responses in the cortex with respect to the stimulation of BF PV+ neurons at various frequencies. The topographic response patterns were distinctively different depending on the stimulation frequencies, and most importantly, stimulation of BF PV+ neurons at 40 Hz (gamma band frequency) induced a preferential enhancement of gamma band oscillations in prefrontal cortex (PFC) with a statistically significant increase in intracortical connectivity within PFC. Second, optogenetic stimulation of BF PV+ neurons was applied while the mice were exposed to auditory stimuli (AS) at 40 Hz. The time delay between optogenetic stimulation and AS was tested and the phase response to the AS was characterized. We found that the phase responses to the click sound in PFC were modulated by the optogenetic stimulation of BF PV+ neurons. More specifically, the advanced activation of BF PV+ neurons by π/2 (6.25 ms) with respect to AS sharpened the phase response to AS in PFC, while the anti-phasic activation (π, 12.5 ms) blunted the phase response. 
Interestingly, like PFC, the primary auditory cortex (A1) also showed sharpened phase response for the π/2 advanced optogenetic BF PV+ neuron activation during AS. Considering that no direct influence of BF PV+ neurons on A1 was apparent in the response to stimulation of BF PV+ neurons alone, the sharpened phase response curve of A1 suggests a top-down influence of the PFC. This result implies that the BF PV+ neurons may participate in regulating the top-down influence that PFC exerts on primary sensory cortices during attentive behaviors, and supports the idea that the modulating activities of BF PV+ neurons might be a potential target for restoring top-down cognitive functions as well as abnormal frontal gamma oscillations associated with psychiatric disorders.

Acknowledgements: This research was supported by the Department of Veterans Affairs, the Korean National Research Council of Science & Technology (No. CRC-15-04-KIST), NIMH R01 MH039683 and Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2015R1D1A1A01059119). The contents of this report do not represent the views of the US Department of Veterans Affairs or the United States government.

  1. Kim T, et al. Cortically projecting basal forebrain parvalbumin neurons regulate cortical gamma band oscillations. Proc Natl Acad Sci. 2015;112(11):3535–40.

  2. Choi JH, et al. High resolution electroencephalography in freely moving mice. J Neurophysiol. 2010;104(3):1825–34.

  3. Lee M, et al. High-density EEG recordings of the freely moving mice using polyimide-based microelectrode. J Vis Exp. 2011;47. http://www.jove.com/details.php?id=2562. doi:10.3791/2562.


O3 Modeling auditory stream segregation, build-up and bistability

James Rankin1, Pamela Osborn Popp1, John Rinzel1,2

1Center for Neural Science, New York University, New York 10003, NY; 2Courant Institute of Mathematical Sciences, New York University, New York 10012, NY

Correspondence: James Rankin - james.rankin@nyu.edu

BMC Neuroscience 2016, 17(Suppl 1):O3

With neuromechanistic modelling and psychoacoustic experiments we study the perceptual dynamics of auditory streaming (cocktail party problem). The stimulus is a sequence of two interleaved tones, A and B in a repeating triplet pattern: ABA_ABA_ (‘_’ is a silent gap). Initially, subjects hear a single integrated pattern, but after some seconds they hear segregated A_A_A_ and _B___B__ streams (build-up of streaming segregation). For long presentations, build-up is followed by irregular alternations between integrated and segregated (auditory bistability). We recently presented [1] the first neuromechanistic model of auditory bistability; it incorporates common competition mechanisms of mutual inhibition, slow adaptation and noise [2]. Our competition network is formulated to reside downstream of primary auditory cortex (A1). Neural responses in macaque A1 to triplet sequences [3] encode stimulus features and provide the inputs to our network (Fig. 1A). In our model recurrent excitation with an NMDA-like timescale links responses across gaps between tones and between triplets. It captures the dynamics of perceptual alternations and the stimulus feature dependence of percept durations. To account for build-up we incorporate early adaptation of A1 responses [3] (Fig. 1B, upper). Early responses in A1 are broadly tuned and do not reflect the frequency difference between the tones; later responses show a clear tonotopic dependence. This adaptation biases the initial percept towards integration, but occurs faster (~0.5 s) than the gradual build-up process (~5–10 s). The low initial probability of segregation gradually builds up to the stable probability of later bistable alternations (Fig. 1B, lower). During build-up, a pause in presentation may cause partial reset to integrated [4]. Our extended model shows this behavior assuming that after a pause A1 responses recover on the timescale of early adaptation. 
Moreover, the modeling results agree with our psychoacoustic experiments (compare filled and open circles in Fig. 1B, lower).
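
The competition ingredients described above (mutual inhibition, slow adaptation, noise) can be illustrated with a minimal two-pool rate model. This is a toy sketch in the spirit of Shpiro et al. [2], with hypothetical parameters, not the triplet-driven network of [1]:

```python
import numpy as np

def simulate_competition(T=60.0, dt=1e-3, seed=0):
    """Two competing pools (hypothetical parameters): mutual inhibition,
    slow adaptation, and noise, in the spirit of Shpiro et al. [2]."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    f = lambda x: 1.0 / (1.0 + np.exp(-10.0 * (x - 0.2)))  # rate nonlinearity
    beta, g, tau, tau_a = 1.2, 1.0, 0.01, 2.0
    I = np.array([0.80, 0.85])          # pool 2 slightly favoured initially
    state = np.array([0.0, 0.05])       # firing rates of the two pools
    a = np.zeros(2)                     # slow adaptation variables
    r = np.zeros((n, 2))
    r[0] = state
    for t in range(1, n):
        drive = I - beta * state[::-1] - g * a   # cross-inhibition + adaptation
        noise = 0.05 * np.sqrt(dt) * rng.standard_normal(2)
        state = np.clip(state + dt / tau * (-state + f(drive)) + noise, 0.0, 1.0)
        a += dt / tau_a * (-a + state)
        r[t] = state
    return r

rates = simulate_competition()
dominant = rates[:, 0] > rates[:, 1]    # which percept wins at each instant
```

With adaptation strong and slow enough, dominance alternates every few seconds, mimicking bistable switching between the integrated and segregated percepts.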
Fig. 1

A Model schematic: tone inputs IA and IB elicit pulsatile responses in A1, which are pooled as inputs to a three-population competition network. Central unit AB encodes integrated, peripheral units A and B encode segregated. Mutual inhibition between units and recurrent excitation are incorporated with adaptation and noise. B A1 inputs show early initial adaptation, also if a pause is present. Build-up function shows proportion segregated increasing over time, here shown for three tone-frequency differences, DF, with no pause (dashed) or with a pause (solid curves). Time-snapshots from model (filled circles) agree with data (empty circles with SEM error bars, N = 8)

Conclusions: For the first time, we offer an explanation of the discrepancy in the timescales of early A1 responses and the more gradual build-up process. Recovery of A1 responses can explain resetting for stimulus pauses. Our model offers, to date, the most complete account of the early and late dynamics for auditory streaming in the triplet paradigm.

  1. Rankin J, Sussman E, Rinzel J. Neuromechanistic model of auditory bistability. PLoS Comput Biol. 2015;11:e1004555.

  2. Shpiro A, Moreno-Bote R, Rubin N, Rinzel J. Balance between noise and adaptation in competition models of perceptual bistability. J Comput Neurosci. 2009;27:37–54.

  3. Micheyl C, Tian B, Carlyon R, Rauschecker J. Perceptual organization of tone sequences in the auditory cortex of awake macaques. Neuron. 2005;48:139–48.

  4. Beauvois MW, Meddis R. Time decay of auditory stream biasing. Percept Psychophys. 1997;59:81–6.


O4 Strong competition between tonotopic neural ensembles explains pitch-related dynamics of auditory cortex evoked fields

Alejandro Tabas1, André Rupp2,†, Emili Balaguer-Ballester1,3,†

1Faculty of Science and Technology, Bournemouth University, Bournemouth, England, UK; 2Heidelberg University, Baden-Württemberg, Germany; 3Bernstein Center for Computational Neuroscience, Heidelberg-Mannheim, Baden-Württemberg, Germany

Correspondence: Alejandro Tabas - atabas@bournemouth.ac.uk

Equal contribution

BMC Neuroscience 2016, 17(Suppl 1):O4

Auditory evoked fields (AEFs) observed in MEG experiments systematically present a transient deflection known as the N100m, elicited around 100 ms after tone onset in the antero-lateral Heschl’s Gyrus. The exact latency of the N100m is correlated with the perceived pitch of a wide range of stimuli [1, 2], suggesting that this transient component reflects the processing of pitch in auditory cortex. However, the biophysical substrate of such a precise relationship remains an enigma. Existing models of pitch, focused on perceptual phenomena, do not explain in biophysical detail the mechanism that generates cortical evoked fields during pitch processing. In this work, we introduce a model of interacting neural ensembles describing, for the first time to our knowledge, how cortical pitch processing gives rise to the observed human neuromagnetic responses and why their latency strongly correlates with pitch.

To provide realistic cortical input, we used a recent model of the auditory periphery and realistic subcortical processing stages. Subcortical processing was based on a delay-and-multiply operation carried out in the cochlear nucleus and inferior colliculus [3], resulting in realistic patterns of neural activation in response to the stimulus periodicities. Subcortical activation is transformed into a tonotopic receptive-field-like representation [4] by a novel cortical circuit composed of functional blocks, each characterised by a best frequency. Each block consists of an excitatory and an inhibitory population, modelled using mean-field approximations [5]. Blocks interact with each other through local AMPA- and NMDA-driven excitation and GABA-driven global inhibition [5].

The excitation-inhibition competition of the cortical model describes a general pitch processing mechanism that explains the N100m deflection as a transient state in the cortical dynamics. The deflection is rapidly triggered by a rise in the activity elicited by the subcortical input, peaks after the inhibition overcomes the input, and stabilises when the model dynamics reach equilibrium, around 100 ms after onset. As a direct consequence of the connectivity structure among blocks, the time necessary for the system to reach equilibrium depends on the encoded pitch of the tone. The model quantitatively predicts the observed latencies of the N100m, in agreement with available empirical data [1, 2] for a series of stimuli (see Fig. 2), suggesting that this mechanism potentially accounts for the N100m dynamics.
Fig. 2

N100m predictions in comparison with available data [1, 2] for a range of pure tones (A) and HCTs (B)

  1. Seither-Preisler A, Patterson R, Krumbholz K, Seither S, Lütkenhöner B. Evidence of pitch processing in the N100m component of the auditory evoked field. Hear Res. 2006;213(1–2):88–98.

  2. Roberts TP, Ferrari P, Stufflebeam SM, Poeppel D. Latency of the auditory evoked neuromagnetic field components: stimulus dependence and insights toward perception. J Clin Neurophysiol. 2000;17(2):114–29.

  3. Meddis R, O’Mard LP. Virtual pitch in a computational physiological model. J Acoust Soc Am. 2006;120(6):3861–9.

  4. Balaguer-Ballester E, Clark N. Understanding pitch perception as a hierarchical process with top-down modulation. PLoS Comput Biol. 2009;5(3):e1000301.

  5. Wong K-F, Wang X-J. A recurrent network mechanism of time integration in perceptual decisions. J Neurosci. 2006;26(4):1314–28.


O5 A simple model of retinal response to multi-electrode stimulation

Matias I. Maturana1,2, David B. Grayden2,3, Shaun L. Cloherty4, Tatiana Kameneva2, Michael R. Ibbotson1,5, Hamish Meffin1,5

1National Vision Research Institute, Australian College of Optometry, 3053, Australia; 2NeuroEngineering Laboratory, Dept. Electrical & Electronic Eng., University of Melbourne, 3010, Australia; 3Centre for Neural Engineering, University of Melbourne, 3010, Australia; 4Department of Physiology, Monash University, 3800, Australia; 5ARC Centre of Excellence for Integrative Brain Function, Department Optometry and Vision Sciences, University of Melbourne, 3010, Australia

Correspondence: Hamish Meffin - hmeffin@unimelb.edu.au

BMC Neuroscience 2016, 17(Suppl 1):O5

Retinal implants can restore vision to patients suffering photoreceptor loss by stimulating surviving retinal ganglion cells (RGCs) via an array of microelectrodes implanted within the eye [1]. However, the acuity offered by existing devices is low, limiting the benefits to patients. Improvements may come by increasing the number of electrodes in new devices and providing patterned vision, which necessitates stimulation using multiple electrodes simultaneously. However, simultaneous stimulation poses a number of problems due to cross-talk between electrodes and uncertainty regarding the resulting activation pattern.

Here, we present a model and methods for estimating the responses of RGCs to simultaneous electrical stimulation. Whole cell in vitro patch clamp recordings were obtained from 25 RGCs with various morphological types in rat retina. The retinae were placed onto an array of 20 stimulating electrodes. Biphasic current pulses with 500 µs phase duration and 50 µs interphase gap were applied simultaneously to all electrodes at a frequency of 10 Hz, with the amplitude of current on each electrode sampled independently from a Gaussian distribution.

A linear-nonlinear model was fit to the responses of each RGC using spike-triggered covariance analysis on 80 % of the recorded data. The analysis revealed a single significant principal component corresponding to the electrical receptive field of each cell, with the second largest principal component having negligible effect on the neural response (Fig. 3a). This indicates that interactions between electrodes are approximately linear in their influence on the cells’ responses.
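
The analysis pipeline (Gaussian multi-electrode stimuli, spike-triggered ensemble, covariance eigen-decomposition) can be sketched as follows, with synthetic data and a hypothetical receptive field standing in for the recordings:

```python
import numpy as np

rng = np.random.default_rng(1)
n_elec, n_pulses = 20, 20000
stim = rng.standard_normal((n_pulses, n_elec))   # Gaussian amplitudes per pulse

# Hypothetical electrical receptive field: a few electrodes near the soma
w = np.zeros(n_elec)
w[[3, 4, 8]] = [1.0, 0.7, 0.4]
drive = stim @ w
spikes = np.abs(drive) > 2.0    # symmetric nonlinearity: net anodic or cathodic

ste = stim[spikes]                                      # spike-triggered ensemble
stc = np.cov(ste, rowvar=False) - np.cov(stim, rowvar=False)
evals, evecs = np.linalg.eigh(stc)
rf = evecs[:, np.argmax(np.abs(evals))]                 # recovered receptive field
anodic = drive[spikes] > 0                              # the two clusters of Fig. 3a
```

Because the toy nonlinearity is symmetric in sign, the spike-triggered average is near zero while the covariance analysis recovers the filter, and the ensemble splits into net-anodic and net-cathodic clusters as in the data.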
Fig. 3

a Spike-triggered covariance showing the full set of stimuli (black dots) projected onto the first two principal components. Stimuli causing a spike formed two clusters: net cathodic first pulses (blue) and net anodic first pulses (red). b Electrical receptive fields superimposed on the electrode array are shown for the cathodic first (blue) and anodic first (red) clusters

Furthermore, the spike-triggered ensemble showed two clusters (red and blue in Fig. 3a) corresponding to stimulation that had a net effect that was either anodic first or cathodic first. The electrical receptive fields for both anodic first and cathodic first stimulation were highly similar (Fig. 3b). They consisted of a small number (1–4) of electrodes that were close to the cell body (green dot).

The remaining 20 % of data were used to validate the model. The average model prediction root-mean-square error was 7 % over the 25 cells. The accuracy of the model indicates that the linear-nonlinear model is appropriate to describe the responses of RGCs to electrical stimulation.

Acknowledgements: This research was supported by the Australian Research Council (ARC). MI, HM, and SC acknowledge support through the Centre of Excellence for Integrative Brain Function (CE140100007), TK through ARC Discovery Early Career Researcher Award (DE120102210) and HM and TK through the ARC Discovery Projects funding scheme (DP140104533).

  1. Hadjinicolaou AE, Meffin H, Maturana M, Cloherty SL, Ibbotson MR. Prosthetic vision: devices, patient outcomes and retinal research. Clin Exp Optom. 2015;98(5):395–410.


O6 Noise correlations in V4 area correlate with behavioral performance in visual discrimination task

Veronika Koren1,2, Timm Lochmann1,2, Valentin Dragoi3, Klaus Obermayer1,2

1Institute of Software Engineering and Theoretical Computer Science, Technische Universitaet Berlin, Berlin, 10587, Germany; 2Bernstein Center for Computational Neuroscience Berlin, Humboldt-Universitaet zu Berlin, Berlin, 10115, Germany; 3Department of Neurobiology and Anatomy, University of Texas-Houston Medical School, Houston, TX 77030, USA

Correspondence: Veronika Koren - veronika.koren@bccn-berlin.de

BMC Neuroscience 2016, 17(Suppl 1):O6

Linking sensory coding and behavior is a fundamental question in neuroscience. We have addressed this issue in behaving monkey visual cortex (areas V1 and V4) while animals were trained to perform a visual discrimination task in which two successive images were either rotated with respect to each other or were the same. We hypothesized that the animal’s performance in the visual discrimination task depends on the quality of stimulus coding in visual cortex. We tested this hypothesis by investigating the functional relevance of neuronal correlations in areas V1 and V4 in relation to behavioral performance. We measured two types of correlations: noise (spike count) correlations and correlations in spike timing. Surprisingly, both methods showed that correct responses are associated with significantly higher correlations in V4, but not V1, during the delay period between the two stimuli. This suggests that pair-wise interactions during the spontaneous activity preceding the arrival of the stimulus set the stage for subsequent stimulus processing and importantly influence behavioral performance.

Experiments were conducted in two adult monkeys that had previously been trained on the task. After 300 ms of fixation, the target stimulus, a naturalistic image, is shown for 300 ms, and after a random delay period (500–1200 ms), a test stimulus is shown for 300 ms. The test can either be identical to the target stimulus (match) or rotated with respect to the target (non-match). The monkey responded by pressing a button and was rewarded for a correct response with fruit juice. Two linear arrays with 16 recording channels each were used to record population activity in areas V1 and V4. The difficulty of the task was calibrated individually to yield 70 % correct responses on average. The analysis was conducted on the non-match condition, comparing activity in trials with correct responses to activity in trials where the monkey responded incorrectly. Noise correlations were assessed as pair-wise correlations of spike counts (method 1) and of spike timing (method 2). For method 1, z-scores of spike counts of binned spike trains were computed in individual trials; r_sc was computed as the Pearson correlation coefficient of the z-scores over all available trials, balanced across the correct/incorrect conditions. For method 2, cross-correlograms were computed, from which cross-correlograms from shuffled trials were subtracted; the resulting function was summed around zero lag and normalized by the sum of the autocorrelograms [1].
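
Method 1 (spike-count noise correlation) can be sketched as follows; the condition-wise z-scoring removes the stimulus-driven mean before the Pearson correlation, and the data here are synthetic, with an injected common noise source:

```python
import numpy as np

def noise_correlation(counts_a, counts_b):
    """Spike-count noise correlation r_sc: z-score counts within each
    stimulus condition (rows), then Pearson-correlate the residuals
    across all trials, as in method 1."""
    za = (counts_a - counts_a.mean(1, keepdims=True)) / counts_a.std(1, keepdims=True)
    zb = (counts_b - counts_b.mean(1, keepdims=True)) / counts_b.std(1, keepdims=True)
    return float(np.corrcoef(za.ravel(), zb.ravel())[0, 1])

# Synthetic check: two units sharing a common noise source across
# 5 conditions x 200 trials (all numbers illustrative).
rng = np.random.default_rng(0)
shared = rng.standard_normal((5, 200))
unit_a = 10.0 + 2.0 * shared + rng.standard_normal((5, 200))
unit_b = 12.0 + 2.0 * shared + rng.standard_normal((5, 200))
r_sc = noise_correlation(unit_a, unit_b)
```

With the covariance structure above, the expected r_sc is 0.8 regardless of the units' different mean rates, since the z-scoring discards the means.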

While firing rates of single units or of the population did not significantly change between correct and incorrect responses, noise correlations during the delay period were significantly higher in V4 pairs, computed with both the r_sc method (p = 0.0005 in monkey 1, sign-rank test) and the r_ccg method (p = 0.0001 and p = 0.0280 in monkeys 1 and 2, respectively; 50 ms integration window). This result is robust to changes in the length of the bin (method 1) and in the length of the summation window (method 2). In agreement with [2], we confirm the importance for performance of the spontaneous activity preceding the stimulus and suggest that higher correlations in V4 might be beneficial for successful read-out and reliable transmission of information downstream.

  1. Bair W, Zohary E, Newsome WT. Correlated firing in macaque visual area MT: time scales and relationship to behavior. J Neurosci. 2001;21(5):1676–97.

  2. Gutnisky DA, Beaman CB, Lew SE, Dragoi V. Spontaneous fluctuations in visual cortical responses influence population coding accuracy. Cereb Cortex. 2016;1–19.

  3. Cohen MR, Maunsell JH. Attention improves performance primarily by reducing interneuronal correlations. Nat Neurosci. 2009;12(12):1594–1600.

  4. Nienborg HR, Cohen MR, Cumming BG. Decision-related activity in sensory neurons: correlations among neurons and with behavior. Annu Rev Neurosci. 2012;35:463–83.


O7 Input-location dependent gain modulation in cerebellar nucleus neurons

Maria Psarrou1, Maria Schilstra1, Neil Davey1, Benjamin Torben-Nielsen1, Volker Steuber1

Centre for Computer Science and Informatics Research, University of Hertfordshire, Hatfield, AL10 9AB, UK

Correspondence: Maria Psarrou - m.psarrou@herts.ac.uk

BMC Neuroscience 2016, 17(Suppl 1):O7

Gain modulation is a brain-wide principle of neuronal computation that describes how neurons integrate inputs from different presynaptic sources. A gain change is a multiplicative operation that is defined as a change in the sensitivity (or slope of the response amplitude) of a neuron to one set of inputs (driving input) which results from the activity of a second set of inputs (modulatory input) [1, 2].

Different cellular and network mechanisms have been proposed to underlie gain modulation [2–4]. It is well established that input features such as synaptic noise and plasticity can contribute to multiplicative gain changes [2–4]. However, the effect of neuronal morphology on gain modulation is relatively unexplored. Neuronal inputs to the soma and dendrites are integrated in a different manner: whilst dendritic saturation can introduce a strong non-linear relationship between dendritic excitation and somatic depolarization, the relationship between somatic excitation and depolarization is more linear. The non-linear integration of dendritic inputs can enhance the multiplicative effect of shunting inhibition in the presence of noise [3].

Neurons in the cerebellar nuclei (CN) provide the main gateway from the cerebellum to the rest of the brain. Understanding how inhibitory inputs from cerebellar Purkinje cells interact with excitatory inputs from mossy fibres to control output from the CN is central to understanding cerebellar computation. In the present study, we investigated the effect of inhibitory modulatory input on CN neuronal output when the excitatory driving input was delivered at different locations in the CN neuron. We used a morphologically realistic conductance-based CN neuron model [5] and examined the change in output gain in the presence of distributed inhibitory input under two conditions: (a) when the excitatory input was confined to one compartment (the soma or a dendritic compartment) and (b) when the excitatory input was distributed across particular dendritic regions at different distances from the soma. For both of these conditions, our results show that the arithmetic operation performed by inhibitory synaptic input depends on the location of the excitatory synaptic input. In the presence of distal dendritic excitatory inputs, the inhibitory input has a multiplicative effect on the CN neuronal output. In contrast, excitatory inputs at the soma or at proximal dendrites close to the soma undergo additive operations in the presence of inhibitory input. Moreover, the amount of multiplicative gain change correlates with the distance of the excitatory inputs from the soma, with increasing distances resulting in increased gain changes and decreased additive shifts along the input axis. These results indicate that the location of synaptic inputs affects in a systematic way whether the input undergoes a multiplicative or additive operation.
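
The multiplicative-versus-additive distinction can be illustrated with a deliberately simple toy, not the conductance-based model of [5]: here shunting inhibition is assumed to act divisively on distal dendritic drive but only as an offset on somatic drive:

```python
import numpy as np

def rate(exc, g_inh, site):
    """Illustrative toy, not the CN model: shunting inhibition is assumed
    to divide distal dendritic drive but merely offset somatic drive."""
    if site == "dendrite":
        drive = exc / (1.0 + g_inh)       # divisive: slope (gain) change
    else:
        drive = exc - 0.5 * g_inh         # subtractive: shift along input axis
    return np.maximum(drive, 0.0)

exc = np.linspace(0.0, 4.0, 9)
# fitted slope of the input-output curve = the neuron's gain
slope = lambda site, g: float(np.polyfit(exc, rate(exc, g, site), 1)[0])
```

With this construction, inhibition halves the gain of the dendritic input-output curve (g_inh = 1) while leaving the somatic curve's slope essentially unchanged, mirroring the multiplicative versus additive operations reported above.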

  1. Salinas E, Sejnowski TJ. Gain modulation in the central nervous system: where behavior, neurophysiology, and computation meet. Neuroscientist. 2001;7(5):430–40.

  2. Silver RA. Neuronal arithmetic. Nat Rev Neurosci. 2010;11(7):474–89.

  3. Prescott SA, De Koninck Y. Gain control of firing rate by shunting inhibition: roles of synaptic noise and dendritic saturation. Proc Natl Acad Sci USA. 2003;100(4):2076–81.

  4. Rothman J, Cathala L, Steuber V, Silver RA. Synaptic depression enables neuronal gain control. Nature. 2009;457:1015–18.

  5. Steuber V, Schultheiss NW, Silver RA, De Schutter E, Jaeger D. Determinants of synaptic integration and heterogeneity in rebound firing explored with data-driven models of deep cerebellar nucleus cells. J Comput Neurosci. 2011;30(3):633–58.


O8 Analytic solution of cable energy function for cortical axons and dendrites

Huiwen Ju1, Jiao Yu2, Michael L. Hines3, Liang Chen4 and Yuguo Yu1

1School of Life Science and the Collaborative Innovation Center for Brain Science, Fudan University, Shanghai, 200438, China; 2Linyi Hospital of Traditional Chinese Medicine, 211 Jiefang Road, Lanshan, Linyi, Shandong Province, 276000, China; 3Department of Neuroscience, Yale University School of Medicine, New Haven, CT 06520, USA; 4Department of Neurosurgery, Huashan Hospital, Shanghai Medical College, Fudan University, Shanghai, China

Correspondence: Yuguo Yu - yuyuguo@fudan.edu.cn

BMC Neuroscience 2016, 17(Suppl 1):O8

Accurate estimation of action potential (AP)-related metabolic cost is essential for understanding energetic constraints on brain connections and signaling processes. Most previous energy estimates of the AP were obtained using the Na+-counting method [1, 2], which seriously limits accurate assessment of metabolic cost of ionic currents that underlie AP generation. Moreover, the effects of axonal geometry and ion channel distribution on energy consumption related to AP propagation have not been systematically investigated.

To address these issues, we return to the cable theory [3] that underlies our HH-type cortical axon model [4], which was constructed based on experimental measurements. Based on the cable equation that describes how ion currents flow along the cable as well as analysis of the electrochemical energy in the equivalent circuit, we derived the electrochemical energy function for the cable model,
$$ \begin{aligned} \frac{{\partial^{2} E}}{\partial x\partial t} & = I_{Na} \left( {V - V_{Na} } \right) + I_{K} \left( {V - V_{K} } \right) + I_{L} \left( {V - V_{L} } \right) - \frac{1}{2\pi a}i_{a} \frac{\partial V}{\partial x} \\ & = g_{Na}^{\hbox{max} } m^{3} h\left( {V\left( {x,t} \right) - V_{Na} } \right)^{2} + g_{K}^{\hbox{max} } n^{4} \left( {V\left( {x,t} \right) - V_{K} } \right)^{2} \\ & \quad + g_{L} \left( {V\left( {x,t} \right) - V_{L} } \right)^{2} + G_{a} \left( {\frac{\partial V}{\partial x}} \right)^{2} \\ \end{aligned} $$
where $g_{Na}^{max}$ (in the range 50–650 mS/cm²), $g_{K}^{max}$ (5–100 mS/cm²), and $g_{L}$ = 0.033 mS/cm² are the maximal sodium, maximal potassium, and leak conductances per unit membrane area, respectively, and $V_{Na}$ = 60 mV, $V_{K}$ = −90 mV, and $V_{L}$ = −70 mV are the reversal potentials of the sodium, potassium, and leak channels, respectively. The gating variables m, h, and n are dimensionless activation and inactivation variables that describe the activation and inactivation processes of the sodium and potassium channels [4]. This equation describes the AP-related energy consumption rate per unit membrane area at any axonal distance and any time. The individual terms on the right-hand side of the equation represent the contributions of the sodium, potassium, leak, and axial currents, respectively. We then employed the cable energy function to calculate energy consumption for unbranched axons and for axons with several degrees of branching (branching level, BL). Calculations based on this function distinguish the contribution of each term to the total energy consumption.
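
Evaluated pointwise, the right-hand side of the energy function is a sum of four non-negative terms. A numerical sketch for one illustrative membrane state (values chosen within the ranges stated above, not taken from the fitted axon model):

```python
# Pointwise evaluation of the energy-rate terms above (illustrative values;
# conductances in mS/cm^2, potentials in mV).
g_na_max, g_k_max, g_l = 120.0, 36.0, 0.033
v_na, v_k, v_l = 60.0, -90.0, -70.0
v, m, h, n = 10.0, 0.6, 0.4, 0.5      # sample membrane state at one (x, t)
g_a, dv_dx = 0.1, -2.0                # axial conductance and voltage gradient

e_na = g_na_max * m**3 * h * (v - v_na) ** 2   # sodium term
e_k = g_k_max * n**4 * (v - v_k) ** 2          # potassium term
e_l = g_l * (v - v_l) ** 2                     # leak term
e_ax = g_a * dv_dx ** 2                        # axial-current term
total = e_na + e_k + e_l + e_ax                # all four terms are non-negative
```

Because every term is a conductance times a squared potential difference, each contribution is non-negative, which is what lets the function apportion total consumption among the sodium, potassium, leak, and axial currents.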

Our analytical approach predicts an inhomogeneous distribution of metabolic cost along an axon with either uniformly or nonuniformly distributed ion channels. The results show that the Na+-counting method severely underestimates energy cost in the cable model by 20–70 %. AP propagation along axons that differ in length may require over 15 % more energy per unit of axon area than that required by a point model. However, actual energy cost can vary greatly depending on axonal branching complexity, ion channel density distributions, and AP conduction states. We also infer that the metabolic rate (i.e. energy consumption rate) of cortical axonal branches as a function of spatial volume exhibits a 3/4 power law relationship.

Acknowledgements: Dr. Yu acknowledges support from the National Natural Science Foundation of China (31271170, 31571070) and the Shanghai Program of Professor of Special Appointment (Eastern Scholar SHH1140004).

  1. Alle H, Roth A, Geiger JR. Energy-efficient action potentials in hippocampal mossy fibers. Science. 2009;325(5946):1405–8.

  2. Carter BC, Bean BP. Sodium entry during action potentials of mammalian neurons: incomplete inactivation and reduced metabolic efficiency in fast-spiking neurons. Neuron. 2009;64(6):898–909.

  3. Rall W. Cable theory for dendritic neurons. In: Methods in neuronal modeling. MIT Press; 1989. p. 9–92.

  4. Yu Y, Hill AP, McCormick DA. Warm body temperature facilitates energy efficient cortical action potentials. PLoS Comput Biol. 2012;8(4):e1002456.


O9 C. elegans interactome: interactive visualization of Caenorhabditis elegans worm neuronal network

Jimin Kim1, Will Leahy2, Eli Shlizerman1,3

1Department of Applied Mathematics, University of Washington, Seattle, WA 98195, USA; 2Amazon.com Inc., Seattle, WA 98108, USA; 3Department of Electrical Engineering, University of Washington, Seattle, WA 98195, USA

Correspondence: Eli Shlizerman - shlizee@uw.edu

BMC Neuroscience 2016, 17(Suppl 1):O9

Modeling neuronal systems involves incorporating two layers: a static map of neural connections (connectome), and biophysical processes that describe neural responses and interactions. Such a model is called the ‘dynome’ of a neuronal system, as it integrates a dynamical system with the static connectome. Being closer to reproducing the activity of a neuronal system, investigation of the dynome has more potential to reveal neuronal pathways of the network than the static connectome alone [1]. However, since the two layers of the dynome are considered simultaneously, novel tools have to be developed for dynome studies. Here we present a visualization methodology, called the ‘interactome’, that allows one to explore the dynome of a neuronal system interactively and in real time, by viewing the dynamics overlaid on a graph representation of the connectome.

We apply our methodology to the nervous system of the Caenorhabditis elegans (C. elegans) worm, whose connectome is almost fully resolved [2] and for which a computational model of neural dynamics and interactions (gap-junctional and synaptic), based on biophysical experimental findings, was recently introduced [3]. Integrated together, the C. elegans dynome defines a unique set of neural dynamics of the worm. To visualize the dynome, we propose a dynamic force-directed graph layout of the connectome. The layout is implemented using the D3 visualization platform [4] and is designed to communicate with an integrator of the dynome. The two-way communication protocol between the layout and the integrator allows for stimulating (injecting current into) any subset of neurons at any time point (Fig. 4B). It also allows for simultaneously viewing the response of the network on top of the layout, visualized by resizing graph nodes (neurons) according to their voltage. In addition, we support structural changes in the connectome, such as ablation of neurons and connections.
Fig. 4

A Visualization of C. elegans dynome, B communication diagram between the dynome and the layout, C snapshots of visualization of C. elegans during the PLM/AVB excitations (forward crawling)

Our visualization and communication protocols thereby display the stimulated network in an interactive manner and permit exploration of the different regimes that the stimulations induce. Indeed, with the interactome we are able to recreate various experimental scenarios, such as stimulation of forward crawling (PLM/AVB neurons and/or ablation of AVB), and show that its visualization assists in identifying patterns of neurons in the stimulated network. As connectomes and dynomes of additional neuronal systems are resolved, the interactome will enable exploring their functionality and inferring the underlying neural pathways [5].
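
The core ingredients of the layout, a force-directed embedding of the connectome plus voltage-dependent node sizes, can be sketched in a few lines; this is a minimal NumPy stand-in for the D3 implementation, with an illustrative toy adjacency matrix and update constants:

```python
import numpy as np

def force_layout(adj, iters=200, step=0.01, seed=0):
    """Minimal force-directed layout: all node pairs repel (~1/d^2),
    connected pairs attract like springs. adj is a symmetric 0/1 matrix."""
    rng = np.random.default_rng(seed)
    pos = rng.standard_normal((adj.shape[0], 2))
    for _ in range(iters):
        diff = pos[:, None, :] - pos[None, :, :]       # pairwise offsets
        dist = np.linalg.norm(diff, axis=-1) + 1e-9
        repel = (diff / dist[..., None] ** 3).sum(axis=1)
        attract = -(adj[..., None] * diff).sum(axis=1)
        pos += step * (repel + 0.5 * attract)
    return pos

def node_sizes(voltages, base=10.0, scale=2.0):
    """Node size tracks membrane voltage, as in the interactome overlay."""
    v = np.asarray(voltages, dtype=float)
    return base + scale * (v - v.min())

adj = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
pos = force_layout(adj)
sizes = node_sizes([-70.0, -50.0, -60.0])
```

In the interactome proper, the layout and the dynome integrator run concurrently; here the voltage-to-size mapping simply stands in for that overlay.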

  1. Kopell NJ, Gritton HJ, Whittingon MA, Kramer MA. Beyond the connectome: the dynome. Neuron. 2014;83(6):1319–28.

  2. Varshney LR, Chen BL, Paniagua E, Hall DH, Chklovskii DB. Structural properties of the Caenorhabditis elegans neuronal network. PLoS Comput Biol. 2011;7(2):e1001066.

  3. Kunert J, Shlizerman E, Kutz JN. Low-dimensional functionality of complex network dynamics: neurosensory integration in the Caenorhabditis elegans connectome. Phys Rev E. 2014;89(5):052805.

  4. Bostock M, Ogievetsky V, Heer J. D3 data-driven documents. IEEE Trans Vis Comput Graph. 2011;17(12):2301–9.

  5. Kim J, Leahy W, Shlizerman E. C. elegans interactome: interactive visualization of Caenorhabditis elegans worm neuronal network. 2016 (in submission).


O10 Is the model any good? Objective criteria for computational neuroscience model selection

Justas Birgiolas1, Richard C. Gerkin1, Sharon M. Crook1,2

1School of Life Science, Arizona State University, Tempe, AZ 85287, USA; 2School of Mathematical and Statistical Sciences, Arizona State University, Tempe, AZ, 85287, USA

Correspondence: Justas Birgiolas - justas@asu.edu

BMC Neuroscience 2016, 17(Suppl 1):O10

Objectively evaluating and selecting computational models of biological neurons is an ongoing challenge in the field. Models vary in morphological detail, channel mechanisms, and synaptic transmission implementations. We present the results of an automated method for evaluating computational models against property values obtained from published cell electrophysiology studies. Seven published deterministic models of olfactory bulb mitral cells were selected from ModelDB [1] and simulated using NEURON’s Python interface [2]. Passive and spike properties in response to step current stimulation pulses were computed using the NeuronUnit package [3] and compared to the corresponding experimentally obtained means of olfactory bulb mitral cell properties found in the NeuroElectro database [4].

Results reveal that, across all models, the resting potential and input resistance property means deviated the most from their experimentally measured means (R_input: t test, p = 0.02; V_rest: Wilcoxon test, p = 0.01). The time constant, spike half-width, spike amplitude, and spike threshold properties, in order of decreasing average deviation, matched the experimental data well (p > 0.05) (Fig. 5 top).
Fig. 5

The average deviations of models and cell electrophysiology properties as measured in multiples of the 95 % CI bounds of experimental data means. Dashed line represents 1 CI bound threshold. Top rows show average deviations across all models for each cell property. Bottom rows show deviations across all cell properties for each model

In three models, the property deviations were, on average, outside the 95 % CI of the experimental means (Fig. 5 bottom), but these deviations were not statistically significant (t test, p > 0.05). All other models were within the 95 % CI, and the model of Chen et al. [5] had the lowest deviation.

Overall, the majority of these olfactory bulb mitral cell models display some properties that are not significantly different from their experimental means. However, the resting potential and input resistance properties significantly differ from the experimental values. We demonstrate that NeuronUnit provides an objective method for evaluating the fitness of computational neuroscience cell models against publicly available data.
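
The deviation measure of Fig. 5 can be sketched as follows, assuming a normal-approximation 95 % CI half-width of 1.96 × SEM; the numbers are hypothetical, not taken from NeuroElectro:

```python
def deviation_in_ci_units(model_value, exp_mean, exp_sem):
    """Absolute deviation of a model property from the experimental mean,
    in multiples of the 95 % CI half-width (normal approximation)."""
    ci_half = 1.96 * exp_sem
    return abs(model_value - exp_mean) / ci_half

# Hypothetical numbers: a model's resting potential vs pooled data (mV)
d = deviation_in_ci_units(model_value=-58.0, exp_mean=-62.0, exp_sem=1.5)
within_ci = d <= 1.0    # the dashed 1-CI threshold of Fig. 5
```

A value above 1 marks a property lying outside the experimental 95 % CI, which is how the dashed threshold line in Fig. 5 is read.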

Acknowledgements: The work of JB, RG, and SMC was supported in part by R01MH1006674 from the National Institutes of Health.

  1. Hines ML, Morse T, Migliore M, Carnevale NT, Shepherd GM. ModelDB: a database to support computational neuroscience. J Comput Neurosci. 2004;17(1):7–11.

  2. Hines M, Davison AP, Muller E. NEURON and Python. Front Neuroinform. 2009;3:1.

  3. Omar C, Aldrich J, Gerkin RC. Collaborative infrastructure for test-driven scientific model validation. In: Companion proceedings of the 36th international conference on software engineering. ACM; 2014. p. 524–7.

  4. Tripathy SJ, Savitskaya J, Burton SD, Urban NN, Gerkin RC. NeuroElectro: a window to the world’s neuron electrophysiology data. Front Neuroinform. 2014;8.

  5. Chen WR, Shen GY, Shepherd GM, Hines ML, Midtgaard J. Multiple modes of action potential initiation and propagation in mitral cell primary dendrite. J Neurophysiol. 2002;88(5):2755–64.


O11 Cooperation and competition of gamma oscillation mechanisms

Atthaphon Viriyopase1,2,3, Raoul-Martin Memmesheimer1,3,4, and Stan Gielen1,2

1Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen (Medical Centre), The Netherlands; 2Department for Biophysics, Faculty of Science, Radboud University Nijmegen, The Netherlands; 3Department for Neuroinformatics, Faculty of Science, Radboud University Nijmegen, The Netherlands; 4Center for Theoretical Neuroscience, Columbia University, New York, NY, USA

Correspondence: Atthaphon Viriyopase - a.viriyopase@science.ru.nl

BMC Neuroscience 2016, 17(Suppl 1):O11

Two major mechanisms that underlie gamma oscillations are InterNeuronal Gamma (“ING”), which is related to tonic excitation of reciprocally coupled inhibitory interneurons (I-cells), and Pyramidal-InterNeuron Gamma (“PING”), which is mediated by coupled populations of excitatory pyramidal cells (E-cells) and I-cells. ING and PING are thought to serve different biological functions. Using computer simulations and analytical methods, we [1] therefore investigate which mechanism (ING or PING) will dominate the dynamics of a network when ING and PING interact, and how the dominant mechanism may switch.

We find that ING and PING oscillations compete: the mechanism generating the higher oscillation frequency “wins”. It determines the frequency of the network oscillations and suppresses the other mechanism. The network oscillation frequency (green lines, for the full topology in Fig. 6C) is plotted in Fig. 6D for networks with type-I-phase-response-curve interneurons and in Fig. 6E for networks with type-II-phase-response-curve interneurons. We explain our simulation results by a theoretical model that allows a full theoretical analysis.
Fig. 6

Oscillations in full and reduced networks of reciprocally coupled pyramidal cells and interneurons. A, B Illustrate topologies of reduced networks that generate “pure” ING and “pure” PING, respectively, while C highlights the topology of a “full” network that could in principle generate either ING or PING oscillations or mixtures of both. D, E Frequency of the pure ING-rhythm generated by the reduced network in A (blue line), the pure PING-rhythm generated by the reduced network in B (red line), and rhythms generated by the full network in C (green line) as a function of the mean current to I-cells I0,I and of the mean current to E-cells I0,E, respectively. D Results for networks with type-I interneurons, while E shows results for networks with type-II interneurons. Pyramidal cells are modeled as type-I Hodgkin–Huxley neurons

Our study suggests experimental approaches to decide whether oscillatory activity in networks of interacting excitatory and inhibitory neurons is dominated by ING or PING oscillations and whether the participating interneurons belong to class I or II. Consider as an example networks with type-I interneurons where the external drive to the E-cells, I0,E, is kept constant while the external drive to the I-cells, I0,I, is varied. For both ING- and PING-dominated oscillations the frequency of the rhythm increases when I0,I increases (cf. Fig. 6D). Observing such an increase therefore does not identify the underlying mechanism. However, the absolute value of the first derivative of the frequency with respect to I0,I allows a distinction, as it is much smaller for PING than for ING (cf. Fig. 6D). In networks with type-II interneurons, the non-monotonic dependence near the ING–PING transition may serve as a characteristic hallmark for detecting the oscillation character (and the interneuron type): a decrease (increase) of the frequency when increasing I0,E indicates ING (PING), cf. Fig. 6E. These theoretical predictions are in line with experimental evidence [2].
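The slope-based criterion for type-I interneuron networks can be sketched numerically. The following is an illustrative toy, not the authors' analysis: the frequency curves and the slope threshold are hypothetical stand-ins for measured frequency-versus-I0,I data.

```python
import numpy as np

def classify_mechanism(i_drive, freq, slope_threshold=5.0):
    """Label a rhythm ING or PING from the mean |df/dI0,I| slope.

    Per the criterion above (type-I interneurons): ING frequency rises
    steeply with the drive to the I-cells, PING frequency only weakly.
    The threshold value is an arbitrary placeholder.
    """
    slope = np.mean(np.abs(np.gradient(freq, i_drive)))
    return "ING" if slope > slope_threshold else "PING"

# Hypothetical frequency-vs-drive curves (Hz), standing in for Fig. 6D data
i_drive = np.linspace(1.0, 2.0, 21)
f_steep = 30.0 + 20.0 * (i_drive - 1.0)  # ~20 Hz per unit drive
f_flat = 40.0 + 2.0 * (i_drive - 1.0)    # ~2 Hz per unit drive

print(classify_mechanism(i_drive, f_steep))  # ING
print(classify_mechanism(i_drive, f_flat))   # PING
```

The decision rests only on the magnitude of the slope, mirroring the text's observation that the sign of the frequency change is uninformative for type-I networks.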

  1.

    Viriyopase A, Memmesheimer RM, Gielen S. Cooperation and competition of gamma oscillation mechanisms. J Neurophysiol. 2016.

  2.

    Craig MT, McBain CJ. Fast gamma oscillations are generated intrinsically in CA1 without the involvement of fast-spiking basket cells. J Neurosci. 2015;35(8):3616–24.


O12 A discrete structure of the brain waves

Yuri Dabaghian1,2, Justin DeVito1, Luca Perotti3

1Department of Neurology Pediatrics, Baylor College of Medicine, Houston, TX 77030, USA; 2Department of Computational and Applied Mathematics, Rice University, Houston, TX, 77005, USA; 3Physics Department, Texas Southern University, 3100 Cleburne St, Houston, TX 77004, USA

Correspondence: Yuri Dabaghian - dabaghian@rice.edu

BMC Neuroscience 2016, 17(Suppl 1):O12

A physiological interpretation of biological rhythms, e.g., of the local field potentials (LFP), depends on the mathematical and computational approaches used for their analysis. Most existing mathematical methods of LFP analysis are based on breaking the signal into a combination of simpler components, e.g., into the sinusoidal harmonics of Fourier analysis or into wavelets of wavelet analysis. A common feature of all these methods, however, is that their prime components are presumed from the outset, and the goal of the subsequent analysis reduces to identifying the combination that best reproduces the original signal.

We propose a fundamentally new method, based on a number of deep theorems of complex function theory, in which the prime components of the signal are not presumed a priori, but discovered empirically [1]. Moreover, the new method is more flexible and more sensitive to the signal’s structure than the standard Fourier method.

Applying this method reveals a fundamentally new structure in the hippocampal LFP signals of rats and mice. In particular, our results suggest that the LFP oscillations consist of a superposition of a small, discrete set of frequency-modulated oscillatory processes, which we call “oscillons”. Since these structures are discovered empirically, we hypothesize that they may capture the signal’s actual physical structure, i.e., the pattern of synchronous activity in neuronal ensembles. Proving this hypothesis would greatly advance a principled, theoretical understanding of neuronal synchronization mechanisms. We anticipate that it will reveal new information about the structure of the LFP and other biological oscillations, which should provide insights into the underlying physiological phenomena and the organization of brain states that are currently poorly understood, e.g., sleep and epilepsy.
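The contrast with Fourier analysis can be illustrated on a toy signal: a frequency-modulated oscillation (an "oscillon"-like component) has a single, slowly varying frequency, yet the Fourier spectrum spreads its power over many sideband bins. All parameters below are illustrative choices, not values from the hippocampal data.

```python
import numpy as np

fs, T = 256.0, 16.0                  # sampling rate (Hz) and duration (s)
t = np.arange(0.0, T, 1.0 / fs)

def n_bins_for_power(x, frac=0.95):
    """Number of Fourier bins needed to capture `frac` of the signal power."""
    power = np.abs(np.fft.rfft(x)) ** 2
    csum = np.cumsum(np.sort(power)[::-1])
    return int(np.searchsorted(csum, frac * csum[-1]) + 1)

pure = np.cos(2 * np.pi * 8.0 * t)                            # fixed 8 Hz tone
fm = np.cos(2 * np.pi * (8.0 * t + 2.0 * np.sin(np.pi * t)))  # 8 Hz carrier, 0.5 Hz modulation

print(n_bins_for_power(pure), n_bins_for_power(fm))
```

The unmodulated tone occupies a single Fourier bin, while the frequency-modulated one needs many bins, even though both are "one component" in an instantaneous-frequency description.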

Acknowledgements: The work was supported by the NSF 1422438 grant and by the Houston Bioinformatics Endowment Fund.

  1.

    Perotti L, DeVito J, Bessis D, Dabaghian Y, Brandt VL, Frank LM. Discrete spectra of brain rhythms (in submission).


O13 Direction-specific silencing of the Drosophila gaze stabilization system

Anmo J. Kim1,†, Lisa M. Fenk1,†, Cheng Lyu1, Gaby Maimon1

1Laboratory of Integrative Brain Function, The Rockefeller University, New York, NY 10065, USA

Correspondence: Anmo J. Kim - anmo.kim@gmail.com

†These authors contributed equally

BMC Neuroscience 2016, 17(Suppl 1):O13

Many animals, including insects and humans, stabilize the visual image projected onto their retina by following a rotating landscape with their head or eyes. This stabilization reflex, also called the optomotor response, can pose a problem, however, when the animal intends to change its gaze. To resolve this paradox, von Holst and Mittelstaedt proposed that a copy of the motor command, or efference copy, could be routed into the visual system to transiently silence this stabilization reflex when an animal changes its gaze [1]. Consistent with this idea, we recently demonstrated that a single identified neuron associated with the optomotor response receives silencing motor-related inputs during rapid flight turns, or saccades, in tethered, flying Drosophila [2].

Here, we expand on these results by comprehensively recording from a group of optomotor-mediating visual neurons in the fly visual system: three horizontal system (HS) and six vertical system (VS) cells. We found that the amplitude of motor-related inputs to each HS and VS cell correlates strongly with the strength of each cell’s visual sensitivity to rotational motion stimuli around the primary turn axis, but not to the other axes (Fig. 7). These results support the idea that flies send rotation-axis-specific efference copies to the visual system during saccades—silencing the stabilization reflex only for a specific axis, but leaving the others intact. This is important because saccades consist of stereotyped banked turns, which involve body rotations around all three primary axes of rotation. If the gaze stabilization system is impaired for only one of these axes, then the fly is expected to attempt to maintain gaze stability, through a combination of head and body movements, for the other two. This prediction is consistent with behavioral measurements of head and body kinematics during saccades in freely flying blow flies [3]. Together, these studies provide an integrative model of how efference copies counteract a specific aspect of visual feedback signals to tightly control the gaze stabilization system.
Fig. 7

The amplitudes of saccade-related potentials (SRPs) in HS and VS cells are strongly correlated with each cell’s visual sensitivity to rightward yaw motion stimuli. A Experimental apparatus. B Maximal-intensity z-projections of the lobula plate to visualize HS- or VS-cell neurites that are marked by a GAL4 enhancer trap line. C, D The amplitude of saccade-related potentials (SRPs) was inversely correlated with visual responses measured under rightward yaw motion stimuli, but not under clockwise roll motion stimuli. Each sample point corresponds to one cell type. Error bars indicate SEM

  1.

    von Holst E, Mittelstaedt H. The principle of reafference. Naturwissenschaften. 1950;37:464–76.

  2.

    Kim AJ, Fitzgerald JK, Maimon G. Cellular evidence for efference copy in Drosophila visuomotor processing. Nat Neurosci. 2015;18:1247–55.

  3.

    Schilstra C, van Hateren JH. Stabilizing gaze in flying blowflies. Nature. 1998;395:654.


O14 What does the fruit fly think about values? A model of olfactory associative learning

Chang Zhao1, Yves Widmer2, Simon Sprecher2, Walter Senn1

1Department of Physiology, University of Bern, Bern, 3012, Switzerland; 2Department of Biology, University of Fribourg, Fribourg, 1700, Switzerland

Correspondence: Chang Zhao - zhao@pyl.unibe.ch

BMC Neuroscience 2016, 17(Suppl 1):O14

Associative learning in the fruit fly olfactory system has been studied from the molecular to the behavioral level [1, 2]. Fruit flies are able to associate conditional stimuli such as an odor with unconditional aversive stimuli such as electric shocks, or with appetitive stimuli such as sugar or water. The mushroom body in the fruit fly brain is considered to be crucial for olfactory learning [1, 2]. Behavioral experiments show that this learning cannot be explained by a simple additive Hebbian (i.e., correlation-based) learning rule; instead, it depends on the timing between the conditional and unconditional stimulus presentations. Yarali and colleagues suggested a dynamic model at the molecular level to explain event timing in associative learning [3]. Here, we present new experiments together with a simple phenomenological learning model which show that associative olfactory learning in the fruit fly is a form of value learning that is incompatible with Hebbian learning.

In our model, the information about the conditional odor stimulus is conveyed from the projection neurons via the Kenyon cells to the mushroom body output neurons; the information about the unconditional shock stimulus reaches the mushroom body output neurons through direct or indirect pathways involving dopaminergic neurons. The mushroom body output neurons encode the internal value (v) of the odor (o) through the synaptic weights (w) that convey the odor information, v = w∙o. The synaptic strength is updated according to the value learning rule Δw = η(s − v)õ, where s represents the (internal) strength of the shock stimulus, õ represents the synaptic odor trace, and η is the learning rate. The value associated with the odor determines the probability of escaping from that odor. This simple model reproduces the behavioral data and shows that olfactory conditioning in the fruit fly is in fact value learning. In contrast to the prediction of Hebbian learning, the escape probability after repeated odor–shock pairings is much lower than the escape probability after a single pairing with a correspondingly stronger shock.
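A minimal sketch of the two update rules makes the behavioral contrast concrete. The learning rate, odor trace, and shock strengths below are arbitrary illustrative values, and the escape probability is assumed (as in the text) to increase monotonically with v.

```python
def value_learning(shocks, eta=0.5, odor_trace=1.0):
    """Value rule from the abstract: dw = eta * (s - v) * o~ ; v saturates at s."""
    v = 0.0
    for s in shocks:
        v += eta * (s - v) * odor_trace
    return v

def hebbian_learning(shocks, eta=0.5, odor_trace=1.0):
    """Additive Hebbian rule for comparison: dw = eta * s * o~ ; v accumulates."""
    v = 0.0
    for s in shocks:
        v += eta * s * odor_trace
    return v

# Ten pairings with a weak shock vs. one pairing with a 10x stronger shock
print(value_learning([1.0] * 10), value_learning([10.0]))      # ~0.999 vs 5.0
print(hebbian_learning([1.0] * 10), hebbian_learning([10.0]))  # 5.0 vs 5.0
```

Under the value rule v saturates near s = 1 for repeated weak pairings but reaches 5 after one strong pairing, so the predicted escape probability is much lower in the repeated case; the additive Hebbian rule cannot distinguish the two protocols at all.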

  1.

    Aso Y, Sitaraman D, Ichinose T, Kaun KR, Vogt K, Belliart-Guérin G, Plaçais PY, Robie AA, Yamagata N, Schnaitmann C, Rowell WJ, Johnston RM, Ngo TB, Chen N, Korff W, Nitabach MN, Heberlein U, Preat T, Branson KM, Tanimoto H, Rubin GM. Mushroom body output neurons encode valence and guide memory-based action selection in Drosophila. ELife. 2014;3:e04580.

  2.

    Heisenberg M. Mushroom body memoir: from maps to models. Nat Rev Neurosci. 2003;4:266–75.

  3.

    Yarali A, Nehrkorn J, Tanimoto H, Herz AVM. Event timing in associative learning: from biochemical reaction dynamics to behavioural observations. PLoS One. 2012;7(3):e32885.


O15 Effects of ionic diffusion on power spectra of local field potentials (LFP)

Geir Halnes1, Tuomo Mäki-Marttunen2, Daniel Keller3, Klas H. Pettersen4,5, Ole A. Andreassen2, Gaute T. Einevoll1,6

1Department of Mathematical Sciences and Technology, Norwegian University of Life Sciences, Ås, Norway; 2NORMENT, Institute of Clinical Medicine, University of Oslo, Oslo, Norway; 3The Blue Brain Project, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland; 4Letten Centre and Glialab, Department of Molecular Medicine, Institute of Basic Medical Sciences, University of Oslo, Oslo, Norway; 5Centre for Molecular Medicine Norway, University of Oslo, Oslo, Norway; 6Department of Physics, University of Oslo, Oslo, Norway

Correspondence: Geir Halnes - geir.halnes@nmbu.no

BMC Neuroscience 2016, 17(Suppl 1):O15

The local field potential (LFP) in the extracellular space (ECS) of the brain is a standard measure of population activity in neural tissue. Computational models that simulate the relationship between the LFP and its underlying neurophysiological processes are commonly used in the interpretation of such measurements. Standard methods, such as volume conductor theory [1], assume that ionic diffusion in the ECS has negligible impact on the LFP. This assumption could be challenged during extended periods of intense neural signalling, during which local ion concentrations in the ECS can change by several millimolars. Such concentration changes are indeed often accompanied by shifts in the ECS potential, which may be partially evoked by diffusive currents [2]. However, it is hitherto unclear whether putative diffusion-generated potential shifts are too slow to be picked up in LFP recordings, which typically use electrode systems with cut-off frequencies at ~0.1 Hz.

To explore possible effects of diffusion on the LFP, we developed a hybrid simulation framework: (1) The NEURON simulator was used to compute the ionic output currents from a small population of cortical layer-5 pyramidal neurons [3]. The neural model was tuned so that simulations over ~100 s of biological time led to shifts in ECS concentrations by a few millimolars, similar to what has been seen in experiments [2]. (2) In parallel, a novel electrodiffusive simulation framework [4] was used to compute the resulting dynamics of the potential and ion concentrations in the ECS, accounting for the effect of electrical migration as well as diffusion. To explore the relative role of diffusion, we compared simulations where ECS diffusion was absent with simulations where ECS diffusion was included.

Our key findings were: (i) ECS diffusion shifted the local potential by up to ~0.2 mV. (ii) The power spectral density (PSD) of the diffusion-evoked potential shifts followed a 1/f² power law. (iii) Diffusion effects dominated the PSD of the ECS potential for frequencies up to ~10 Hz (Fig. 8). We conclude that for large, but physiologically realistic ECS concentration gradients, diffusion could affect the ECS potential well within the frequency range considered in recordings of the LFP.
Fig. 8

Power spectrum of ECS potential in a simulation including ECS diffusion (blue line) and a simulation without ECS diffusion (red line). Units for frequency and power are Hz and mV²/Hz, respectively
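The 1/f² scaling can be checked on synthetic data. The sketch below is not the hybrid NEURON/electrodiffusion simulation; it only illustrates, on an integrated-white-noise ("random walk") signal whose theoretical PSD is 1/f², how such a power-law exponent can be fit. Sampling rate and fitting band are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, fs = 2 ** 16, 1000.0  # number of samples and sampling rate (Hz); arbitrary

# Integrated white noise ("random walk") has a theoretical 1/f^2 PSD,
# loosely mimicking a diffusion-driven potential shift
walk = np.cumsum(rng.standard_normal(n)) / np.sqrt(fs)

x = walk - walk.mean()
psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * n)  # one-sided periodogram
freqs = np.fft.rfftfreq(n, d=1.0 / fs)

# Fit the power-law exponent over an intermediate frequency band
band = (freqs > 1.0) & (freqs < 100.0)
slope = np.polyfit(np.log10(freqs[band]), np.log10(psd[band]), 1)[0]
print(f"fitted PSD exponent: {slope:.2f}")
```

The fitted exponent comes out close to −2, which is what a 1/f² process should yield on a log-log plot like Fig. 8.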

  1.

    Holt G, Koch C. Electrical interactions via the extracellular potential near cell bodies. J Comput Neurosci. 1999;6:169–84.

  2.

    Dietzel I, Heinemann U, Lux H. Relations between slow extracellular potential changes, glial potassium buffering, and electrolyte and cellular volume changes during neuronal hyperactivity in cat. Glia. 1989;2:25–44.

  3.

    Hay E, Hill S, Schürmann F, Markram H, Segev I. Models of neocortical layer 5b pyramidal cells capturing a wide range of dendritic and perisomatic active properties. PLoS Comput Biol. 2011;7(7):e1002107.

  4.

    Halnes G, Østby I, Pettersen KH, Omholt SW, Einevoll GT. Electrodiffusive model for astrocytic and neuronal ion concentration dynamics. PLoS Comput Biol. 2013;9(12):e1003386.


O16 Large-scale cortical models towards understanding relationship between brain structure abnormalities and cognitive deficits

Yasunori Yamada1

1IBM Research - Tokyo, Japan

Correspondence: Yasunori Yamada - ysnr@jp.ibm.com

BMC Neuroscience 2016, 17(Suppl 1):O16

Brain connectivity studies have revealed fundamental properties of normal brain network organization [1]. In parallel, they have reported structural connectivity abnormalities in brain diseases such as Alzheimer’s disease (AD) [1, 2]. However, how these structural abnormalities affect information processing and the cognitive functions involved in brain diseases is still poorly understood. To deepen our understanding of this causal link, I developed two large-scale cortical models with normal and abnormal structural connectivity, derived from diffusion tensor imaging of aging APOE-4 non-carriers and carriers in the USC Multimodal Connectivity Database [2, 3]. Possession of the APOE-4 allele is one of the major risk factors for later development of AD, and it is associated with known abnormalities in structural connectivity, characterized by lower network communication efficiency in terms of local interconnectivity and the balance of integration and interconnectivity [2]. The two cortical models share all other parameters and consist of 2.4 million spiking neurons and 4.8 billion synaptic connections. First, I demonstrate the biological relevance of the models by confirming that they reproduce normal patterns of cortical spontaneous activity in terms of the following distinctive properties observed in vivo [4]: low firing rates of individual neurons that approximate log-normal distributions, irregular spike trains following a Poisson distribution, a network balance between excitation and inhibition, and greater depolarization of the average membrane potentials. Next, to investigate how the difference in structural connectivity affects cortical information processing, I compare cortical response properties to an input during spontaneous activity between the two cortical models. The results show that the cortical model with the abnormal structural connectivity exhibited a reduced degree of cortical response as well as a reduced number of cortical regions responding to the input (Fig. 9), suggesting that the structural connectivity abnormality observed in APOE-4 carriers might reduce cortical information propagation and have negative effects on information integration. Indeed, imaging studies support this suggestion by reporting structural abnormality with lower network communication efficiency in the structural connectivity of both APOE-4 carriers and AD patients [1, 2]. This computational approach, which allows manipulations and detailed analyses that are difficult or impossible in human studies, can help to provide a causal understanding of how cognitive deficits in patients with brain diseases are associated with their underlying structural abnormalities.
Fig. 9

Responses to input to the left V1 in the two cortical models with normal/abnormal structural connectivity. A Average firing rates. B–D Cortical regions and cortical areas that significantly responded to the input
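The first two in vivo benchmarks (log-normal firing rates, Poisson-like irregular spiking) can be illustrated with synthetic data. The distribution parameters below are arbitrary placeholders, not the model's values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Benchmark 1: firing rates across neurons approximating a log-normal
# distribution (median chosen here as 2 Hz purely for illustration)
rates = rng.lognormal(mean=np.log(2.0), sigma=1.0, size=5000)  # Hz

# Benchmark 2: irregular (Poisson) spiking gives exponential inter-spike
# intervals, whose coefficient of variation (CV) is close to 1
isis = rng.exponential(scale=1.0 / 5.0, size=20000)  # ISIs of a 5 Hz Poisson train
cv = isis.std() / isis.mean()

print(f"median rate: {np.median(rates):.2f} Hz, ISI CV: {cv:.2f}")
```

A CV near 1 is the usual quantitative signature of Poisson-like irregularity that such spontaneous-activity validations check against.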

Acknowledgements: This research was partially supported by the Japan Science and Technology Agency (JST) under the Strategic Promotion of Innovative Research and Development Program.

  1.

    Stam CJ. Modern network science of neurological disorders. Nat Rev Neurosci. 2014;15(10):683–695.

  2.

    Brown JA, Terashima KH, Burggren AC, Ercoli LM, Miller KJ, Small GW, Bookheimer SY. Brain network local interconnectivity loss in aging APOE-4 allele carriers. Proc Natl Acad Sci USA. 2011;108(51):20760–5.

  3.

    Brown JA, Rudie JD, Bandrowski A, van Horn JD, Bookheimer SY. The UCLA multimodal connectivity database: a web-based platform for brain connectivity matrix sharing and analysis. Front Neuroinform. 2012;6(28).

  4.

    Ikegaya Y, Sasaki T, Ishikawa D, Honma N, Tao K, Takahashi N, Minamisawa G, Ujita S, Matsuki N. Interpyramid spike transmission stabilizes the sparseness of recurrent network activity. Cereb Cortex. 2013;23(2):293–304.


O17 Spatial coarse-graining the brain: origin of minicolumns

Moira L. Steyn-Ross1, D. Alistair Steyn-Ross1

1School of Engineering, University of Waikato, Hamilton 3240, New Zealand

Correspondence: Moira L. Steyn-Ross - msr@waikato.ac.nz

BMC Neuroscience 2016, 17(Suppl 1):O17

The seminal experiments of Mountcastle [1] over 60 years ago established the existence of cortical minicolumns: vertical column-like arrays of approximately 80–120 neurons aligned perpendicular to the pial surface, penetrating all six cortical layers. Minicolumns have been proposed as the fundamental unit for cortical organisation. Minicolumn formation is thought to rely on gene expression and thalamic activity, but exactly why neurons cluster into columns of diameter 30–50 μm containing approximately 100 neurons is not known.

In this presentation we describe a mechanism for the formation of minicolumns via gap-junction diffusion-mediated coupling in a network of spiking neurons. We use our recently developed method of cortical “reblocking” (spatial coarse-graining) [2] to derive neuronal dynamics equations at different spatial scales. We are able to show that for sufficiently strong gap-junction coupling, there exists a minimum block size over which neural activity is expected to be coherent. This coherence region has cross-sectional area of order (40–60 μm)², consistent with the areal extent of a minicolumn. Our scheme regrids a 2D continuum of spiking neurons using a spatial rescaling theory, established in the 1980s, that systematically eliminates high-wave-number modes [3]. The rescaled neural equations describe the bulk dynamics of a larger block of neurons giving “true” (rather than mean-field) population activity, encapsulating the inherent dynamics of a continuum of spiking neurons stimulated by incoming signals from neighbors, and buffeted by ion-channel and synaptic noise.

Our method relies on a perturbative expansion. In order for this coarse-graining expansion to converge, we require not only a sufficiently strong level of inhibitory gap-junction coupling, but also a sufficiently large blocking ratio B. The latter condition establishes a lower bound for the smallest “cortical block”: the smallest group of neurons that can respond to input as a collective and cooperative unit. We find that this minimum block-size ratio lies between 4 and 6. In order to relate this 2D geometric result to the 3D extent of a 3-mm-thick layered cortex, we project the cortex onto a horizontal surface and count the number of neurons contained within each l × l grid micro-cell. Setting l ≈ 10 μm and assuming an average of one interneuron per grid cell, a blocking ratio at the mid-value B = 5 implies that the side-length of a coherent “macro-cell” will be L = Bl = 50 μm containing ~25 inhibitory plus 100 excitatory neurons (assuming an i to e abundance ratio of 1:4) in cross-sectional area L². Thus the minicolumn volume will contain roughly 125 neurons. We argue that this is the smallest diffusively-coupled population size that can support cooperative dynamics, providing a natural mechanism defining the functional extent of a minicolumn.
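The neuron counts above follow from simple arithmetic, reproduced here as a back-of-envelope check using the quantities stated in the text:

```python
# Minicolumn size arithmetic from the text
l_um = 10                        # micro-cell side length l (micrometers)
B = 5                            # mid-range blocking ratio (text: between 4 and 6)
L_um = B * l_um                  # macro-cell side length L = B*l -> 50 micrometers

inhibitory = B ** 2              # one interneuron per micro-cell -> 25
excitatory = 4 * inhibitory      # i:e abundance ratio of 1:4 -> 100
total = inhibitory + excitatory  # ~125 neurons per minicolumn

print(L_um, inhibitory, excitatory, total)  # 50 25 100 125
```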

We propose that minicolumns might form in the developing brain as follows: Inhibitory neurons migrate horizontally from the ganglionic eminence to form a dense gap-junction coupled substrate that permeates all layers of the cortex [4]. Progenitor excitatory cells ascend vertically from the ventricular zone, migrating through the inhibitory substrate of the cortical plate. Thalamic input provides low-level stimulus to activate spiking activity throughout the network. Inhibitory diffusive coupling allows a “coarse graining” such that neurons within a particular areal extent respond collectively to the same input. The minimum block size prescribed by the coarse graining imposes constraints on minicolumn geometry, leading to the spontaneous emergence of cylindrical columns of coherent activity, each column centered on an ascending chain of excitatory neurons and separated from neighboring chains by an annular surround of inhibition. This smallest aggregate is preferentially activated during early brain development, and activity-based plasticity then leads to the formation of tangible structural columns.

  1.

    Mountcastle VB. Modality and topographic properties of single neurons of cat’s somatic sensory cortex. J Neurophysiol. 1957;20(4):408–34.

  2.

    Steyn-Ross ML, Steyn-Ross DA. From individual spiking neurons to population behavior: Systematic elimination of short-wavelength spatial modes. Phys Rev E. 2016;93(2):022402.

  3.

    Steyn-Ross ML, Gardiner CW. Adiabatic elimination in stochastic systems III. Phys Rev A. 1984;29(5):2834–44.

  4.

    Jones EG. Microcolumns in the cerebral cortex. Proc Natl Acad Sci USA. 2000;97(10):5019–21.


O18 Modeling large-scale cortical networks with laminar structure

Jorge F. Mejias1, John D. Murray2, Henry Kennedy3, and Xiao-Jing Wang1,4

1Center for Neural Science, New York University, New York, NY, 10003, USA; 2Department of Psychiatry, Yale School of Medicine, New Haven, CT, 06511, USA; 3INSERM U846, Stem Cell and Brain Research Institute, Bron Cedex, France; 4NYU-ECNU Institute of Brain and Cognitive Science, NYU Shanghai, Shanghai, China

Correspondence: Jorge F. Mejias - jorge.f.mejias@gmail.com

BMC Neuroscience 2016, 17(Suppl 1):O18

Visual cortical areas in the macaque are organized according to an anatomical hierarchy, which is defined by specific patterns of anatomical projections in the feedforward and feedback directions [1, 2]. Recent macaque studies also suggest that signals ascending through the visual hierarchy are associated with gamma rhythms, and top-down signals with alpha/low-beta rhythms [3–5]. It is not clear, however, how oscillations presumably originating in local populations can give rise to such frequency-specific large-scale interactions in a mechanistic way, or what role anatomical projection patterns might play in this.

To address this question, we build a large-scale cortical network model with laminar structure, grounding our model on a recently obtained anatomical connectivity matrix with weighted directed inter-areal projections and information about their laminar origin. The model involves several spatial scales—local or intra-laminar microcircuit, inter-laminar circuits, inter-areal interactions and large-scale cortical network—and a wide range of temporal scales—from slow alpha oscillations to gamma rhythms. At any given level, the model is constrained anatomically and then tested against electrophysiological observations, which provides useful information on the mechanisms modulating the oscillatory activity at different scales. As we ascend through the local to the inter-laminar and inter-areal levels, the model allows us to explore the sensory-driven enhancement of gamma rhythms, the inter-laminar phase-amplitude coupling, the relationship between alpha waves and local inhibition, and the frequency-specific inter-areal interactions in the feedforward and feedback directions [3, 4], revealing a possible link with the predictive coding framework.

When we embed our modeling framework into the anatomical connectivity matrix of 30 areas (which includes novel areas not present in previous studies [2, 6]), the model gives insight into the mechanisms of large-scale communication across the cortex, accounts for an anatomical and functional segregation of FF and FB interactions, and predicts the emergence of functional hierarchies, which recent studies have found in macaque [4] and human [5]. Interestingly, the functional hierarchies observed experimentally are highly dynamic, with areas moving across the hierarchy depending on the behavioral context [4]. In this regard, our model provides a strong prediction: we propose that these hierarchical jumps are triggered by laminar-specific modulations of input into cortical areas, suggesting a strong link between hierarchy dynamics and context-dependent computations driven by specific inputs.

  1.

    Felleman DJ, Van Essen DC. Distributed hierarchical processing in the primate cerebral cortex. Cereb Cortex. 1991;1(1):1–47.

  2.

    Markov NT, Vezoli J, Chameau P, Falchier A, Quilodran R, Huissoud C, Lamy C, Misery P, Giroud P, Ullman S, et al. Anatomy of hierarchy: feedforward and feedback pathways in macaque visual cortex. J Comp Neurol. 2014;522:225–259.

  3.

    van Kerkoerle T, Self MW, Dagnino B, Gariel-Mathis MA, Poort J, van der Togt C, Roelfsema PR. Alpha and gamma oscillations characterize feedback and feedforward processing in monkey visual cortex. Proc Natl Acad Sci USA. 2014;111:14332–41.

  4.

    Bastos AM, Vezoli J, Bosman CA, Schoffelen JM, Oostenveld R, Dowdall JR, De Weerd P, Kennedy H, Fries P. Visual areas exert feedforward and feedback influences through distinct frequency channels. Neuron. 2015;85:390–401.

  5.

    Michalareas G, Vezoli J, van Pelt S, Schoffelen JM, Kennedy H, Fries P. Alpha–beta and gamma rhythms subserve feedback and feedforward influences among human visual cortical areas. Neuron. 2016;89:384–97.

  6.

    Chaudhuri R, Knoblauch K, Gariel MA, Kennedy H, Wang XJ. A large-scale circuit mechanism for hierarchical dynamical processing in the primate cortex. Neuron. 2015;88:419–31.


O19 Information filtering by partial synchronous spikes in a neural population

Alexandra Kruscha1,2, Jan Grewe3,4, Jan Benda3,4 and Benjamin Lindner1,2

1Bernstein Center for Computational Neuroscience, Berlin, 10115, Germany; 2Institute for Physics, Humboldt-Universität zu Berlin, Berlin, 12489, Germany; 3Institute for Neurobiology, Eberhard Karls Universität Tübingen, Germany; 4Bernstein Center for Computational Neuroscience, Munich, Germany

Correspondence: Alexandra Kruscha - alexandra.kruscha@bccn-berlin.de

BMC Neuroscience 2016, 17(Suppl 1):O19

Synchronous firing of neurons is a prominent feature in many brain areas. Here, we are interested in the information transmission by the synchronous spiking output of a noisy neuronal population that receives a common time-dependent sensory stimulus. Earlier experimental [1] and theoretical [2] work revealed that synchronous spikes preferentially encode fast (high-frequency) components of the stimulus, i.e. synchrony can act as an information filter. In these studies a rather strict measure of synchrony was used: the entire population has to fire within a short time window. Here, we generalize the definition of the synchronous output such that only a certain fraction γ of the population needs to be active simultaneously—a setup that seems to be of more biological relevance. We characterize the information transfer as a function of this fraction and the population size, using the spectral coherence function between the stimulus and the partial synchronous output. We present two different analytical approaches to derive this frequency-resolved measure (one better suited for small population sizes, the other applicable to larger populations). We show that there is a critical synchrony fraction, namely the probability that a single neuron spikes within the predefined time window, which maximizes the information transmission of the synchronous output. At this value, the partial synchronous output acts as a low-pass filter, whereas deviations from the critical fraction lead to a more and more pronounced band-pass filtering effect. We confirm our analytical findings by numerical simulations of leaky integrate-and-fire neurons. These findings are also supported by experimental recordings from P-unit electroreceptors of weakly electric fish, where the filtering effect of the synchronous output occurs in real neurons as well.
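The existence of a critical synchrony fraction can be illustrated with a crude binomial toy model. This is not the authors' coherence analysis: it uses plain correlation (a zero-frequency summary rather than the frequency-resolved coherence), neurons fire independently per time window, and all numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy population: N neurons fire independently in each time window with a
# probability modulated by a common slow stimulus (all values illustrative)
N, T = 500, 20000
s = np.sin(2 * np.pi * np.arange(T) / 200.0)  # common stimulus
p = 0.3 + 0.15 * s                            # single-neuron spike probability
counts = rng.binomial(N, p)                   # active neurons per window

def synchrony_corr(gamma):
    """Correlation of the gamma-fraction synchronous output with the stimulus."""
    y = (counts >= gamma * N).astype(float)
    return float(np.corrcoef(y, s)[0, 1])

# Transmission (here: plain correlation) peaks near the critical fraction,
# i.e. the mean single-neuron spiking probability per window (0.3)
for g in (0.2, 0.3, 0.4):
    print(g, round(synchrony_corr(g), 2))
```

Setting γ at the mean single-neuron spiking probability makes the synchrony indicator flip with the stimulus around its mean, whereas higher or lower fractions respond only to stimulus extremes, echoing the low-pass versus band-pass distinction in the abstract.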

Acknowledgement: This work was supported by Bundesministerium für Bildung und Forschung Grant 01GQ1001A and DFG Grant 609788-L1 1046/2-1.

  1. Middleton JW, Longtin A, Benda J, Maler L. Postsynaptic receptive field size and spike threshold determine encoding of high-frequency information via sensitivity to synchronous presynaptic activity. J Neurophysiol. 2009;101:1160–70.

  2. Sharafi N, Benda J, Lindner B. Information filtering by synchronous spikes in a neural population. J Comput Neurosci. 2013;34:285–301.


O20 Decoding context-dependent olfactory valence in Drosophila

Laurent Badel1, Kazumi Ohta1, Yoshiko Tsuchimoto1, Hokto Kazama1

1RIKEN Brain Science Institute, 2-1 Hirosawa, Wako, 351-0198, Japan

Correspondence: Laurent Badel - laurent@brain.riken.jp

BMC Neuroscience 2016, 17(Suppl 1):O20

Many animals rely on olfactory cues to make perceptual decisions and navigate the environment. In the brain, odorant molecules are sensed by olfactory receptor neurons (ORNs), which convey olfactory information to the central brain in the form of sequences of action potentials. In many organisms, axons of ORNs expressing the same olfactory receptor converge to one or a few glomeruli in the first central region (the antennal lobe in insects and the olfactory bulb in fish and mammals) where they make contact with their postsynaptic targets. Therefore, each glomerulus can be considered as a processing unit that relays information from a specific type of receptor. Because different odorants recruit different sets of glomeruli, and most glomeruli respond to a wide array of odors, olfactory information at this stage of processing is contained in spatiotemporal patterns of glomerular activity. How these patterns are decoded by the brain to guide odor-evoked behavior, however, remains largely unknown.

In Drosophila, attraction and aversion to specific odors have been linked to the activation of one or a few glomeruli (reviewed in [1]) in the antennal lobe (AL). These observations suggest a “labeled-line” coding strategy, in which individual glomeruli convey signals of specific ethological relevance, and their activation triggers the execution of hard-wired behavioral programs. However, because these studies used few odorants, and a small fraction of glomeruli were tested, it is unclear how the results generalize to broader odor sets, and whether similar conclusions hold for each of the ~50 glomeruli of the fly AL. Moreover, how compound signals from multiple glomeruli are integrated is poorly understood.

Here, we combine optical imaging, behavioral and statistical techniques to address these questions systematically. Using two-photon imaging, we monitor Ca2+ activity in the AL in response to 84 odors. We next screen behavioral responses to the same odorants. Comparing these data allows us to formulate a decoding model describing quantitatively how olfactory behavior is determined by glomerular activity patterns. We find that a weighted sum of normalized glomerular responses recapitulates the observed behavior and predicts responses to novel odors, suggesting that odor valence is not determined solely by the activity of a few privileged glomeruli. This conclusion is supported by genetic silencing and optogenetic activation of individual ORN types, which are found to evoke modest biases in behavior, in agreement with model predictions. Finally, we test the model prediction that the relative valence of a pair of odors depends on the identity of other odors presented in the same experiment. We find that the relative valence indeed changes, and may even switch, suggesting that perceptual decisions can be modulated by the olfactory context. Surprisingly, our model correctly captures both the direction and the magnitude of the observed changes. These results indicate that the valence of olfactory stimuli is decoded from AL activity by pooling contributions over a large number of glomeruli, and highlight the ability of the olfactory system to adapt to the statistics of its environment, similarly to the visual and auditory systems.
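A toy version of such a decoding model can be sketched as follows (synthetic data; the glomerulus count, normalization, and ridge penalty are illustrative assumptions, not the study's actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: responses of n_glom glomeruli to n_odor odors and
# a scalar behavioral valence per odor. The decoding model described in
# the abstract is a weighted sum of normalized glomerular responses;
# here the weights are fit by ridge regression on synthetic data.
n_odor, n_glom = 84, 50
R = rng.gamma(2.0, 1.0, size=(n_odor, n_glom))     # Ca2+ response matrix
R_norm = R / R.sum(axis=1, keepdims=True)          # divisive normalization
w_true = rng.standard_normal(n_glom)               # ground-truth weights
valence = R_norm @ w_true + 0.05 * rng.standard_normal(n_odor)

lam = 1e-3                                         # ridge penalty
A = R_norm.T @ R_norm + lam * np.eye(n_glom)
w_hat = np.linalg.solve(A, R_norm.T @ valence)     # closed-form ridge fit

pred = R_norm @ w_hat
r = np.corrcoef(pred, valence)[0, 1]               # fit quality
print(round(r, 3))
```

Because valence is pooled over all glomeruli, silencing any single input channel (zeroing one column of `R_norm`) shifts the prediction only modestly, mirroring the modest behavioral biases reported above.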

  1. Li Q, Liberles SD. Aversion and attraction through olfaction. Curr Biol. 2015;25(3):R120–9.


P1 Neural network as a scale-free network: the role of a hub

B. Kahng1

1Department of Physics and Astronomy, Seoul National University, 08826, Korea

Correspondence: B. Kahng - bkahng@snu.ac.kr

BMC Neuroscience 2016, 17(Suppl 1):P1

Recently, human neuroscience has drawn increasing attention in the network science community, because recent fMRI and anatomical experiments have revealed that the neural networks of the normal human brain are scale-free. Thus, the knowledge accumulated across a broad range of network science can be naturally applied to neural networks to understand the functions and properties of normal and disordered human brain networks. In particular, the degree exponent of the human neural network constructed from fMRI data turned out to be approximately two. This value has a particularly important meaning in scale-free networks, because at this exponent the number of connections to the neighbors of a hub becomes largest, and thus the functional role of the hub becomes extremely important. In this talk, we present the role of the hub in pattern recognition and dynamical problems in association with neuroscience.
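The special status of the hub near degree exponent two can be illustrated numerically (a sketch on sampled degree sequences; network sizes and exponents are arbitrary illustration values, not data from the talk):

```python
import numpy as np

rng = np.random.default_rng(2)

# Sample degree sequences from P(k) ~ k^(-gamma) and compare the
# fraction of all link ends attached to the largest hub. For gamma
# near 2 the hub captures a far larger share than for gamma = 3.
def sample_powerlaw_degrees(n, gamma, kmin=1):
    k = np.arange(kmin, n + 1)
    p = k.astype(float) ** (-gamma)
    p /= p.sum()
    return rng.choice(k, size=n, p=p)

n = 100000
shares = {}
for gamma in (2.0, 3.0):
    deg = sample_powerlaw_degrees(n, gamma)
    shares[gamma] = deg.max() / deg.sum()   # hub's share of all connections

print(shares[2.0], shares[3.0])
```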

P2 Hemodynamic responses to emotions and decisions using near-infrared spectroscopy optical imaging

Nicoladie D. Tam1

1Department of Biological Sciences, University of North Texas, Denton, TX 76203, USA

Correspondence: Nicoladie D. Tam - nicoladie.tam@unt.edu

BMC Neuroscience 2016, 17(Suppl 1):P2

This study focuses on the relationship between the emotional response, decision-making, and the hemodynamic responses in the prefrontal cortex. It is based on a computational emotional model that hypothesizes that the emotional response is proportional to the discrepancy between expectancy and actuality. Previous studies have shown that emotional responses are related to decisions [1, 2]. Specifically, the intensities of happy [3], sad [4], angry [5], and jealous [6] emotions are proportional to the discrepancy between what one wants and what one gets [1, 3–7].

Methods Human subjects are asked to perform the classical behavioral economic experiment called Ultimatum Game (UG) [8]. This experimental paradigm elicits the interrelationship between decision and emotion in human subjects [3–6]. The hemodynamic responses of the prefrontal cortex were recorded while the subjects performed the UG experiment.

Results The results showed that the hemodynamic response, which reflects neural activation and deactivation through the metabolic activity of the neural tissue, is proportional to the emotional intensity, i.e., to the discrepancy between expectancy and actuality. This validates the hypothesis of the proposed emotional theory [9–11] that the intensity of emotion is proportional to the disparity between the expected and the actual outcomes. These responses are also related to fairness perception [7], with respect to the survival functions [9, 10], similar to the relationships established experimentally for happy emotion [1] and for fairness [12]. This is consistent with the computational relationship between decision and fairness [13].
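The proportionality hypothesis itself is simple enough to state in a few lines (a minimal sketch; the offers and the gain constant are hypothetical illustration values, not data from the study):

```python
# Minimal sketch of the proportionality hypothesis: emotional intensity
# is proportional to the discrepancy between the expected and the
# actual outcome. Values below are hypothetical, for illustration only.
def emotion_intensity(expected, actual, k=1.0):
    """Signed intensity: positive when the outcome exceeds expectation
    (happy direction), negative when it falls short (sad/angry)."""
    return k * (actual - expected)

# Ultimatum Game style example: the responder expects a fair split of 10.
expected_offer = 5.0
for offer in (1.0, 5.0, 9.0):
    print(offer, emotion_intensity(expected_offer, offer))
```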

  1. Tam ND. Quantification of happy emotion: dependence on decisions. Psychol Behav Sci. 2014;3(2):68–74.

  2. Tam ND. Rational decision-making process choosing fairness over monetary gain as decision criteria. Psychol Behav Sci. 2014;3(6–1):16–23.

  3. Tam ND. Quantification of happy emotion: proportionality relationship to gain/loss. Psychol Behav Sci. 2014;3(2):60–7.

  4. Tam ND. Quantitative assessment of sad emotion. Psychol Behav Sci. 2015;4(2):36–43.

  5. Tam DN. Computation in emotional processing: quantitative confirmation of proportionality hypothesis for angry unhappy emotional intensity to perceived loss. Cogn Comput. 2011;3(2):394–415.

  6. Tam ND, Smith KM. Cognitive computation of jealous emotion. Psychol Behav Sci. 2014;3(6–1):1–7.

  7. Tam ND. Quantification of fairness perception by including other-regarding concerns using a relativistic fairness-equity model. Adv Soc Sci Res J. 2014;1(4):159–69.

  8. von Neumann J, Morgenstern O, Rubinstein A. Theory of games and economic behavior. Princeton: Princeton University Press; 1953.

  9. Tam D. EMOTION-I model: a biologically-based theoretical framework for deriving emotional context of sensation in autonomous control systems. Open Cybern Syst J. 2007;1:28–46.

  10. Tam D. EMOTION-II model: a theoretical framework for happy emotion as a self-assessment measure indicating the degree-of-fit (congruency) between the expectancy in subjective and objective realities in autonomous control systems. Open Cybern Syst J. 2007;1:47–60.

  11. Tam ND. EMOTION-III model: a theoretical framework for social empathic emotions in autonomous control systems. Open Cybern Syst J. 2016 (in press).

  12. Tam ND. Quantification of fairness bias in relation to decisions using a relativistic fairness-equity model. Adv Soc Sci Res J. 2014;1(4):169–78.

  13. Tam ND. A decision-making phase-space model for fairness assessment. Psychol Behav Sci. 2014;3(6–1):8–15.


P3 Phase space analysis of hemodynamic responses to intentional movement directions using functional near-infrared spectroscopy (fNIRS) optical imaging technique

Nicoladie D. Tam1, Luca Pollonini2, George Zouridakis3

1Department of Biological Sciences, University of North Texas, Denton, TX 76203, USA; 2College of Technology, the University of Houston, TX, 77204, USA; 3Departments of Engineering Technology, Computer Science, and Electrical and Computer Engineering, University of Houston, Houston, TX, 77204, USA

Correspondence: Nicoladie D. Tam - nicoladie.tam@unt.edu

BMC Neuroscience 2016, 17(Suppl 1):P3

We aim to extract intentional movement directions from hemodynamic signals recorded with a noninvasive optical imaging technique, such that a brain-computer interface (BCI) can be built to control a wheelchair from optical signals recorded from the brain. Real-time detection of neurodynamic signals can be achieved using functional near-infrared spectroscopy (fNIRS), which measures both oxy-hemoglobin (oxy-Hb) and deoxy-hemoglobin (deoxy-Hb) levels in the underlying neural tissue. In addition to the advantage of real-time monitoring of hemodynamic signals over fMRI (functional magnetic resonance imaging), fNIRS can also record brain signals from human subjects in motion without movement artifacts. Previous studies have shown that hemodynamic responses are correlated with movement directions through the temporal profiles of the oxy-Hb and deoxy-Hb levels [1–5]. In this study, we apply a phase space analysis to the hemodynamic response to decode movement directions, instead of the temporal analysis used in the previous studies.

Methods In order to decode the movement directions, human subjects were asked to execute movements in two orthogonal directions, front-back and right-left, while the optical hemodynamic responses were recorded over the motor cortex of the dominant hemisphere. We aim to decode the intentional movement directions without any a priori assumption about how arm movement directions are correlated with the hemodynamic signals. Therefore, we used phase space analysis to determine how the trajectories of oxy-Hb and deoxy-Hb are related to each other during these arm movements.

Results The results show that there are subpopulations of cortical neurons whose activity is related to the intended movement direction. Specifically, in the phase space analysis of the oxy-Hb and deoxy-Hb levels, opposite movement directions are represented by hysteresis loops traversed in opposite directions. Since oxy-Hb represents oxygen delivery and deoxy-Hb represents oxygen extraction by the underlying brain tissue, the phase space analysis provides a means to differentiate movement direction by the ratio between oxygen delivery and oxygen extraction. In other words, the oxygen demands of the subpopulations of neurons in the underlying tissue differ depending on the movement direction. This also corresponds to the opposite patterns of neural activation and deactivation during execution of opposite movement directions. Thus, phase space analysis can be used as an analytical tool to differentiate movement directions based on the direction of the hysteresis loop of the hemodynamic variables.
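The traversal direction of such a hysteresis loop can be read off from the signed area of the closed trajectory (shoelace formula). The sketch below uses synthetic ellipses standing in for (oxy-Hb, deoxy-Hb) trajectories; it illustrates the geometric idea only, not the recorded data:

```python
import numpy as np

def signed_area(x, y):
    """Signed area of a closed trajectory (shoelace formula); the sign
    gives the traversal direction: positive = counter-clockwise."""
    return 0.5 * np.sum(x * np.roll(y, -1) - np.roll(x, -1) * y)

# Synthetic stand-ins for phase-space loops of the hemodynamic signals.
t = np.linspace(0, 2 * np.pi, 500, endpoint=False)
oxy = np.cos(t)
deoxy_ccw = 0.5 * np.sin(t)        # loop traversed counter-clockwise
deoxy_cw = -0.5 * np.sin(t)        # same loop, opposite traversal

print(np.sign(signed_area(oxy, deoxy_ccw)), np.sign(signed_area(oxy, deoxy_cw)))
```

Classifying a trial by the sign of this area is one simple way to turn the "opposite hysteresis" observation into a binary direction decoder.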

  1. Tam ND, Zouridakis G. Optical imaging of motor cortical activation using functional near-infrared spectroscopy. BMC Neurosci. 2012;13(Suppl 1):P27.

  2. Tam ND, Zouridakis G. Optical imaging of motor cortical hemodynamic response to directional arm movements using near-infrared spectroscopy. Int J Biol Eng. 2013;3(2):11–17.

  3. Tam ND, Zouridakis G. Decoding of movement direction using optical imaging of motor cortex. BMC Neurosci. 2013;14(Suppl 1):P380.

  4. Tam ND, Zouridakis G. Temporal decoupling of oxy- and deoxy-hemoglobin hemodynamic responses detected by functional near-infrared spectroscopy (fNIRS). J Biomed Eng Med Imaging. 2014;1(2):18–28.

  5. Tam ND, Zouridakis G. Decoding movement direction from motor cortex recordings using near-infrared spectroscopy. In: Infrared spectroscopy: theory, developments and applications. Hauppauge: Nova Science; 2014.


P4 Modeling jamming avoidance of weakly electric fish

Jaehyun Soh1, DaeEun Kim1

1Biological Cybernetics, School of Electrical and Electronic Engineering, Yonsei University, Shinchon, Seoul, 120-749, South Korea

Correspondence: DaeEun Kim - daeeun@yonsei.ac.kr

BMC Neuroscience 2016, 17(Suppl 1):P4

Weakly electric fish sense their surroundings with an electric field generated by the electric organ in their tail. They detect objects by sensing this field with electroreceptors on the body surface: obstacles in the vicinity of the fish distort the self-generated field, and the fish recognize environmental situations from this distortion. Weakly electric fish produce species-dependent electric organ discharge (EOD) signals whose frequencies span a wide range, from 50–600 Hz to above 800 Hz. The EOD signals can be disturbed by signals of similar frequency emitted by neighboring weakly electric fish, and the fish change their EOD frequency when they detect such interference. This is called the jamming avoidance response (JAR).

Electroreceptors of the fish read other electric fish's EODs while the fish sense their own EOD. Therefore, when two weakly electric fish are close enough and sense similar frequencies, their EOD-based sensing ability is impaired by signal jamming [1, 2]. The fish lowers its EOD frequency when it detects a jamming signal of slightly higher frequency, and raises it otherwise. This response is shown in Fig. 10. The fish shift their EOD frequency almost immediately, without trial and error.
Fig. 10

Jamming avoidance response

How the fish avoid jamming has been studied for a long time, but the corresponding neural mechanisms have not yet been fully revealed. The JAR of Eigenmannia can be analyzed with Lissajous graphs of amplitude modulation versus differential phase modulation. The relative signal intensity at each patch of skin indicates whether the interfering frequency is higher or lower than the fish's own signal frequency [3].

We suggest a jamming avoidance algorithm for EOD signals, especially for wave-type fish. We explore the diagram of amplitude modulation versus phase modulation and analyze the shape of the resulting graph. The phase and amplitude differences contribute to the estimation of the signal jamming situation; from them, the jamming frequency can be detected, which can then guide the jamming avoidance response and provide a measure to predict it. However, what type of neural structure implements this computation in weakly electric fish is an open question that requires further study.
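The core of the classical analysis, deciding whether the jamming frequency lies above or below the fish's own EOD from the rotation direction of the amplitude-modulation vs. phase-modulation loop, can be sketched numerically (synthetic sinusoidal EODs and arbitrary parameters; this is a generic illustration, not the algorithm proposed in the abstract):

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (equivalent to a Hilbert transform)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(X * h)

def jamming_sign(f_own, f_jam, fs=20000.0, dur=1.0, a=0.3):
    """Return +1 if the jamming EOD frequency lies above the fish's own
    frequency, -1 if below, from the traversal direction of the
    amplitude-modulation vs. phase-modulation (Lissajous-style) loop.
    Frequencies are chosen so the window holds whole cycles."""
    t = np.arange(int(fs * dur)) / fs
    s = np.sin(2 * np.pi * f_own * t) + a * np.sin(2 * np.pi * f_jam * t)
    z = analytic_signal(s) * np.exp(-2j * np.pi * f_own * t)  # demodulate at own EOD
    am, pm = np.abs(z), np.angle(z)
    # Signed area of the closed (AM, PM) curve: positive for a counter-
    # clockwise loop (jamming frequency above own), negative below.
    area = 0.5 * np.sum(am * np.roll(pm, -1) - np.roll(am, -1) * pm)
    return int(np.sign(area))

print(jamming_sign(400.0, 404.0), jamming_sign(400.0, 396.0))
```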

Acknowledgements: This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MEST) (No. 2014R1A2A1A11053839).

  1. Heiligenberg W. Electrolocation of objects in the electric fish Eigenmannia (Rhamphichthyidae, Gymnotoidei). J Comp Physiol. 1973;87(2):137–64.

  2. Heiligenberg W. Principles of electrolocation and jamming avoidance in electric fish. Berlin: Springer; 1977.

  3. Heiligenberg W. Neural nets in electric fish. Cambridge: MIT Press; 1991.


P5 Synergy and redundancy of retinal ganglion cells in prediction

Minsu Yoo1, S. E. Palmer1,2

1Committee on Computational Neuroscience, University of Chicago, Chicago, IL, USA; 2Department of Organismal Biology and Anatomy, University of Chicago, Chicago, IL, USA

Correspondence: Minsu Yoo - minsu@uchicago.edu

BMC Neuroscience 2016, 17(Suppl 1):P5

Recent work has shown that retinal ganglion cells (RGCs) of salamanders predict future sensory information [1]. It has also been shown that these RGCs carry significant information about the future state of their own population firing patterns [2]. From the perspective of downstream neurons in the visual system, which do not have independent access to the visual scene, the correlations in the RGC firing itself may be important for predicting the future visual input. In this work, we explore the structure of the generalized correlation in firing patterns in the RGC population, with a particular focus on coding efficiency. From the perspective of efficient neural coding, we might expect neurons to code for their own future state independently (decorrelation across cells) and to have very little predictive information extending forward in time (decorrelation in the time domain).

In this work, we quantify whether neurons in the retina code for their own future input independently, redundantly, or synergistically, and how long these correlations persist in time. We use published extracellular multi-electrode data from the salamander retina in response to repeated presentations of a natural movie [1]. We find significant mutual information in the population firing that is almost entirely independent except at very short time delays, where the code is weakly redundant (Fig. 11). We also find that the information persists to delays of up to a few hundred milliseconds. In addition, we find that individual neurons vary widely in the amount of predictive information they carry about the future population firing state. This heterogeneity may contribute to the diversity of predictive information we find across groups in this experiment.
Fig. 11

Predictive information in the retinal response is coded for independently. Red the mutual information between the binary population firing patterns at times t and t + Δt, for 1000 randomly selected groups of 5 cells from our 31-cell population. Time is binned in 16.67 ms bins, and the (rare) occurrence of two spikes in a bin is recorded as a ‘1’. Blue the sum of the mutual information between a single cell response at time t and the future response of the group at time t + Δt. Error bars indicate the standard error of the mean across groups. All information quantities are corrected for finite-size effects using quadratic extrapolation [3]
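A plug-in estimate of the mutual information between present and future population words can be sketched as follows (synthetic binary data standing in for the recordings; the quadratic-extrapolation bias correction of [3] is omitted and a shuffle control is shown instead):

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(3)

def mutual_information(words_x, words_y):
    """Plug-in mutual information (bits) between two sequences of
    hashable symbols (here: binary population words as tuples)."""
    n = len(words_x)
    px, py = Counter(words_x), Counter(words_y)
    pxy = Counter(zip(words_x, words_y))
    return sum((c / n) * np.log2(c * n / (px[wx] * py[wy]))
               for (wx, wy), c in pxy.items())

# Synthetic stand-in for the retinal data: 5 cells, one binary word per
# time bin, with the future word partially copying the present one.
# Rates and the copy probability are arbitrary illustration values.
T, ncells = 20000, 5
now = rng.random((T, ncells)) < 0.2
copy = rng.random((T, ncells)) < 0.7
future = np.where(copy, now, rng.random((T, ncells)) < 0.2)

wx = [tuple(r) for r in now]
wy = [tuple(r) for r in future]
shuffled = [tuple(r) for r in now[rng.permutation(T)]]

mi_real = mutual_information(wx, wy)        # predictive information
mi_shuf = mutual_information(wx, shuffled)  # chance level (plug-in bias)
print(mi_real, mi_shuf)
```

The shuffle control makes the finite-sampling bias of the plug-in estimator visible; on real data one would correct it, e.g. by the quadratic extrapolation cited in the caption.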

The results in this study may provide useful information for building a model of the RGC population that can explain why redundant coding is only observed at short delays, or what makes one RGC more predictive than another. Building this type of model will illustrate how the retina represents the future.

  1. Palmer SE, Marre O, Berry MJ, Bialek W. Predictive information in a sensory population. Proc Natl Acad Sci. 2015;112:6908–13.

  2. Salisbury J, Palmer SE. Optimal prediction and natural scene statistics in the retina. arXiv:1507.00125 [q-bio]. 2015. Available from: http://arxiv.org/abs/1507.00125.

  3. Panzeri S, Senatore R, Montemurro MA, Petersen RS. Correcting for the sampling bias problem in spike train information measures. J Neurophysiol. 2007;98:1064–72.


P6 A neural field model with a third dimension representing cortical depth

Viviana Culmone1, Ingo Bojak1

1School of Psychology, University of Reading, Reading, Berkshire, RG1 6AY, UK

Correspondence: Viviana Culmone - v.culmone@pgr.reading.ac.uk

BMC Neuroscience 2016, 17(Suppl 1):P6

Neural field models (NFMs) characterize the average properties of neural ensembles as a continuous excitable medium. So far, NFMs have largely ignored the extent of the dendritic tree and its influence on the neural dynamics [1]. As shown in Fig. 12A, we implement a 3D-NFM that includes the dendritic extent through the cortical layers, starting from a well-known 2D-NFM [2]. We transform the equation for the average membrane potential h_e of the point-like soma in the 2D-NFM [2] into a full cable equation form (added parts in bold):
Fig. 12

A The 3D-NFM adds a dendritic dimension to the 2D one [1]. One single macrocolumn has inhibitory (I) and excitatory (E) subpopulations. B (Top) Discretization of the dendrite. (Bottom) Equilibrium membrane potential along the dendrite for two different synaptic inputs. C PSDs of h_e for the 2D- and 3D-NFM. Increasing the synaptic input recovers the lost alpha rhythm

$$ \tau_{e} \frac{\partial h_{e}(x,z,t)}{\partial t} = -\left[ h_{e}(x,z,t) - h_{e}^{r} \right] + \boldsymbol{\lambda^{2}\, \frac{\partial^{2} h_{e}(x,z,t)}{\partial z^{2}}} + \boldsymbol{f_{syn}} \sum_{k} \psi_{ke}(h_{e})\, I_{ke}(x,z,t) $$

The 3D-NFM models the dendritic tree as a single linear cable. Figure 12B shows the resulting resting potential along the extended dendrite for synaptic input at two different locations. Naively keeping the parameters of the 2D-NFM for the 3D-NFM results in a power spectral density (PSD) without an alpha rhythm resonance, see Fig. 12C. However, increasing the synaptic input by a factor f_syn can compensate for the dispersion along the dendrite and recovers the peak in the alpha band. We study the influence of varying the distribution of synaptic inputs along the dendritic (vertical) dimension and of changing the (horizontal) area of the simulated cortical patch. We also provide an outlook on how to compare our results with local field potential recordings from real cortical tissue. We expect that 3D-NFMs will be widely used in the future for describing such experimental data, and that the methods used to extend the specific 2D-NFM used here [2] will generalize to other 2D-NFMs.
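The behavior of the added cable term can be sketched with explicit finite differences (illustrative parameters, sealed-end boundaries, and a single synaptic input site; these are not the model's actual values):

```python
import numpy as np

# Explicit finite differences for the dendritic cable dynamics
#   tau * dh/dt = -(h - h_r) + lambda^2 * d2h/dz2 + I(z)
# on a single linear cable with sealed (zero-flux) ends.
tau, lam, h_r = 0.01, 0.2, -70.0        # s, mm, mV (illustrative)
nz, L = 100, 1.0                        # compartments, cable length [mm]
dz = L / nz
dt = 0.25 * tau * dz**2 / lam**2        # safely below the stability limit

I = np.zeros(nz)
I[20] = 50.0                            # synaptic drive at one site [mV]
h = np.full(nz, h_r)
for _ in range(int(0.1 / dt)):          # run for ~10 membrane time constants
    h_zz = np.empty(nz)
    h_zz[1:-1] = (h[2:] - 2 * h[1:-1] + h[:-2]) / dz**2
    h_zz[0] = (h[1] - h[0]) / dz**2     # sealed ends: zero-flux boundaries
    h_zz[-1] = (h[-2] - h[-1]) / dz**2
    h += (dt / tau) * (-(h - h_r) + lam**2 * h_zz + I)

# Steady state: depolarization peaks at the input site and decays along
# the dendrite with length constant lambda (cf. Fig. 12B).
print(np.argmax(h), h.max() - h_r)
```

The dispersion visible in this steady-state profile is the effect that the increased synaptic gain f_syn compensates for in the full model.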

  1. Spruston N. Pyramidal neurons: dendritic structure and synaptic integration. Nat Rev Neurosci. 2008;9:206–21.

  2. Bojak I, Liley DTJ. Modeling the effects of anesthesia on the electroencephalogram. Phys Rev E. 2005;71:041902.


P7 Network analysis of a probabilistic connectivity model of the Xenopus tadpole spinal cord

Andrea Ferrario1, Robert Merrison-Hort1, Roman Borisyuk1

1School of Computing and Mathematics, Plymouth University, Plymouth, PL4 8AA, United Kingdom

Correspondence: Andrea Ferrario - andrea.ferrario@plymouth.ac.uk

BMC Neuroscience 2016, 17(Suppl 1):P7

Our previous results [1, 2] describe a computational anatomical model of the Xenopus tadpole spinal cord which includes about 1400 neurons of seven types allocated on two sides of the body. This model is based on a developmental approach, where axon growth is simulated and synapses are created (with some probability) when axons cross dendrites. A physiological model of spiking neurons with the generated connectivity of about 85,000 synapses produces a very reliable swimming pattern of anti-phase oscillations in response to simulated sensory input [2].

Using the developmental model we generate 100 different sets of synaptic connections (“connectomes”), and use this information to create a generalized probabilistic model. The probabilistic model provides a new way to easily generate tadpole connectomes and, remarkably, these connectomes produce similar simulated physiological behavior to those generated using the more complex developmental approach (e.g. they swim when stimulated). Studying these generated connectivity graphs allows us to analyze the structure of connectivity in a typical tadpole spinal cord.

Many complex neuronal networks have been found to have "small world" properties, including those in the nematode worm C. elegans [3, 6], cat and macaque cortex, and the human brain [4]. Small world networks lie between regular and random networks, and are characterized by a high clustering coefficient C and a relatively small average path length L when compared with Erdős-Rényi and degree-matched graphs of similar size. We used graph theory tools to compute the strongly connected component of each network, which was then used to measure C and L. For the degree-matched networks, these computations were based on the probability generating function [5]. By comparing these measures with those of degree-matched random graphs, we found that the tadpole network can be considered a small world graph. This is also true for the sub-graph consisting only of neurons on one side of the body, which displays properties very similar to those of the C. elegans network. Another important subgraph, comprising only the two main neuron types in the central pattern generator (CPG) network, also shows small world properties, but is less similar to the C. elegans network.
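The C and L comparison can be illustrated on a generic Watts-Strogatz-style graph (a stand-in for the generated connectomes; all sizes and rewiring probabilities here are arbitrary):

```python
import numpy as np
from collections import deque
from itertools import combinations

rng = np.random.default_rng(4)

def clustering(adj):
    """Average clustering coefficient of an undirected graph (dict of sets)."""
    cs = []
    for v, nb in adj.items():
        k = len(nb)
        if k < 2:
            cs.append(0.0)
            continue
        links = sum(1 for a, b in combinations(nb, 2) if b in adj[a])
        cs.append(2 * links / (k * (k - 1)))
    return float(np.mean(cs))

def avg_path_length(adj):
    """Mean shortest-path length over reachable pairs (BFS from each node)."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        total += sum(d for v, d in dist.items() if v != s)
        pairs += len(dist) - 1
    return total / pairs

def ring_lattice(n, k):
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k // 2 + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    return adj

def rewire(adj, p):
    """Watts-Strogatz style rewiring of an undirected graph."""
    n = len(adj)
    for (u, v) in [(u, v) for u in adj for v in adj[u] if u < v]:
        if rng.random() < p:
            w = int(rng.integers(n))
            if w != u and w not in adj[u]:
                adj[u].discard(v); adj[v].discard(u)
                adj[u].add(w); adj[w].add(u)
    return adj

n, k = 200, 8
regular = ring_lattice(n, k)
small_world = rewire(ring_lattice(n, k), 0.1)
random_g = rewire(ring_lattice(n, k), 1.0)

C_sw, L_sw = clustering(small_world), avg_path_length(small_world)
C_rand, L_rand = clustering(random_g), avg_path_length(random_g)
L_reg = avg_path_length(regular)
print(C_sw, C_rand, L_sw, L_rand, L_reg)
```

The small-world signature is C_sw comparable to the regular lattice while L_sw drops toward the random-graph value; the same comparison, on the strongly connected component and against degree-matched graphs, underlies the tadpole result above.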

Our approach allows us to study the general properties of the architecture of the tadpole spinal cord, even though in reality the actual network varies from individual to individual (unlike in C. elegans). This allows us to develop ideas about the organizing principles of the network, as well as to make predictions about the network’s functionality that can be tested first in computer simulations and later in real animal experiments. In this work we combine several graph theory techniques in a novel way to analyze the structure of a complex neuronal network where not all biological details are known. We believe that this approach can be applied widely to analyze other animals’ nervous systems.

  1. Borisyuk R, al Azad AK, Conte D, Roberts A, Soffe SR. A developmental approach to predicting neuronal connectivity from small biological datasets: a gradient-based neuron growth model. PLoS One. 2014;9(2):e89461.

  2. Roberts A, Conte D, Hull M, Merrison-Hort R, al Azad AK, Buhl E, Borisyuk R, Soffe SR. Can simple rules control development of a pioneer vertebrate neuronal network generating behavior? J Neurosci. 2014;34(2):608–21.

  3. Watts DJ, Strogatz SH. Collective dynamics of 'small-world' networks. Nature. 1998;393:440–2.

  4. Kaiser M. A tutorial in connectome analysis: topological and spatial features of brain networks. NeuroImage. 2011;57:892–907.

  5. Newman MEJ, Strogatz SH, Watts DJ. Random graphs with arbitrary degree distributions and their applications. Phys Rev E. 2001;64:026118.

  6. Varshney LR, Chen BL, Paniagua E, Hall DH, Chklovskii DB. Structural properties of the Caenorhabditis elegans neuronal network. PLoS Comput Biol. 2011;7(2):e1001066.


P8 The recognition dynamics in the brain

Chang Sub Kim1

1Department of Physics, Chonnam National University, Gwangju, 61186, Republic of Korea

Correspondence: Chang Sub Kim - cskim@jnu.ac.kr

BMC Neuroscience 2016, 17(Suppl 1):P8

Over the years, an extensive research endeavor has been devoted to understanding the brain's cognitive functions under a unified principle and to providing a formulation of the corresponding computational scheme of the brain [1]. The free-energy principle (FEP) claims that the brain's operations of perception, learning, and action rest on an internal mechanism of trying to avoid aberrant events encountered in its habitable environment. The theoretical measure suggested for this biological process is the informational free-energy (IFE). The computational actualization of the FEP is carried out via the gradient descent method (GDM) of machine learning theory.

The information content of cognitive processes is encoded in biophysical matter as spatiotemporal patterns of the neuronal correlates of external causes. Therefore, any realistic attempt to account for brain function must conform to physical laws and principles. Despite its grand simplicity, however, the FEP framework embraces some extra-physical constructs. Two major ones are the generalized motions, which are non-Newtonian objects, and the GDM used to execute the brain's computational mechanism of perception and active inference. The GDM is useful for finding solutions to optimization problems, but it is not derived from a physical principle.

In this work, we cast the FEP into the framework of the principle of least action (PLA) in physics [2]. The goal is to remove the extra-physical constructs embedded in the FEP and to reformulate the GDM within the standard mechanics arena. Previously, we suggested setting up the minimization scheme of the IFE in the Lagrangian mechanics formalism [3], which contained only preliminary results. In the present formulation we specify the IFE as the information-theoretic Lagrangian and thus formally define the informational action (IA) as the time integral of the IFE. The PLA then prescribes that the viable brain minimizes the IA when encountering uninhabitable events, by selecting an optimal path among all possible dynamical configurations of the brain's neuronal network. Specifically, the minimization yields the mechanistic equations of motion of the brain states, which are inverting algorithms that infer the external causes of sensory inputs. The obtained Hamilton–Jacobi–Bellman-type equation prescribes the brain's recognition dynamics, which does not require the extra-physical concept of higher-order motions. Finally, a neurobiological implementation of the algorithm is presented which complies with the hierarchical, operative structure of the brain. In doing so, we adopt the local field potential and the local concentrations of ions in the Hodgkin–Huxley model as the effective brain states [4], so that the brain's recognition dynamics is implemented operatively in a neuro-centric picture. We hope that our formulation, conveying an interpretive and mechanistic description of how the brain's cognitive function may operate, will provide helpful guidance for future simulations.
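Schematically (with simplified notation; this is a sketch of the construction described above, not the abstract's full formalism), the informational action and its stationarity condition read:

```latex
% Informational action S as the time integral of the informational
% free-energy F, treated as the Lagrangian of the brain states \mu(t)
% given sensory input s(t); schematic notation only.
S[\mu] \;=\; \int_{t_0}^{t_1} F\bigl(\mu(t), \dot{\mu}(t); s(t)\bigr)\, dt ,
\qquad
\delta S = 0
\;\;\Longrightarrow\;\;
\frac{d}{dt}\,\frac{\partial F}{\partial \dot{\mu}}
\;-\;
\frac{\partial F}{\partial \mu} \;=\; 0 .
```

The resulting Euler-Lagrange equations play the role that the gradient-descent update plays in the standard FEP, but are obtained from a variational physical principle rather than imposed as an optimization heuristic.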

  1. Friston K. The free-energy principle: a unified brain theory? Nat Rev Neurosci. 2010;11:127–38.

  2. Landau LP. Classical mechanics. 2nd ed. New York: Springer; 1998.

  3. Kim CS. The adaptive dynamics of brains: Lagrangian formulation. Front Neurosci Conf Abstr Neuroinform. 2010. doi:10.3389/conf.fnins.2010.13.00046.

  4. Hodgkin A, Huxley A. A quantitative description of membrane current and its application to conduction and excitation in nerve. J Physiol. 1952;117:500–44.


P9 Multivariate spike train analysis using a positive definite kernel

Taro Tezuka1

1Faculty of Library, Information and Media Science, University of Tsukuba, Tsukuba, 305-0821, Japan

Correspondence: Taro Tezuka - tezuka@slis.tsukuba.ac.jp

BMC Neuroscience 2016, 17(Suppl 1):P9

Multivariate spike trains, obtained by recording multiple neurons simultaneously, are a key to uncovering information representation in the brain [1]. Other expressions used to refer to the same type of data include “multi-neuron spike train” [2] and “parallel spike train” [3]. One approach to analyzing spike trains is to use kernel methods, which are among the most powerful machine learning methods. Kernel methods rely on defining a symmetric positive-definite kernel suited to the given data. This work proposes a general way of extending kernels on univariate (or single-unit) spike trains to multivariate spike trains.

In this work, the mixture kernel, which naturally extends a kernel defined on univariate spike trains, is proposed and evaluated. Many univariate spike train kernels have been proposed [4–9], and the mixture kernel is applicable to any of them. Considered abstractly, a multivariate spike train is a set of time points at which different types of events occurred; in other words, it is a sample taken from a marked point process. The method proposed here is therefore applicable to other data with the same structure.

The mixture kernel is defined as a linear combination of symmetric positive-definite kernels on the components of the target data structure, in this case univariate spike trains. The name “mixture kernel” derives from the common use of the word “mixture” to indicate a linear combination in physics and machine learning, for example in Gaussian mixture models. One can prove that the mixture kernel is symmetric positive-definite if the coefficient matrix of the mixture is symmetric positive-semidefinite.
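One plausible reading of this construction can be sketched as follows (a Gaussian, mCI-style univariate kernel and a particular coefficient matrix A are assumed for illustration; they are not necessarily those evaluated in the abstract):

```python
import numpy as np

def k_uni(s, t, tau=0.05):
    """A simple symmetric positive-definite univariate spike train
    kernel: the sum of Gaussian interactions between all spike-time
    pairs (an assumed choice; any PSD kernel could be substituted)."""
    s, t = np.asarray(s, dtype=float), np.asarray(t, dtype=float)
    if s.size == 0 or t.size == 0:
        return 0.0
    d = s[:, None] - t[None, :]
    return float(np.exp(-d**2 / (2.0 * tau**2)).sum())

def mixture_kernel(X, Y, A):
    """Mixture kernel on multivariate spike trains X, Y (lists of
    per-unit spike-time arrays): a linear combination of the per-unit
    kernels weighted by a symmetric positive-semidefinite matrix A."""
    n = len(X)
    return sum(A[i, j] * k_uni(X[i], Y[j])
               for i in range(n) for j in range(n))
```

Writing A = BᵀB shows the mixture is an inner product of mixed feature maps, which is why positive semidefiniteness of A suffices for the combined kernel to be positive definite.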

The performance of the mixture kernel was evaluated using kernel ridge regression, estimating both the value of the parameter used to generate synthetic spike train data and the stimulus given to the animal while the spike trains were recorded. For synthetic data, multivariate spike trains were generated using homogeneous Poisson processes. For real data, the pvc-3 data set [2] from the CRCNS (Collaborative Research in Computational Neuroscience) data sharing website was used, which consists of 10-unit multivariate spike trains recorded from the primary visual cortex of a cat.

Acknowledgement: This work was supported in part by JSPS KAKENHI Grant Numbers 21700121, 25280110, and 25540159.

  1. 1.

    Gerstner W, Kistler WM, Naud R, Paninski L. Neuronal dynamics. Cambridge: Cambridge University Press; 2014.

  2. 2.

    Blanche T. Multi-neuron recordings in primary visual cortex, CRCNS.org; 2009.

  3. 3.

    Grun S, Rotter S. Analysis of parallel spike trains. Berlin: Springer; 2010.

  4. 4.

Paiva A, Park IM, Principe JC. A reproducing kernel Hilbert space framework for spike train signal processing. Neural Comput. 2009;21(2):424–49.

  5. 5.

    Park IM, Seth S, Rao M, Principe JC. Strictly positive definite spike train kernels for point process divergences. Neural Comput. 2012;24:2223–50.

  6. 6.

    Park IM, Seth S, Paiva A, Li L, Principe JC. Kernel methods on spike train space for neuroscience: a tutorial. Signal Process Mag. 2013;30(4):149–60.

  7. 7.

    Li L, Park IM, Brockmeier AJ, Chen B, Seth S, Francis JT, Sanchez JC, Principe JC. Adaptive inverse control of neural spatiotemporal spike patterns with a reproducing kernel Hilbert space (RKHS) framework. IEEE Trans Neural Syst Rehabil Eng. 2013;21(4):532–43.

  8. 8.

    Shpigelman L, Singer Y, Paz R, Vaadia E. Spikernels: embedding spiking neurons in inner product spaces. Adv Neural Inf Process Syst. 2003;15:125–32.

  9. 9.

    Eichhorn J, Tolias A, Zien A, Kuss M, Rasmussen CE, Weston J, Logothetis N, Scholkopf B. Prediction on spike data using kernel algorithms. Adv Neural Inf Process Syst. 2004;16:1367–74.


P10 Synchronization of burst periods may govern slow brain dynamics during general anesthesia

Pangyu Joo1

1Physics, POSTECH, Pohang, 37673, Republic of Korea

Correspondence: Pangyu Joo - pangyu32@postech.ac.kr

BMC Neuroscience 2016, 17(Suppl 1):P10

Researchers have utilized the electroencephalogram (EEG) as an important key to studying brain dynamics under general anesthesia. Representative EEG features in deep anesthesia are slow wave oscillations and burst suppression [1], and their characteristics are so different that they appear to have different origins. Here, we propose that the two features may be different aspects of the same phenomenon and show that the slow oscillation can arise from partial synchronization of bursting periods. To model the synchronization of burst periods, a modified version of Ching’s model of burst suppression [2] is used. Twenty pyramidal neurons and 20 fast-spiking neurons are divided into 10 areas, each composed of 2 pyramidal and 2 fast-spiking neurons, so that each area exhibits burst suppression independently. All pyramidal neurons are then connected all-to-all, and the connection strength modulates the degree of synchronization of burst periods. The action potentials of pyramidal neurons are binarized (1 when the membrane potential exceeds 0, and 0 otherwise), averaged over neurons, and convolved with a 50 ms square window to obtain the collective activity of the population. As shown in Fig. 13A, at high ATP recovery rates (JATP > 1) there is no suppression period, so the slow oscillation does not appear regardless of synchronization. At a low ATP recovery rate (JATP = 0.5), the slow oscillation appears with increasing amplitude and finally becomes burst suppression as the relative connection strength increases (Fig. 13B). When the ATP recovery rate is 0, the pyramidal neurons do not fire at all. These results suggest that the burst period synchronization model can explain several important features of the EEG during general anesthesia: the increasing slow oscillation amplitude as anesthesia deepens, the significantly high activity during bursting periods, and the peak-max phase–amplitude coupling in deep anesthesia.
Fig. 13

A The convolved signal with different ATP recovery rates (JATP) and relative connection strengths (C). B Standard deviation of the convolved signals
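The thresholding-and-smoothing step described above can be sketched as follows (the spike trains and the 1 ms bin width are hypothetical; only the binarize, average, and 50 ms square-window convolution steps come from the text):

```python
import numpy as np

def collective_activity(v, dt_ms=1.0, win_ms=50.0):
    """Binarize membrane potentials (1 where V > 0 mV, else 0), average
    across neurons, and smooth with a normalized 50 ms square window,
    yielding the collective-activity signal described above."""
    spikes = (v > 0.0).astype(float)     # neurons x time bins
    pop = spikes.mean(axis=0)            # population average per bin
    w = np.ones(int(win_ms / dt_ms))
    w /= w.size                          # normalized square window
    return np.convolve(pop, w, mode="same")
```

Applied to a synthetic population with a single bursting window, the smoothed signal rises during the burst and stays at zero during suppression.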

  1. 1.

    Purdon PL, Pierce ET, Mukamel EA, et al. Electroencephalogram signatures of loss and recovery of consciousness from propofol. PNAS. 2013;110(12):E1142–51.

  2. 2.

    Ching S, Purdon PL, Vijayan S, Kopell NJ, Brown EN. A neurophysiological–metabolic model for burst suppression. PNAS. 2012;109(8):3095–100.


P11 The ionic basis of heterogeneity affects stochastic synchrony

Young-Ah Rho1,4, Shawn D. Burton2,3, G. Bard Ermentrout1,3, Jaeseung Jeong4, Nathaniel N. Urban2,3

1Department of Mathematics, University of Pittsburgh, Pittsburgh, PA, USA 15260; 2Department of Biological Sciences, Carnegie Mellon University, Pittsburgh, PA, USA 15213; 3Center for the Neural Basis of Cognition, Pittsburgh, PA, USA 15213; 4Department of Bio and Brain Engineering/Program of Brain and Cognitive Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon, South Korea 34141

Correspondence: Young-Ah Rho - yarho75@gmail.com

BMC Neuroscience 2016, 17(Suppl 1):P11

Synchronization of neural oscillations is a prominent feature of neural activity and is thought to play an important role in neural coding. Theoretical and experimental studies have described several mechanisms for synchronization based on coupling strength and correlated noise input. In the olfactory system, recurrent and lateral inhibition mediated by dendrodendritic mitral cell–granule cell synapses is critical for synchronization, and intrinsic biophysical heterogeneity reduces the ability to synchronize. In our previous study, a simple phase model was used to examine how physiological heterogeneity in biophysical properties and firing rates across neurons affects correlation-induced synchronization (stochastic synchrony). That study showed that heterogeneity in firing rates and in the shapes of the phase response curves (PRCs) reduces output synchrony. Here, we extend the phase model to a conductance-based model to examine how the density of specific ion channels in mitral cells impacts stochastic synchrony. A recent study revealed that mitral cells are highly heterogeneous in the expression of the sag current, a hyperpolarization-activated inward current (Angelo, 2011). The variability in the sag contributes to the diversity of mitral cells, and we therefore asked how this variability influences synchronization. Mitral cell oscillations and bursting are also regulated by an inactivating potassium current (IA). Based on these ion channels, we examined the effect of changing the current densities (gA, gH) on the diversity of PRCs and on synchrony. To identify oscillatory patterns of bursting and repetitive spiking across gA and gH, two-parameter bifurcation analysis was performed in the presence and absence of noise. Increasing gH alone reduces the region of bursting but does not eliminate it completely, and the PRCs changed much more with respect to gA than to gH.
Focusing on varying gA, we next examined the role of gA density and firing rate in stochastic synchrony by introducing fluctuating correlated input resembling shared presynaptic drive. We found that heterogeneity in the A-type current strongly influenced stochastic synchrony, as predicted from the theoretically investigated PRCs, whereas diversity in firing rate alone did not account for it. In addition, a population heterogeneous with respect to gA, given a sufficiently large gA density, showed better stochastic synchrony than a homogeneous population at the same firing rate.

P12 Circular statistics of noise in spike trains with a periodic component

Petr Marsalek1,2

1Institute of Pathological Physiology, First Faculty of Medicine, Charles University in Prague, 128 53, Czech Republic; 2Czech Technical University in Prague, Zikova 1903/4, 166 36, Czech Republic

Correspondence: Petr Marsalek - petr.marsalek@lf1.cuni.cz

BMC Neuroscience 2016, 17(Suppl 1):P12

Introduction We estimate parameters of the inter-spike interval distributions in binaural neurons of the mammalian sound localization neural circuit, neurons of the lateral and medial superior olive [1]. We present equivalent descriptions of spike time probabilities using both standard and circular statistics. We show that the difference between the sine function and the beta density in the circular domain is negligible.

Results Estimation of the spike train probability density function parameters is presented in relation to harmonic and complex sound input. The resulting densities are expressed analytically with the use of harmonic and Bessel functions. Parameter fits are verified by numerical simulations of spike trains (Fig. 14).
Fig. 14

Comparison of circular probability density functions of sine and beta density. A Beta density with parameters a = b = 3.3818, matches closely that of the sine function, used as a probability density function (PDF). Beta density with parameters a = b = 3 solid line, is matched by sine function y = 1.05 − 1.1 cos(2π x/1.1). B Cumulative distribution function (CDF) is shown for these densities together with the difference between the two CDFs multiplied by 100 to visualize the comparison of the two distributions. C For testing different vector strengths we use uniform distributions with pre-set vector strengths (ρ = 0.8, 0.5 and 0.08)

Conclusions We use analytical techniques wherever possible. We calculate the one-to-one correspondence between vector strength parameters and the parameters of the circular distributions used to describe the data. We show here the introductory figure of our paper with the two representative circular densities. We also use experimental data [2, 3] and simulated data to compare with these theoretical distributions.
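The vector strength referred to above is the standard mean resultant length of the spike-phase distribution; a minimal sketch (spike times and period are hypothetical):

```python
import numpy as np

def vector_strength(spike_times, period):
    """Vector strength (mean resultant length) of spike phases relative
    to a periodic stimulus: 1 for perfect phase locking, near 0 for
    uniformly spread phases."""
    phases = 2.0 * np.pi * (np.asarray(spike_times) % period) / period
    return float(np.abs(np.mean(np.exp(1j * phases))))
```

The circular densities in Fig. 14 (sine vs. beta) are fit to exactly such phase data, with preset vector strengths such as ρ = 0.8, 0.5, and 0.08.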

Acknowledgements: Supported by the PRVOUK program no. 205024 at the Charles University in Prague. I acknowledge contributions to the analytical computations by Ondrej Pokora and simulation in Matlab by Peter G. Toth.

  1. 1.

    Bures Z, Marsalek P. On the precision of neural computation with interaural level differences in the lateral superior olive. Brain Res. 2013;1536:16–26.

  2. 2.

    Joris P, Carney L, Smith P, Yin T. Enhancement of neural synchronization in the anteroventral cochlear nucleus. I. Responses to tones at the characteristic frequency. J Neurophysiol. 1994;71(3):1022–36.

  3. 3.

    Joris P, Smith P, Yin T. Enhancement of neural synchronization in the anteroventral cochlear nucleus. II. Responses in the tuning curve tail. J Neurophysiol. 1994;71(3):1037–51.


P14 Representations of directions in EEG-BCI using Gaussian readouts

Hoon-Hee Kim1,2, Seok-hyun Moon3, Do-won Lee3, Sung-beom Lee3, Ji-yong Lee3, Jaeseung Jeong1,2

1Department of Bio and Brain Engineering and 2Program of Brain and Cognitive Engineering, College of Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon, South Korea, 34141; 3Korea Science Academy of KAIST, Busan, South Korea, 10547

Correspondence: Jaeseung Jeong - jsjeong@kaist.ac.kr

BMC Neuroscience 2016, 17(Suppl 1):P14

EEG (electroencephalography) is one of the most useful neuroimaging technologies and among the best options for BCI (brain–computer interface) applications, because EEG systems are portable, wireless, and wearable in most situations. A key objective of BCI is the physical control of machines, such as moving a cursor on a screen or steering a robot [1, 2]. In previous studies, motor imagery was used to represent the direction of movement [1, 2]: for example, left-hand imagery mapped to leftward movement, right-hand imagery to rightward movement, and imagery of both hands to forward movement. In this study, by contrast, we considered only brain signals recorded while a subject thinks of the direction of movement itself, not motor imagery. We designed recurrent neural networks consisting of 300–10,000 artificial linear neurons using the Echo State Network paradigm [3]. We recorded EEG signals using an Emotiv EPOC+, which has 16 channels (AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4, and two references). The raw data of all channels were normalized and then used as inputs to the recurrent neural networks. To represent directions, we built Gaussian readouts, each with a preferred direction and a fitted Gaussian tuning function (Fig. 15). The firing rate of a readout was high when the subject thought of its preferred direction and slightly lower when the subject thought of other directions. To implement these readouts, all neurons in the recurrent networks were linearly connected to all readouts, and the weights of these connections were trained by linear learning rules. We recorded EEG signals from five healthy subjects for each direction; the readouts showed well-fitted Gaussian direction preferences. In this study we considered only two dimensions, but many BCI settings involve three-dimensional space, so our Gaussian-readout approach should be extended to three dimensions.
Fig. 15

Design of recurrent neural networks and readouts
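A minimal sketch of the architecture in Fig. 15, under assumptions not stated in the abstract (reservoir size 300, spectral radius 0.9, ridge regression as the linear learning rule, 14 data channels):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 14, 300   # 14 EEG data channels, 300 reservoir units (assumed)

# Fixed random input and reservoir weights; rescaling the spectral
# radius below 1 encourages the echo state property.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with an input sequence u (time x channels);
    return the reservoir state sequence (time x units)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)
        states.append(x.copy())
    return np.array(states)

def gaussian_tuning(direction, preferred, width=0.5):
    """Gaussian readout target (radians): maximal at the readout's
    preferred direction, falling off smoothly for other directions."""
    d = np.angle(np.exp(1j * (direction - preferred)))   # wrapped difference
    return np.exp(-d**2 / (2.0 * width**2))

def train_readout(states, targets, ridge=1e-2):
    """Linear readout weights fitted by ridge regression, a common
    stand-in for the 'linear learning rules' mentioned above."""
    S = states
    return np.linalg.solve(S.T @ S + ridge * np.eye(S.shape[1]), S.T @ targets)
```

Each readout is trained against its own Gaussian tuning curve, so a population of readouts with different preferred directions encodes the thought direction by its profile of firing rates.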

  1. 1.

    Chae Y, Jeong J, Jo S. Toward brain-actuated humanoid robots: asynchronous direct control using an EEG-based BCI. IEEE Trans Robot. 2012;28(5):1131–44.

  2. 2.

    LaFleur K, Cassady K, Doud A, Shades K, Rogin E, He B. Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain–computer interface. J Neural Eng. 2013;10(4):046003.

  3. 3.

Jaeger H, Haas H. Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication. Science. 2004;304(5667):78–80.


P15 Action selection and reinforcement learning in basal ganglia during reaching movements

Yaroslav I. Molkov1, Khaldoun Hamade2, Wondimu Teka3, William H. Barnett1, Taegyo Kim2, Sergey Markin2, Ilya A. Rybak2

1Department of Mathematics and Statistics, Georgia State University, Atlanta, GA 30303, USA; 2Department of Neurobiology and Anatomy, Drexel University, Philadelphia, PA 19129, USA; 3Department of Mathematical Sciences, Indiana University – Purdue University, Indianapolis, IN 46202, USA

Correspondence: Yaroslav I. Molkov - ymolkov@gsu.edu

BMC Neuroscience 2016, 17(Suppl 1):P15

The basal ganglia (BG) comprise a number of interconnected nuclei collectively involved in a wide range of motor and cognitive behaviors. The commonly accepted theory is that the BG play a pivotal role in action selection and reinforcement learning, facilitated by the activity of dopaminergic neurons of the substantia nigra pars compacta (SNc). These dopaminergic neurons encode prediction errors when reward outcomes exceed or fall below anticipated values. The BG gate appropriate behaviors from the multiple motor-cortical command candidates arriving at the striatum (the BG’s input nucleus) while suppressing competing, inappropriate behaviors. The selected motor action is realized when the internal segment of the globus pallidus (GPi), the BG’s output nucleus, disinhibits the thalamic neurons corresponding to the gated behavior. The BG network performs motor command selection through facilitation of the appropriate behavior via the “direct” striatonigral (GO) pathway and inhibition of competing behaviors via the “indirect” striatopallidal (NOGO) pathway.

Several modeling studies have shown the plausibility of the above concept in simplified cases, e.g., for binary action selection in response to a binary cue. However, in these previous models the possible actions/behaviors were represented abstractly and did not have a detailed implementation as specific neuronal patterns actuating the musculoskeletal apparatus. To address these details, the motor system in the present study includes a 2D biomechanical arm model in the horizontal plane to simulate realistic reaching movements. The arm consists of two segments (upper arm and forearm) and has two joints (shoulder and elbow) controlled by four monoarticular (flexor and extensor at each joint) and two bi-articular (shoulder–elbow flexor and shoulder–elbow extensor) muscles. The neural component of the model includes the BG, the thalamus, the motor cortex, and spinal circuits. The low-level spinal circuitry contains six motoneurons (each controlling one muscle) and receives proprioceptive feedback from the muscles. Cortical neurons provide inputs to the spinal network. Their activity is calculated by solving an inverse problem (inverting the internal model) based on the initial position of the arm and the reaching distance and direction.

In the model, reaching movements in different directions were used as a set of possible behaviors. We simulated movements in response to a sensory cue defining the target arm position. The cortex generated signals corresponding to the cue and all possible motor commands and delivered these signals to the BG. The resulting neuronal patterns in the motor cortex were calculated as a convolution of the thalamic activity and all possible motor commands. The function of BG was to establish the association between the cue and the appropriate action(s) by adjusting weights of plastic corticostriatal projections through reinforcement learning. The BG model contained an exploratory mechanism, operating through the subthalamic nucleus (STN) that allowed the model to constantly seek better cue-action associations that deliver larger rewards. Reinforcement learning relied on the SNc dopaminergic signal that measured trial-to-trial changes in the reward value, defined by performance errors.

Using this model, we simulated several learning tasks under different unexpected perturbations. When a perturbation was introduced, the model was capable of quickly switching away from pre-learned associations and learning novel cue-action associations. The analysis of the model reveals several features that can be of general importance for brain control of movements: (1) potentiation of the cue-NOGO projections is crucial for the quick destruction of preexisting cue-action associations; (2) synaptic scaling (the decay of corticostriatal synaptic weights in the absence of dopamine-mediated potentiation/depression) has a relatively short time scale (10–20 trials); (3) quick learning is associated with relatively poor accuracy of the resultant movement. We suggest that the BG may be involved in a quick search for behavioral alternatives when conditions change, but not in the learning of skilled movements that require good precision.
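The reinforcement-learning principle underlying the model can be caricatured in a few lines (the toy task, learning rate, and exploration scheme below are illustrative stand-ins, not the arm/BG model described above):

```python
import numpy as np

rng = np.random.default_rng(0)
n_cues, n_actions = 4, 8
W = np.zeros((n_cues, n_actions))    # plastic corticostriatal weights

def select_action(cue, epsilon=0.3):
    """Striatal gating with exploration: usually pick the most strongly
    facilitated action (GO pathway), occasionally explore -- a crude
    stand-in for the STN-mediated exploratory mechanism in the model."""
    if rng.random() < epsilon:
        return int(rng.integers(n_actions))
    return int(np.argmax(W[cue]))

def dopamine_update(cue, action, reward, lr=0.2):
    """SNc-like teaching signal: the prediction error (reward minus the
    currently expected value) potentiates or depresses the active
    cue-action weight."""
    delta = reward - W[cue, action]   # reward-prediction error
    W[cue, action] += lr * delta
    return delta

# Toy task (assumed for illustration): cue c's rewarded reach "direction"
# is action (2*c) % n_actions.
for trial in range(2000):
    c = int(rng.integers(n_cues))
    a = select_action(c)
    r = 1.0 if a == (2 * c) % n_actions else 0.0
    dopamine_update(c, a, r)
```

After training, the weight matrix gates the rewarded action for each cue; introducing a perturbation (changing the rewarded mapping) would drive negative prediction errors that erode the old associations, analogous to the cue-NOGO potentiation discussed above.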

P17 Axon guidance: modeling axonal growth in T-junction assay

Csaba Forro1, Harald Dermutz1, László Demkó1, János Vörös1

1LBB, ETH Zürich, Zürich, 8051, Switzerland

Correspondence: Csaba Forro - forro@biomed.ee.ethz.ch

BMC Neuroscience 2016, 17(Suppl 1):P17

The current field of neuroscience investigates the brain at scales ranging from the whole organ, through brain slices, down to the single-cell level. Technological advances in the miniaturization of electrode arrays have enabled the investigation of neural networks comprising several neurons by recording electrical activity from every individual cell in the network. This level of complexity is key to the study of the core principles at play in the machinery of the brain: it is the first layer of complexity above the single cell that is still tractable for the human scientist without resorting to a ‘Big Data’ approach. In light of this, we strive to create topologically well-defined neural networks, akin to mathematical directed graphs, as model systems in order to study the basic mechanisms emerging in networks of increasing complexity and varying topology. This approach will also yield statistically sound and reproducible observations, something sought after in neuroscience [1].

The first step in realizing such a well-defined neural network is to reliably control the guidance of individual axons so as to connect the network of cells in a controlled way. For this purpose, we present a method based on obstacles that force the axon to turn one way or the other. The setup is made of polydimethylsiloxane (PDMS), microstructured by means of state-of-the-art photolithography. Two tunnels of 5 µm height are patterned into a block of 100 µm thick PDMS and connected in the shape of a T-junction (Fig. 16). Primary cortical neurons are inserted via entry holes at the base of the tunnels. The entry angle of the bottom tunnel (the “vertical part of the T”) into the junction is varied between 20° (steep entry) and 90° (perpendicular entry). We observe that axons prefer to turn towards the smaller angle. We show how this observed angular selectivity in axon guidance can be explained by a simple model and how this principle can be used to create topologically well-defined neural networks (Fig. 16B).
Fig. 16

A The T-junction assay with an entry angle of 20°. The axon is expected to prefer a right turn at this angle. B A simple model in which the direction of axon growth is proportional to the area (red) it can explore
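A minimal reading of the model in Fig. 16B can be sketched as follows (the linear dependence of explorable area on turning angle is an assumed simplification, not the authors' fitted model):

```python
def turn_probability_right(entry_angle_deg):
    """Minimal area-based turning model (cf. Fig. 16B): the growth cone
    favors each arm in proportion to the area it can explore ahead,
    approximated here as decreasing linearly with the turning angle.
    entry_angle_deg is the angle between the entering tunnel and the
    right arm, so a right turn is by entry_angle_deg degrees and a left
    turn is by 180 - entry_angle_deg degrees (assumed geometry)."""
    turn_r = float(entry_angle_deg)
    turn_l = 180.0 - turn_r
    w_r = 180.0 - turn_r   # smaller turn -> larger explorable area ahead
    w_l = 180.0 - turn_l
    return w_r / (w_r + w_l)
```

At a 20° entry this model predicts a strong preference for the small-angle (right) turn, while at 90° both turns are equally likely, matching the qualitative observation above.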

  1. 1.

    Button KS, et al. Power failure: why small sample size undermines the reliability of neuroscience. Nat Rev Neurosci. 2013;14(5):365–76.


P19 Transient cell assembly networks encode persistent spatial memories

Yuri Dabaghian1,2, Andrey Babichev1,2

1Department of Neurology Pediatrics, Baylor College of Medicine, Houston, TX 77030, USA; 2Department of Computational and Applied Mathematics, Rice University, Houston, TX, 77005, USA

Correspondence: Yuri Dabaghian - dabaghian@rice.edu

BMC Neuroscience 2016, 17(Suppl 1):P19

The reliability of our memories is nothing short of remarkable. Thousands of neurons die every day, synaptic connections appear and disappear, and the networks formed by these neurons constantly change due to various forms of synaptic plasticity. How can the brain develop a reliable representation of the world, learn, and retain memories despite, or perhaps because of, such complex dynamics? Here we consider the specific case of spatial navigation in mammals, which is based on mental representations of their environments—cognitive maps—provided by the network of hippocampal place cells: neurons that become active only in a particular region of the environment, known as their respective place fields. Experiments suggest that the hippocampal map is fundamentally topological, i.e., more similar to a subway map than to a topographical city map, and hence amenable to analysis by topological methods [1]. By simulating the animal’s exploratory movements through different environments, we studied how stable topological features of space are represented by assemblies of simulated neurons operating under a wide range of conditions, including variations in the place cells’ firing rates, the sizes of the place fields, and the number of cells in the population [2, 3]. In this work, we use methods from algebraic topology to understand how the dynamic connections between hippocampal place cells influence the reliability of spatial learning. We find that although the hippocampal network is highly transient, the overall spatial map encoded by the place cells is stable.
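The stability claim can be illustrated with a toy version of the argument (a ring environment and an overlap-based coactivity graph are assumed for illustration; the actual study uses persistent homology, not this simplification):

```python
import numpy as np

def coactivity_graph(centers, field_radius):
    """Edges between place cells whose circular place fields overlap on
    a unit ring; the graph's loop structure reflects the topology of
    the environment rather than any individual cell."""
    n = len(centers)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            d = abs(centers[i] - centers[j])
            d = min(d, 1.0 - d)             # distance on the ring
            if d < 2.0 * field_radius:
                edges.add((i, j))
    return edges

def covers_ring(centers, field_radius):
    """True if the union of place fields still covers the whole ring,
    i.e. every gap between adjacent field centers is bridged."""
    c = np.sort(np.asarray(centers, dtype=float))
    gaps = np.diff(np.concatenate([c, [c[0] + 1.0]]))
    return bool(np.max(gaps) < 2.0 * field_radius)
```

Removing a substantial fraction of cells changes the graph but, as long as the surviving fields still cover the ring, the encoded topology of the environment survives the turnover.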

Acknowledgements: The work was supported by the NSF 1422438 grant and by the Houston Bioinformatics Endowment Fund.

  1. 1.

    Dabaghian Y, Brandt VL, Frank LM. Reconceiving the hippocampal map as a topological template. eLife. 2014. doi:10.7554/eLife.03476.

  2. 2.

    Dabaghian Y, Mémoli F, Frank L, Carlsson G. A topological paradigm for hippocampal spatial map formation using persistent homology. PLoS Comput Biol. 2012;8:e1002581.

  3. 3.

    Arai M, Brandt V, Dabaghian Y. The effects of theta precession on spatial learning and simplicial complex dynamics in a topological model of the hippocampal spatial map. PLoS Comput Biol. 2014;10:e1003651.


P20 Theory of population coupling and applications to describe high order correlations in large populations of interacting neurons

Haiping Huang1

1RIKEN Brain Science Institute, Wako-shi, Saitama, Japan

Correspondence: Haiping Huang - physhuang@gmail.com

BMC Neuroscience 2016, 17(Suppl 1):P20

Correlations among neurons’ spiking activities play a prominent role in deciphering the neural code, and various models have been proposed to explain pairwise correlations in population activity. Modeling these correlations sheds light on the functional organization of the nervous system. In this study, we interpret correlations in terms of population coupling, a concept recently proposed to understand the multi-neuron firing patterns of the visual cortex of mouse and monkey [1]. We generalize population coupling to higher order (PC2), characterizing the relationship of pairwise firing with the population activity. We derive a practical dimensionality-reduction method for extracting the low-dimensional representation parameters and test our method on different types of neural data, including ganglion cells in the salamander retina onto which a repeated natural movie was projected [2], and layer 2/3 as well as layer 5 cortical cells in the medial prefrontal cortex (MPC) of behaving rats [3].

For the retinal data, by considering the correlation between pairwise firing activity and the global population activity, i.e., the second-order population coupling, three-cell correlations could be partially predicted (64.44 %), which suggests that PC2 acts as a key circuit variable for third-order correlations. The interaction matrix revealed here may be related to the previously reported overlapping modular structure of retinal neuron interactions [4], in which neurons interact locally with their adjacent neighbors; in particular, this feature is scalable and applicable to larger networks.

About 94.79 % of three-cell correlations are explained by PC2 in the MPC circuit. The PC2 matrix shows a clear hub structure in the cortical circuit: some neurons interact strongly with a large portion of the population, and such neurons may play a key role in shaping the collective spiking behavior during the working-memory task. The hubs and non-local effects are consistent with findings reported in the original experimental paper [3].
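First-order population coupling, and one plausible sketch of its second-order extension PC2 (the exact estimator used in the study may differ; binned spike trains are assumed):

```python
import numpy as np

def population_coupling(S):
    """First-order population coupling: correlation of each neuron's
    binned spike train with the summed activity of all other neurons
    (after Okun et al. [1]).  S is a neurons x time-bins array."""
    n = S.shape[0]
    pop = S.sum(axis=0)
    return np.array([np.corrcoef(S[i], pop - S[i])[0, 1] for i in range(n)])

def population_coupling2(S):
    """Second-order population coupling (PC2 sketch): correlation of the
    joint firing of each pair (product of their binned trains) with the
    population activity of the remaining neurons."""
    n = S.shape[0]
    pop = S.sum(axis=0)
    pc2 = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            joint = S[i] * S[j]            # pairwise coincident firing
            rest = pop - S[i] - S[j]
            if joint.std() > 0:
                pc2[i, j] = pc2[j, i] = np.corrcoef(joint, rest)[0, 1]
    return pc2
```

Rows of the PC2 matrix with many large entries correspond to the hub-like neurons described above, whose pairwise firing tracks the rest of the population.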

Acknowledgements: We are grateful to Shigeyoshi Fujisawa and Michael J Berry for sharing the cortical and retinal data with us, respectively. We also thank Hideaki Shimazaki and Taro Toyoizumi for stimulating discussions. This work was supported by the program for Brain Mapping by Integrated Neurotechnologies for Disease Studies (Brain/MINDS) from the Japan Agency for Medical Research and Development, AMED.

  1. 1.

    Okun M, Steinmetz NA, Cossell L, Iacaruso MF, Ko H, Bartho P, et al. Diverse coupling of neurons to populations in sensory cortex. Nature. 2015;521:511–15.

  2. 2.

    Tkacik G, Marre O, Amodei D, Schneidman E, Bialek W, Berry II MJB. Searching for collective behavior in a large network of sensory neurons. PLoS Comput Biol. 2014;10:e1003408.

  3. 3.

    Fujisawa S, Amarasingham A, Harrison MT, Buzsaki G. Behavior-dependent short-term assembly dynamics in the medial prefrontal cortex. Nat Neurosci. 2008;11:823–33.

  4. 4.

    Ganmor E, Segev R, Schneidman E. The architecture of functional interaction networks in the retina. J Neurosci. 2011;31(8):3044–54.


P21 Design of biologically-realistic simulations for motor control

Sergio Verduzco-Flores1

1Computational Neuroscience Unit, Okinawa Institute of Science and Technology, Okinawa 1919-1, Japan

Correspondence: Sergio Verduzco-Flores - sergio.verduzco@oist.jp

BMC Neuroscience 2016, 17(Suppl 1):P21

Several computational models of motor control, although apparently feasible, fail when simulated in 3-dimensional space with redundant manipulators [1, 2]. Moreover, it has become apparent that the details of musculoskeletal simulations, such as the muscle model used, can fundamentally affect the conclusions of a computational study [3].

There would be great benefits from being able to test theories involving motor control within a simulation framework that brings realism in the musculoskeletal model, and in the networks that control movements. In particular, it would be desirable to have: (1) a musculoskeletal model considered to be research-grade within the biomechanics community, (2) afferent information provided by standard models of the spindle afferent and the Golgi tendon organ, (3) muscle stimulation provided by a spiking neural network that follows the basic known properties of the spinal cord, and (4) a cerebellar network as part of adaptive learning.

Creating this type of model is only now becoming practical, not only due to faster computers, but also due to properly validated musculoskeletal models and simulation platforms from the biomechanics community, as well as mature software and simulation techniques from the computational neuroscience community. We show how these can be harnessed to create simulations that are grounded both in physics and in neural implementation. This pairing of computational neuroscience and biomechanics is sure to bring further insights into the workings of the central nervous system.

  1. 1.

    Gielen S. Review of models for the generation of multi-joint movements in 3D. In: Sternad D, editor. Progress in motor control. New-York: Springer; 2009.

  2. 2.

    Verduzco-Flores SO, O’Reilly RC. How the credit assignment problems in motor control could be solved after the cerebellum predicts increases in error. Front Comput Neurosci. 2015;9:39.

  3. 3.

    Gribble PL, Ostry DJ, Sanguineti V, Laboissière R. Are complex control signals required for human arm movement? J Neurophysiol. 1998;79:1409–24.


P22 Towards understanding the functional impact of the behavioural variability of neurons

Filipa Dos Santos1, Peter Andras1

1School of Computing and Mathematics, Keele University, Newcastle-under-Lyme, ST5 5BG, UK

Correspondence: Filipa Dos Santos - f.d.s.brandao@keele.ac.uk

BMC Neuroscience 2016, 17(Suppl 1):P22

The same neuron may play different functional roles in the neural circuits to which it belongs. For example, neurons in the Tritonia pedal ganglia may participate in variable phases of the swim motor rhythms [1]. While such neuronal functional variability is likely to play a major role in the delivery of the functionality of neural systems, it is difficult to study in most nervous systems. We work on the pyloric rhythm network of the crustacean stomatogastric ganglion (STG) [2]. Typically, network models of the STG treat neurons of the same functional type as a single model neuron (e.g. PD neurons), assuming the same conductance parameters for these neurons and implying their synchronous firing [3, 4]. However, simultaneous recordings of PD neurons show differences between the timings of their spikes, which may indicate functional variability of these neurons. Here we modelled the two PD neurons of the STG separately in a multi-neuron model of the pyloric network. Our neuron models comply with known correlations between conductance parameters of ionic currents. Our results reproduce the experimental finding of an increasing time distance between spikes originating from the two model PD neurons during their synchronised burst phase. The PD neuron with the larger calcium conductance generates its spikes before the other PD neuron, and larger potassium conductance values in the follower neuron imply longer delays between spikes, see Fig. 17.
Fig. 17

The time distances between the first and second spikes of the simulated PD neurons as a function of the gK and gCaT conductances of the neuron with variable conductances. A First spikes. B Second spikes. The PD neuron with fixed conductances had gK = 1.5768 μS and gCaT = 0.0225 μS

Neuromodulators change the conductance parameters of neurons and maintain the ratios of these parameters [5]. Our results show that such changes may shift the individual contribution of two PD neurons to the PD-phase of the pyloric rhythm altering their functionality within this rhythm. Our work paves the way towards an accessible experimental and computational framework for the analysis of the mechanisms and impact of functional variability of neurons within the neural circuits to which they belong.

  1. Hill ES, Vasireddi SK, Bruno AM, Wang J, Frost WN. Variable neuronal participation in stereotypic motor programs. PLoS One. 2012;7:1–11.

  2. Bucher D, Johnson CD, Marder E. Neuronal morphology and neuropil structure in the stomatogastric ganglion. J Comp Neurol. 2007;501:185–205.

  3. Soto-Treviño C, Rabbah P, Marder E, Nadim F. Computational model of electrically coupled, intrinsically distinct pacemaker neurons. J Neurophysiol. 2005;94:590–604.

  4. Golowasch J, Casey M, Abbott LF, Marder E. Network stability from activity-dependent regulation of neuronal conductances. Neural Comput. 1999;11:1079–96.

  5. Khorkova O, Golowasch J. Neuromodulators, not activity, control coordinated expression of ionic currents. J Neurosci. 2007;27:8709–18.


P23 Different oscillatory dynamics underlying gamma entrainment deficits in schizophrenia

Christoph Metzner1, Achim Schweikard2, Bartosz Zurowski3

1Science and Technology Research Institute, University of Hertfordshire, Hatfield, United Kingdom; 2Institute for Robotics and Cognitive Systems, University of Luebeck, Luebeck, Germany; 3Department of Psychiatry, University of Luebeck, Schleswig–Holstein, Luebeck, Germany

Correspondence: Christoph Metzner - c.metzner@herts.ac.uk

BMC Neuroscience 2016, 17(Suppl 1):P23

In recent years, a significant number of biomarkers and endophenotypic signatures of psychiatric illnesses have been identified; however, only a very limited number of computational models in support thereof have been described so far [1]. Furthermore, the few existing computational models typically investigate only one possible mechanism in isolation, disregarding the potential multifactoriality of the network behaviour [2]. Here we describe a computational instantiation of an endophenotypic finding for schizophrenia, an impairment of gamma entrainment in auditory click paradigms [3].

We used a model of primary auditory cortex from Beeman [4] and simulated a click entrainment paradigm with stimulation at 40 Hz, to investigate gamma entrainment deficits, and at 30 Hz as a control condition. We explored this multifactoriality by performing an extensive parameter search (approx. 4000 simulations), focusing on synaptic and connectivity parameters of the fast-spiking inhibitory interneurons in the model (i.e. the number and strength of inhibitory connections, and the GABAergic decay times at I-to-E and I-to-I connections, varied independently). We performed a time–frequency analysis of simulated EEG signals and extracted the power in the 40 Hz and 30 Hz bands, respectively. Using the power in the 40 Hz band under 40 Hz stimulation, we identified regions in the parameter space showing strong reductions in gamma entrainment. For these we calculated cycle-averaged EEG signals and spike time histograms of both network populations, in order to explore the dynamics underlying the reduction in gamma power.
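The band-power readout used here can be sketched with a plain FFT periodogram. This is a generic illustration with a hypothetical test signal and arbitrary parameter values, not the authors' analysis pipeline:

```python
import numpy as np

def band_power(signal, fs, f0, half_width=2.0):
    """Power of `signal` in the band f0 +/- half_width Hz, from a raw FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f0 - half_width) & (freqs <= f0 + half_width)
    return float(psd[mask].sum())

# Toy "EEG": a 40 Hz entrained component plus noise, 500 ms sampled at 1 kHz
rng = np.random.default_rng(0)
fs = 1000.0
t = np.arange(0, 0.5, 1.0 / fs)
eeg = np.sin(2 * np.pi * 40.0 * t) + 0.3 * rng.standard_normal(t.size)

p40 = band_power(eeg, fs, 40.0)   # stimulation band
p30 = band_power(eeg, fs, 30.0)   # control band
```

For an entrained 40 Hz component such as this one, the power in the 40 Hz band dominates the 30 Hz control band; a reduced 40 Hz/30 Hz ratio is the kind of entrainment deficit the parameter search screens for.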

We find three regions in the parameter space which show strong reductions in gamma power. These three regions, however, have very different parameter settings and show very different oscillatory dynamics. The first, which produces the strongest reduction, is characterised by a strong prolongation of decay times at I-to-E synapses and strong and numerous I-to-E connections. Cycle-averaged spike histograms show a broadening of distributions, indicating that overall synchrony is reduced, which leads to the strong reduction in gamma power. However, this parameter setting also produced a strong reduction of power in the 30 Hz control condition, which is not seen experimentally. The second region is characterised by prolonged I-to-I decay times together with numerous and strong I-to-I connections. Here, a second peak appears in the cycle-averaged spike histogram of the excitatory population, which leads to a loss of synchrony and thus a reduction in gamma power. The third parameter region is also characterised by prolonged I-to-I decay times, but is associated with a reduction in I-to-I connection numbers and strengths together with strong I-to-E connections. Here, we found that in every second cycle the spike histogram of the inhibitory neurons showed two peaks, one at the beginning and one in the middle of the cycle. This second peak then inhibited the excitatory neurons’ response to the next stimulation. Hence, the EEG signal showed beat-skipping, i.e. every second gamma peak was suppressed, resulting in a decrease in gamma power.

By performing an extensive parameter search in an in silico instantiation of an endophenotypic finding for schizophrenia, we have identified distinct regions of the parameter space that give rise to network-level behaviour analogous to that found electrophysiologically in schizophrenic patients [3]. However, the oscillatory dynamics underlying this behaviour differ substantially across regions. These regions might correspond to different subtypes of schizophrenic patients, which, because of their different underlying dynamics, might require different targets for alleviating the deficits.

  1. Siekmeier P. Computational modeling of psychiatric illnesses via well-defined neurophysiological and neurocognitive biomarkers. Neurosci Biobehav Rev. 2015;57:365–80.

  2. Pavão R, Tort ABL, Amaral OB. Multifactoriality in psychiatric disorders: a computational study of schizophrenia. Schizophr Bull. 2015;41(4):980–88.

  3. Kwon JS, O’Donnell BF, Wallenstein GV, Greene RW, Hirayasu Y, Nestor PG, Hasselmo ME, Potts GF, Shenton ME, McCarley RW. Gamma frequency-range abnormalities to auditory stimulation in schizophrenia. Arch Gen Psychiatry. 1999;56(11):1001–5.

  4. Beeman D. A modeling study of cortical waves in primary auditory cortex. BMC Neurosci. 2013;14(Suppl 1):P23.


P24 Memory recall and spike frequency adaptation

James P. Roach1, Leonard M. Sander2,3, Michal R. Zochowski2,3,4

1Neuroscience Graduate Program, University of Michigan, Ann Arbor, MI 48109, USA; 2Center for the Study of Complex Systems, University of Michigan, Ann Arbor, MI 48109, USA; 3Department of Physics, University of Michigan, Ann Arbor, MI 48109, USA; 4Biophysics Program, University of Michigan, Ann Arbor, MI 48109, USA

Correspondence: James P. Roach - roachjp@umich.edu

BMC Neuroscience 2016, 17(Suppl 1):P24

In the brain, representations of the external world are encoded by patterns of neural activity. It is critical that representations be stable, but still easy to move between. This phenomenon has been modeled at the network level as auto-associative memory. In auto-associative network models, such as the Hopfield network, representations, or memories, are stored within synaptic weights and form stable fixed points, or attractors [1]. Spike frequency adaptation (SFA) provides a biologically plausible mechanism for switching between stable fixed points in the Hopfield network. In the present work we show that for low levels of SFA, networks will stabilize in the representation corresponding to the nearest memory in activity space, regardless of its strength. In networks with higher levels of SFA, only the pattern corresponding to the strongest memory, the global minimum in activity space, remains stable. The effects of SFA are similar to those of fast, or thermodynamic, noise, but SFA also allows for deterministic destabilization of memories, leading to periodic activation of memories through time. We argue that control of SFA level is a universal mechanism for network-wide attractor selectivity. SFA is tightly regulated by the neurotransmitter acetylcholine (ACh) and can be changed on behaviorally relevant timescales. To support this claim we demonstrate that SFA controls the selectivity of spatial attractors in a biophysical model of cholinergic modulation in cortical networks [2, 3]. This model produces localized bumps of firing. A region with enhanced recurrent excitation acts as an attractor for the bump location, and selectivity for these regions quickly diminishes as the SFA level increases [3]. When multiple spatial attractors of varying strengths are stored in a network, moderate increases in SFA level lead to the weak attractors being destabilized and activity localizing within the strongest attractor. This effect is qualitatively similar to the effects of SFA in the Hopfield network.
These results indicate that ACh controls memory recall and perception within the cortex by regulation of SFA and explain the important role cholinergic modulation plays in cognitive functions such as attention and memory consolidation [4].
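The destabilizing effect of SFA on Hopfield attractors can be illustrated with a minimal toy sketch (hypothetical network size, adaptation rate, and gain; not the authors' implementation): without adaptation the cued memory is a fixed point, whereas a strong adaptation current deterministically destabilizes the retrieved pattern, allowing periodic reactivation.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200
xi = rng.choice([-1, 1], size=(2, N))          # two stored binary patterns
W = (np.outer(xi[0], xi[0]) + np.outer(xi[1], xi[1])) / N
np.fill_diagonal(W, 0.0)

def run(s, g_sfa, tau=20.0, steps=300):
    """Synchronous Hopfield updates with a slow per-unit adaptation current.
    Returns the overlap of the state with each stored pattern over time."""
    a = np.zeros(N)                            # adaptation variable per unit
    overlaps = []
    for _ in range(steps):
        h = W @ s - g_sfa * a                  # SFA opposes persistently active units
        s = np.sign(h + 1e-12)
        a += ((s > 0).astype(float) - a) / tau # a tracks recent activity
        overlaps.append(xi @ s / N)
    return np.array(overlaps)

cue = xi[1].copy()
cue[: N // 10] *= -1                           # corrupt 10 % of the cue

m_static = run(cue.copy(), g_sfa=0.0)          # no SFA: retrieval is a fixed point
m_switch = run(cue.copy(), g_sfa=2.0)          # strong SFA: the memory is destabilized
```

In `m_static` the overlap with the cued pattern stays near 1, while in `m_switch` the adaptation current builds up in the active units until the retrieved pattern collapses, mirroring the deterministic destabilization described above.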

Acknowledgements: JPR was supported by an NSF Graduate Research Fellowship Program under Grant No. DGE 1256260 and a UM Rackham Merit Fellowship. MRZ and LMS were supported by NSF PoLS 1058034.

  1. Hopfield JJ. Neural networks and physical systems with emergent collective computational abilities. Proc Natl Acad Sci USA. 1982;79:2554–8.

  2. Stiefel KM, Gutkin BS, Sejnowski TJ. The effects of cholinergic neuromodulation on neuronal phase-response curves of modeled cortical neurons. J Comput Neurosci. 2008;26:289–301.

  3. Roach JP, Ben-Jacob E, Sander LM, Zochowski MR. Formation and dynamics of waves in a cortical model of cholinergic modulation. PLoS Comput Biol. 2015;11(8):e1004449.

  4. Hasselmo ME, Sarter M. Modes and models of forebrain cholinergic neuromodulation of cognition. Neuropsychopharmacology. 2011;36:52–73.


P25 Stability of neural networks and memory consolidation preferentially occur near criticality

Quinton M. Skilling1, Nicolette Ognjanovski2, Sara J. Aton2, Michal Zochowski1,3

1Biophysics Program, University of Michigan, Ann Arbor, MI 48109 USA; 2Department of Molecular, Cellular, and Developmental Biology, University of Michigan, Ann Arbor, MI, 48109 USA; 3Department of Physics, University of Michigan, Ann Arbor, MI 48109 USA

Correspondence: Michal Zochowski - michalz@umich.edu

BMC Neuroscience 2016, 17(Suppl 1):P25

Dynamic neural representations underlie cognitive processing and are an outcome of complex interactions of network structural properties and cellular dynamics. We have developed a new framework to study the dynamics of network representations during rapid memory formation in the hippocampus in response to contextual fear conditioning (CFC) [1]. Experimentally, this memory paradigm is achieved by exposing mice to foot shocks while in a novel environment and later testing for behavioral responses when they are reintroduced to that environment. We apply the average minimum distance (AMD) functional connectivity algorithm to spiking data recorded before, during, and after CFC using implanted stereotrodes. Comparing changes in functional connectivity using cosine similarity, we find that stable functional representations correlate well with animal performance in learning. Using extensive computer simulations, we show that the most robust changes compared to baseline occur when the system resides near criticality. We attribute these results to the emergence of long-range correlations during the initial process of memory formation. Furthermore, we have developed a generic model using a generalized Hopfield framework to link the formation of a novel memory representation to changes in functional stability. The network initially stores a single representation, exemplifying biologically pre-existing (old) memories, and is then presented with a new representation by freezing a randomly chosen fraction of nodes in a novel pattern. We show that imposing fractional input of the new representation may partially stabilize this representation near the phase transition (critical) point. We further show that invoking synaptic plasticity rules may fully stabilize this new representation only when the dynamics of the network reside near criticality.
Taken together these results show, for the first time, that only when the network is at criticality can it stabilize novel memory representations, the dynamical regime which also yields an increase of network stability. Furthermore, our results match well experimental data observed from CFC experiments.
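The cosine-similarity comparison of functional connectivity matrices can be sketched as follows. This is a generic illustration on random matrices with hypothetical sizes; the AMD estimation step itself is not reproduced:

```python
import numpy as np

def cosine_similarity(A, B):
    """Cosine similarity between two functional connectivity matrices,
    compared as flattened vectors of pairwise connection weights."""
    a, b = A.ravel(), B.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(1)
fc_baseline = rng.random((30, 30))                       # hypothetical pre-CFC network
fc_stable = fc_baseline + 0.05 * rng.random((30, 30))    # small drift: stable representation
fc_remodelled = rng.random((30, 30))                     # unrelated network: remodelled

s_stable = cosine_similarity(fc_baseline, fc_stable)
s_remodelled = cosine_similarity(fc_baseline, fc_remodelled)
```

A stable representation keeps the similarity near 1 across recording epochs, while a remodelled network drops toward the chance level for the weight distribution.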

  1. Ognjanovski N, Maruyama D, Lashner N, Zochowski M, Aton SJ. CA1 hippocampal network activity changes during sleep-dependent memory consolidation. Front Syst Neurosci. 2014;8:61.


P26 Stochastic oscillation in self-organized critical states of small systems: sensitive resting state in neural systems

Sheng-Jun Wang1,2, Guang Ouyang2, Jing Guang3, Mingsha Zhang3, K. Y. Michael Wong4, Changsong Zhou2,5,6

1Department of Physics, Shaanxi Normal University, Xi’An City, ShaanXi Province, China; 2Department of Physics and Centre for Nonlinear Studies, Institute of Computational and Theoretical Studies, Hong Kong Baptist University, Kowloon Tong, Hong Kong; 3State Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University, Beijing, China; 4Department of Physics, Hong Kong University of Science and Technology, Clear Water Bay, Hong Kong; 5Beijing Computational Science Research Center, Beijing 100084, People’s Republic of China; 6Research Centre, HKBU Institute of Research and Continuing Education, Shenzhen, China

Correspondence: Changsong Zhou - cszhou@hkbu.edu.hk

BMC Neuroscience 2016, 17(Suppl 1):P26

Self-organized critical states (SOCs) and stochastic oscillations (SOs) are simultaneously observed in neural systems [1], which appears theoretically contradictory, since SOCs are characterized by scale-free avalanche sizes whereas oscillations indicate a typical scale. Here, we show that SOs can emerge in the SOC states of small systems due to temporal correlation between large avalanches at the finite-size cutoff, resulting from the accumulation-release process in SOCs. In contrast, a critical branching process without accumulation-release dynamics cannot exhibit oscillations. The reconciliation of SOCs and SOs is demonstrated both in the sandpile model and, robustly, in biologically plausible neuronal networks. The oscillations can be suppressed if external inputs eliminate the prominent slow accumulation process, providing a potential explanation of the widely studied Berger effect, or event-related desynchronization, in neural responses. The features of neural oscillations and their suppression are confirmed during task processing in monkey eye-movement experiments. Our results suggest that finite-size, columnar neural circuits may play an important role in generating neural oscillations around critical states, potentially enabling the functional advantages of both SOCs and oscillations for sensitive responses to transient stimuli. The results have been published in [2].
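The accumulation-release process can be illustrated with the classic Bak–Tang–Wiesenfeld sandpile (a standard textbook model, not the specific neuronal networks of the study; grid size and drive count are arbitrary): slow driving accumulates grains, and toppling releases them in avalanches whose sizes range from zero up to system-spanning events at the finite-size cutoff.

```python
import numpy as np

rng = np.random.default_rng(2)
L = 16
grid = rng.integers(0, 4, size=(L, L))         # BTW sandpile, toppling threshold 4

def drive_and_relax(grid):
    """Drop one grain at a random site, then topple until stable.
    Returns the avalanche size (total number of topplings)."""
    i, j = rng.integers(0, L, size=2)
    grid[i, j] += 1
    size = 0
    while True:
        unstable = np.argwhere(grid >= 4)
        if unstable.size == 0:
            return size
        for x, y in unstable:
            grid[x, y] -= 4                    # release the accumulated grains
            size += 1
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < L and 0 <= ny < L:
                    grid[nx, ny] += 1          # grains fall off at the open boundaries

sizes = [drive_and_relax(grid) for _ in range(5000)]
# Most drives cause little or no toppling; occasionally a large avalanche
# near the finite-size cutoff releases the slowly accumulated grains.
```

It is the temporal correlation between these rare large release events, absent in a plain critical branching process, that the abstract identifies as the origin of the stochastic oscillations.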

Acknowledgements: This work was partially supported by Hong Kong Baptist University Strategic Development Fund, NSFCRGC Joint Research Scheme HKUST/NSFC/12-13/01 (or N-HKUST 606/12), RGC (Grants No. 604512, No. 605813, and No. 12302914), NSFC (Grants No.11275027, No. 11328501, and No. 11305098), and the Fundamental Research Funds for the Central Universities (Grant No. GK201302008).

  1. Gireesh ED, Plenz D. Neuronal avalanches organize as nested theta- and beta/gamma-oscillations during development of cortical layer 2/3. Proc Natl Acad Sci USA. 2008;105:7576–81.

  2. Wang SJ, Ouyang G, Guang J, Zhang MS, Wong KYM, Zhou CS. Stochastic oscillation in self-organized critical states of small systems: sensitive resting state in neural systems. Phys Rev Lett. 2016;116:018101.


P27 NeuroField: a C++ library for fast simulation of 2D neural field models

Peter A. Robinson1,2, Paula Sanz-Leon1,2, Peter M. Drysdale1,2, Felix Fung1,2, Romesh G. Abeysuriya3, Chris J. Rennie1,2, Xuelong Zhao1,2

1School of Physics, University of Sydney, Sydney, New South Wales, 2006, Australia; 2Center for Integrative Brain Function, University of Sydney, Sydney, New South Wales, 2006, Australia

Correspondence: Paula Sanz-Leon - paula.sanz-leon@sydney.edu.au

BMC Neuroscience 2016, 17(Suppl 1):P27

Neural field theory [1] has addressed numerous questions regarding brain dynamics and its interactions across many scales, becoming a highly flexible and unified framework for the study and prediction of experimental observables of the electrical activity of the brain. These include EEG spectra [2, 3], evoked response potentials, age-related changes to the physiology of the brain [4], epileptic seizures [5, 6], and synaptic plasticity phenomena [7]. However, numerical simulators of neural field models are not widely available, despite their extreme usefulness in cases where analytic solutions are less tractable. This work introduces the features of NeuroField, a research-ready library for simulating a wide range of neural-field-based systems involving multiple structures (e.g., cortex, cortex and thalamic nuclei, and basal ganglia). The link between a given neural field model, its mathematical representation (i.e., a system of delay partial differential equations with spatially periodic boundary conditions) and its computational implementation is described. The resulting computational model can represent systems ranging from spatially extended neural fields to neural-mass-like systems, and it has been extensively validated against analytical solutions and against experiment [1–10]. To illustrate its flexibility, a range of simulations modeling a variety of arousal-, sleep- and epilepsy-state phenomena is presented [8, 9]. NeuroField is written in object-oriented C++ and is bundled with MATLAB routines for quantitative offline analysis, such as spectral and dynamic spectral analysis.
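The class of equations involved can be illustrated with a toy one-dimensional damped-wave neural field on a periodic grid. This is a minimal explicit sketch with hypothetical parameter values and a simple sigmoid; it is not NeuroField's API or numerics:

```python
import numpy as np

# Toy damped-wave neural field on a periodic 1-D grid:
#   (1/gamma^2) phi_tt + (2/gamma) phi_t + phi - r^2 phi_xx = S(phi)
L_dom, nx, dt, T = 0.5, 128, 1e-4, 0.05        # domain (m), grid points, step (s), duration (s)
gamma, r = 100.0, 0.05                         # damping rate (1/s), axonal range (m)
dx = L_dom / nx
S = lambda phi: 0.8 * np.tanh(phi)             # sigmoidal firing-rate response

x = np.linspace(0.0, L_dom, nx, endpoint=False)
phi = 0.01 * np.exp(-((x - L_dom / 2) ** 2) / 1e-3)   # small initial activity bump
phidot = np.zeros(nx)

for _ in range(int(T / dt)):
    lap = (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx**2   # periodic boundaries
    phiddot = gamma**2 * (S(phi) - phi + r**2 * lap) - 2 * gamma * phidot
    phidot += dt * phiddot                     # semi-implicit Euler step
    phi += dt * phidot
# The bump propagates outward as a damped wave and decays toward rest.
```

The delay terms and multi-population coupling that make the full system a delay-PDE are omitted here; the sketch shows only the spatial wave propagation and damping that the periodic boundary conditions act on.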

  1. Robinson PA, Rennie CJ, Wright JJ. Propagation and stability of waves of electrical activity in the cortex. Phys Rev E. 1997;56:826–40.

  2. Robinson PA, Rennie CJ, Wright JJ, Bahramali H, Gordon E, Rowe D. Prediction of electroencephalographic spectra from neurophysiology. Phys Rev E. 2001;63:021903.

  3. Robinson PA, Rennie CJ, Rowe DL, O’Connor SC. Estimation of multiscale neurophysiologic parameters by electroencephalographic means. Hum Brain Mapp. 2004;23:53–72.

  4. van Albada SJ, Kerr CC, Chiang AKI, Rennie CJ, Robinson PA. Neurophysiological changes with age probed by inverse modeling of EEG spectra. Clin Neurophysiol. 2010;121:21–38.

  5. Robinson PA, Rennie CJ, Rowe DL. Dynamics of large-scale brain activity in normal arousal states and epileptic seizures. Phys Rev E. 2002;65:041924.

  6. Breakspear M, Roberts JA, Terry JR, Rodrigues S, Mahant N, Robinson PA. A unifying explanation of primary generalized seizures through nonlinear brain modeling and bifurcation analysis. Cereb Cortex. 2006;16:1296–1313.

  7. Fung PK, Haber AL, Robinson PA. Neural field theory of large-scale synaptic plasticity in the cerebral cortex. J Theor Biol. 2013;318:44–57.

  8. Abeysuriya RG, Rennie CJ, Robinson PA. Physiologically based arousal state estimation and dynamics. J Neurosci Methods. 2015;253:55–69.

  9. Robinson PA, Postnova S, Abeysuriya RG, Kim JK, Roberts JA, McKenzie-Sell L, Karanjai A, Kerr CC, Fung F, Anderson R, Breakspear MJ, Drysdale PM, Fulcher BD, Phillips AKJ, Rennie CJ, Yin G. Chapter 5: a multiscale “working brain” model. In: Validating neurocomputational models of neurological and psychiatric disorders. Paris: Springer; 2015.

  10. O’Connor SC, Robinson PA. Spatially uniform and nonuniform analysis of electroencephalographic dynamics, with application to the topography of the alpha rhythm. Phys Rev E. 2004;70:110–9.


P28 Action-based grounding: Beyond encoding/decoding in neural code

Yoonsuck Choe1, Huei-Fang Yang2

1Department of Computer Science & Engineering, Texas A&M University, College Station, TX, 77845, USA; 2Research Center for Information Technology Innovation, Academia Sinica, Taipei, Taiwan

Correspondence: Yoonsuck Choe - choe@tamu.edu

BMC Neuroscience 2016, 17(Suppl 1):P28

How can we decode neural activation patterns (Fig. 18A)? This is a key question in neuroscience. We as scientists have the luxury of controlling the stimulus, based on which we can find the meaning of the spikes (Fig. 18C-right). However, as shown in Fig. 18A (and C-left), the problem seems intractable from the point of view of the brain itself, since neurons deeply embedded in the brain do not have direct access to the stimulus. In [1] and related work, we showed that the decoding problem seems intractable only because we left the motor system out of the picture. Figure 18D shows how motor action can help processes deeply embedded in the brain understand the meaning of the spikes, by generating motor behavior and observing the resulting change in the neural spikes. Here, a key principle is to generate motion that keeps the neural spike pattern invariant over time (Fig. 18E), which allows the following to coincide: (1) the property of the motion (diagonal movement) and (2) the encoded property of the input (45° orientation). Using reinforcement learning, we showed that the invariance criterion leads to a near-optimal state-action mapping for synthetic and natural image inputs (Fig. 18F, G), where the encoded property of the input is mapped to a congruent motor action. Furthermore, we showed that the receptive fields can be learned simultaneously with the state-action mapping (Fig. 18H). The main lesson we learned is that the encoding/decoding framework in neural code can lead to a dead end unless the problem is posed from the perspective of the brain itself; the motor system can play an important role in shaping the sensory/perceptual primitives (also see [2]).
Fig. 18

Concept (A–E) and simulation results (F–H). A Four activities without any clear meaning. B Activities in A are V1 responses to oriented lines. C Comparison of the brain’s view of spikes (left; apparently intractable) and the scientist’s view of spikes (right; decoding possible). D Visuomotor agent set-up. E Invariance principle. F Ideal state(s)–action(a) mapping R(s, a) (a), learned R(s, a) (b: synthetic input), learned R(s, a) (c: natural input). G Input (a), initial gaze trajectory (b), and learned gaze trajectory (c). H Learned state–action mapping (a: unordered; b: reordered rows), and learned receptive fields (c: unordered; d: reordered as in b) [1]

  1. Choe Y, Yang HF, Misra N. Motor system’s role in grounding, receptive field development, and shape recognition. In: 7th IEEE international conference on development and learning (ICDL 2008). IEEE; 2008. p. 67–72.

  2. Salinas E. How behavioral constraints may determine optimal sensory representations. PLoS Biol. 2006;4(12):e387.


P29 Neural computation in a dynamical system with multiple time scales

Yuanyuan Mi1,†, Xiaohan Lin1,†, Si Wu1

1State Key Lab of Cognitive Neuroscience & Learning, IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing 100875, China

Correspondence: Si Wu - wusi@bnu.edu.cn

†Y.M. and X.L. contributed equally to this work

BMC Neuroscience 2016, 17(Suppl 1):P29

The brain performs computation by updating its internal states in response to external inputs. Neurons, synapses, and circuits are the fundamental units for implementing brain functions. At the single-neuron level, a neuron integrates synaptic inputs and generates spikes if its membrane potential crosses the threshold. At the synapse level, neurons interact with each other to enhance or depress their responses. At the network level, the topology of the neuronal connection pattern shapes the overall population activity. These fundamental computational units at different levels encompass rich short-term dynamics, for example spike-frequency adaptation (SFA) in single neurons [1], and short-term facilitation (STF) and depression (STD) at neuronal synapses [2]. These dynamical features typically span a broad range of time scales and exhibit large diversity across brain regions. Although they play a vital part in the emergence of various brain functions, it remains unclear what the computational benefit is for the brain to have such variability in short-term dynamics.

In this study, we propose that one benefit of having multiple dynamical features with varied time scales is that the brain can fully exploit their advantages to implement computational tasks that would otherwise be contradictory. To demonstrate this idea, we consider STF, SFA and STD with increasing time constants in the dynamics of a continuous attractor neural network (CANN). A potential brain region with these parameter values is the sensory cortex, where the neuronal synapses are known to be STD-dominated. We show that the network is able to implement three seemingly contradictory computations: persistent activity, adaptation and anticipative tracking (see Fig. 19). Stated simply, the role of STF is to hold persistent activity in the absence of external drive, the role of SFA is to support anticipative tracking of a moving input, and the role of STD is to eventually suppress neural activity for a static or transient input. Notably, the time constants of SFA and STD can be swapped with each other, since SFA and STD have similar effects on the network dynamics. Nevertheless, we need to include both of them, since a single negative-feedback modulation is unable to achieve both anticipative tracking and plateau decay concurrently. The implementation of each individual computational task based on a single dynamical feature has been studied previously. Here, our contribution is to reveal that these tasks can be realized concurrently in a single neural circuit by combining dynamical features with coordinated time scales. We hope that this study will shed light on our understanding of how the brain orchestrates its rich dynamics at various levels to realize abundant cognitive functions.
Fig. 19

Networks implement different computations. A Persistent activity; the network can sustain activity after removal of the stimulus. B Adaptation; network activity attenuates to the background level under a continuous stimulus. C Anticipative tracking. D The network response leads a moving stimulus at a certain speed
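The opposing short-term synaptic effects invoked above can be sketched with the classic Tsodyks–Markram synapse model, in one common variant, driven by a regular spike train. Parameter values are illustrative only; this is not the authors' CANN implementation, which embeds such variables in the full network dynamics:

```python
import numpy as np

def synaptic_efficacy(tau_f, tau_d, U, rate=20.0, n_spikes=40):
    """Tsodyks–Markram-style short-term plasticity for a regular spike train.
    u facilitates (decays with tau_f), x depresses (recovers with tau_d).
    Returns the release u*x at each presynaptic spike."""
    isi = 1.0 / rate
    u, x = 0.0, 1.0
    out = []
    for _ in range(n_spikes):
        u += U * (1.0 - u)                            # facilitation jump at the spike
        out.append(u * x)                             # fraction of resources released
        x -= u * x                                    # depression: resources consumed
        u *= np.exp(-isi / tau_f)                     # decay between spikes
        x += (1.0 - x) * (1.0 - np.exp(-isi / tau_d)) # recovery between spikes
    return np.array(out)

std = synaptic_efficacy(tau_f=0.05, tau_d=0.5, U=0.5)   # depression-dominated synapse
stf = synaptic_efficacy(tau_f=1.0, tau_d=0.05, U=0.05)  # facilitation-dominated synapse
```

Under sustained drive the depression-dominated efficacy decays, the role attributed to STD in suppressing responses to static input, while the facilitation-dominated efficacy grows, the property that lets STF hold persistent activity.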

  1. Benda J, Herz AVM. A universal model for spike-frequency adaptation. Neural Comput. 2003;15(11):2523–64.

  2. Markram H, Wang Y, Tsodyks M. Differential signaling via the same axon of neocortical pyramidal neurons. Proc Natl Acad Sci USA. 1998;95(9):5323–28.


P30 Maximum entropy models for 3D layouts of orientation selectivity

Joscha Liedtke1,2, Manuel Schottdorf1,2, Fred Wolf1,2

1Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany; 2Bernstein Center for Computational Neuroscience, Göttingen, Germany

Correspondence: Joscha Liedtke - joscha@nld.ds.mpg.de, Manuel Schottdorf - manuel@nld.ds.mpg.de

BMC Neuroscience 2016, 17(Suppl 1):P30

The neocortex is composed of 6 different layers. In the primary visual cortex (V1), the functional architecture of basic stimulus selectivity is experimentally found to be similar across these layers [1]. The organization in functional columns justifies the use of cortical models describing only two-dimensional layers and disregarding functional organization in the third dimension.

Here we show theoretically that already small deviations from an exact columnar organization can lead to non-trivial three-dimensional functional structures (see Fig. 20). Previously, two-dimensional orientation domains were modeled by Gaussian random fields, the maximum entropy ensemble, allowing for an exact calculation of pinwheel densities [2]. Pinwheels are points surrounded by neurons preferring all possible orientations; in three dimensions these points generalize to pinwheel strings. We extend the previous two-dimensional model, characterized by its typical scale of orientation domains, to a three-dimensional model by keeping the typical scale in each layer and introducing a columnar correlation length. We dissect in detail the three-dimensional functional architecture for flat geometries and for curved gyrus-like geometries with different columnar correlation lengths. The model is analyzed analytically, complemented by numerical simulations, to obtain solutions for its intrinsic statistical parameters. We find that (i) pinwheel strings are generally curved, (ii) for large curvatures closed loops and reconnecting pinwheel strings appear, and (iii) for small columnar correlation lengths a novel transition to a rodent-like interspersed organization emerges.
Fig. 20

A Three-dimensional orientation domains with columnar correlation length Λ. B String singularities of the orientation domains in A. Typical scale in cats: Λ ≈ 1 mm

This theory extends the work of [2, 3] by adding a columnar dimension and supplements the work of [4] by a rigorous statistical treatment of the three-dimensional functional architecture of V1. Furthermore, the theory sheds light on the required precision of experimental techniques for probing the fine structure of the columnar organization in V1.
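A single two-dimensional layer of the maximum-entropy (Gaussian random field) ensemble can be generated with the standard band-pass construction below (arbitrary numerical parameters; the three-dimensional extension with a columnar correlation length is not shown):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 128
# Band-pass complex Gaussian white noise on an annulus of wavenumbers around
# the typical scale Lambda; angle(z)/2 is the preferred orientation, and the
# zeros of z are the pinwheel centres.
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
k = np.hypot(kx, ky)
k_typ, k_width = 0.1, 0.02                    # typical wavenumber ~ 1/Lambda (arbitrary units)
ring = np.exp(-((k - k_typ) ** 2) / (2 * k_width**2))
white = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
z = np.fft.ifft2(ring * np.fft.fft2(white))
orientation = np.angle(z) / 2.0               # preferred orientation in (-pi/2, pi/2]
```

Stacking such layers with a finite correlation length between them is what turns the point-like zeros of `z` into the curved pinwheel strings analyzed in the abstract.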

  1. Hubel DH, Wiesel TN. Receptive fields, binocular interaction and functional architecture in the cat’s visual cortex. J Physiol. 1962;160:106–54.

  2. Schnabel M, Kaschube M, Löwel S, Wolf F. Random waves in the brain: symmetries and defect generation in the visual cortex. Eur Phys J Spec Top. 2007;145(1):137–57.

  3. Wolf F, Geisel T. Spontaneous pinwheel annihilation during visual development. Nature. 1998;395:73–8.

  4. Tanaka S, Moon CH, Fukuda M, Kim SG. Three-dimensional visual feature representation in the primary visual cortex. Neural Netw. 2011;24(10):1022–35.


P31 A behavioral assay for probing computations underlying curiosity in rodents

Yoriko Yamamura1, Jeffery R. Wickens1

1Neurobiology Research Unit, Okinawa Institute of Science and Technology, Onna-son, Okinawa, 904-0412, Japan

Correspondence: Yoriko Yamamura - yoriko@oist.jp

BMC Neuroscience 2016, 17(Suppl 1):P31

Curiosity in humans appears to follow an inverted U-shaped function of unpredictability: stimuli that are neither too predictable nor too unpredictable evoke the greatest interest [1]. Rewarding moderate sensory unpredictability is an effective strategy for reinforcing explorations that improve our predictive models of the world [1, 2]. However, the computations and neural circuits underlying this unpredictability-dependence of curiosity remain largely unknown.

A rodent model of curiosity would be useful for elucidating its underlying neural circuitry, because more specific manipulation techniques are available than in humans. It has been shown that mice prefer unpredictable sounds to predictable ones when the sounds are paired with light [3]. However, the frequency of stimulus presentation was a potential confound in that study. Furthermore, a more systematic sampling of stimulus unpredictability is necessary to determine whether a rodent analogue of the U-shaped curve indeed exists.

We have devised an operant conditioning paradigm building on [3], using sensory stimuli as “reward” to quantify the rewardingness of various levels of sensory predictability for rats. Rats (Long Evans, male) are placed in a soundproofed chamber with two nosepoke holes. A combination of sound and light stimuli is presented whenever the rat pokes the active hole; no stimulus is associated with the inactive hole (counterbalanced across subjects).

We hypothesize that reward is also a U-shaped function of stimulus unpredictability in rats, and that this is due to a Bayesian precision weighting placing more importance on deviations from reliable predictions. This departs from previous learning-based accounts [2]. There are five experimental conditions, systematically varied in the unpredictability of the sound stimuli (as quantified by entropy H), and a control condition, in which a nosepoke in neither hole has any consequence (Fig. 21). Specifically, the sound stimuli are random sequences of two possible 125-ms sound snippets of equal value to the rat, with their frequencies of occurrence varied across conditions to vary H. Each sequence contains eight such snippets. Across all conditions, the light stimulus simply remains on while the sound is being played; it is added to enhance the rats’ responding to auditory stimuli [3]. We predict that the rats’ active nosepoke responses will be maximally increased at intermediate H (Fig. 21).
Fig. 21

Schematic of the sound stimuli used in all conditions, and the predicted reward for each
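For concreteness, the entropy H of a sequence of two-snippet sounds can be computed directly from the snippets' occurrence probabilities. The probabilities below are hypothetical examples spanning low to maximal unpredictability; the abstract does not give the five conditions' actual values:

```python
import math

def snippet_entropy(p):
    """Shannon entropy (bits) of one snippet drawn from two sounds
    with occurrence probabilities p and 1 - p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Hypothetical per-condition probabilities (not from the abstract):
for p in (1.0, 0.875, 0.75, 0.625, 0.5):
    H = 8 * snippet_entropy(p)  # a sequence of 8 i.i.d. snippets
    print(f"p={p:.3f}  H={H:.2f} bits")
```

With p = 1 the sequence is fully predictable (H = 0); p = 0.5 gives the maximal 8 bits per sequence.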

In preliminary experiments for conditions 0 and 2 (N = 3 each; three sessions), rats preferred the active hole to the inactive, replicating the earlier results in mice [3]. Moreover, as hypothesized, rats responded more to the active hole in condition 2 (mean = 15.0, SD = 5.32) than in condition 0 (mean = 11.3, SD = 4.05); t(22) = 1.91, p = 0.0345 (one-tailed t test). We note that in mice, most across-condition differences did not emerge until around session 7 [3].

The proposed assay quantifies the rewardingness of sensory unpredictability in rats. By systematically varying the entropy of the sound sequence, we can probe the computations behind the putative unpredictability-driven reward. The assay can furthermore be used to study the effect of pharmacological or genetic manipulations on unpredictability-driven reward, in order to validate mechanistic implementations of such computations.

  1.

    Kidd C, Hayden BY. The psychology and neuroscience of curiosity. Neuron. 2015;88:449–60.

  2.

    Oudeyer PY, Kaplan F. What is intrinsic motivation? A typology of computational approaches. Front Neurorobot. 2007;1:6.

  3.

    Olsen CM, Winder DG. Stimulus dynamics increase the self-administration of compound visual and auditory stimuli. Neurosci Lett. 2012;511:8–11.


P32 Using statistical sampling to balance error function contributions to optimization of conductance-based models

Timothy Rumbell1, Julia Ramsey2, Amy Reyes2, Danel Draguljić2, Patrick R. Hof3, Jennifer Luebke4, Christina M. Weaver2

1Computational Biology Center, IBM Research, Thomas J. Watson Research Center, Yorktown Heights, NY 10598; 2Department of Mathematics, Franklin and Marshall College, Lancaster, PA 17604; 3Fishberg Department of Neuroscience and Friedman Brain Institute, Icahn School of Medicine at Mount Sinai, New York, NY 10029; 4Department of Anatomy and Neurobiology, Boston University School of Medicine, Boston, MA 02118

Correspondence: Christina M. Weaver - christina.weaver@fandm.edu

BMC Neuroscience 2016, 17(Suppl 1):P32

Recently we developed a three-stage optimization method for fitting conductance-based models to data [1]. The method makes novel use of Latin hypercube sampling (LHS), a statistical space-filling design, to determine appropriate weights automatically for various error functions that quantify the difference between empirical target and model output. The method uses differential evolution to fit parameters active in the subthreshold and suprathreshold regimes (below and above action potential threshold). We have applied the method to spatially extended models of layer 3 pyramidal neurons from the prefrontal cortex of adult rhesus monkeys, in which in vitro action potential firing rates are significantly higher in aged versus young animals [2]. Here we validate our optimization method by testing its ability to recover parameters used to generate synthetic target data. Results from the validation fit the voltage traces of the synthetic target data almost exactly (Fig. 22A–C), whether fitting a model with 4 ion channels (10 parameters) or 8 ion channels (23 parameters). The optimized parameter values are either identical to, or nearby, the original target values (Fig. 22D–F), except for a few parameters that were not well constrained by the simulated protocols. Further, our LHS-based scheme for weighting error functions is significantly more efficient at recovering target parameter values than weighting all error functions equally or choosing weights manually. We are now using the method to fit models to data from several young, middle-aged, and aged monkeys. Adding new conductances to the model, and allowing altered channel kinetics in the axon initial segment versus the soma, improves the quality of the model fits to data. We use published results from empirical studies of layer 3 neocortical pyramidal neurons to determine whether the optimized parameter sets are biologically plausible.
Fig. 22

A–C Membrane potential of the synthetic target (black), and of randomly chosen members of the final population (colors, overlaid almost exactly), from three validation studies. Optimized 10 and 23 parameters in A–C respectively. D–F Parameter values used to generate synthetic data (black lines), and mean ± standard deviation of values recovered in the searches (colored circles), normalized to the range used in the optimization
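The LHS-based weighting idea can be sketched as follows: sample the parameter space with a Latin hypercube, evaluate each error function across the sample, and weight each by the inverse of its spread so that no single error term dominates the combined objective. This is our simplified reading of the scheme, not the authors' exact implementation; all names are ours:

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng):
    """Space-filling LHS: exactly one sample per stratum in each
    dimension, with strata independently permuted across dimensions."""
    d = len(bounds)
    u = (rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T
         + rng.random((n_samples, d))) / n_samples
    lo, hi = np.array(bounds).T
    return lo + u * (hi - lo)

def lhs_weights(errors):
    """Weight each error function by the inverse of its standard
    deviation over the LHS sample (our simplification of the
    automatic balancing described above)."""
    spread = errors.std(axis=0)
    return 1.0 / np.where(spread > 0, spread, 1.0)

rng = np.random.default_rng(1)
params = latin_hypercube(50, [(0.0, 1.0), (10.0, 100.0)], rng)
# two toy error functions with very different numeric scales
errs = np.stack([params[:, 0] ** 2, (params[:, 1] - 50) ** 2], axis=1)
w = lhs_weights(errs)
combined = errs @ w  # balanced objective for differential evolution
```

The combined objective would then be handed to the differential evolution stage described in the abstract.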

  1.

    Rumbell T, Draguljić D, Luebke J, Hof P, Weaver CM. Prediction of ion channel parameter differences between groups of young and aged pyramidal neurons using multi-stage compartmental model optimization. BMC Neurosci. 2015;16(Suppl. 1):P282.

  2.

    Chang YM, Rosene DL, Killiany RJ, Mangiamele LA, Luebke JI. Increased action potential firing rates of layer 2/3 pyramidal cells in the prefrontal cortex are significantly related to cognitive performance in aged monkeys. Cereb Cortex. 2005;15(4):409–18.


P33 Exploration and implementation of a self-growing and self-organizing neuron network building algorithm

Hu He1, Xu Yang2, Hailin Ma1, Zhiheng Xu1, Yuzhe Wang1

1Institute of Microelectronics, Tsinghua University, Beijing, 100081, China; 2School of Software, Beijing Institute of Technology, Beijing, 100083, China

Correspondence: Xu Yang - yangxu@tsinghua.edu.cn

BMC Neuroscience 2016, 17(Suppl 1):P33

In this work, we present an algorithm that builds self-growing, self-organizing neuron networks according to external signals, in an attempt to build neuron networks with high intelligence. The algorithm takes a bionic approach to building complex neuron networks. We begin with very simple external signals to provoke the neurons.

In order to propagate the signals, neurons seek to connect to each other, thus building neuron networks. The generated networks are verified and optimized, and then treated as seeds to build more complex networks. We repeat this process with increasingly complex external signals, building increasingly complex neuron networks. A parallel processing method is presented to enhance the computational efficiency of the algorithm and to help build large-scale neuron networks in reasonable time. The results show that neuron networks built by our algorithm can self-grow and self-organize as the complexity of the input external signals increases, and that with the screening mechanism, a neuron network that can identify different input external signals is built successfully (Fig. 23).
Fig. 23

Neuron network generated by our algorithm

Acknowledgements: This work is supported by the Core Electronic Devices, High-End General Purpose Processor, and Fundamental System Software of China under Grant No. 2012ZX01034-001-002, the National Natural Science Foundation of China under Grant No. 61502032, Tsinghua National Laboratory for Information Science and Technology (TNList), and Samsung Tsinghua Joint Laboratory.

P34 Disrupted resting state brain network in obese subjects: a data-driven graph theory analysis

Kwangyeol Baek1,2, Laurel S. Morris1, Prantik Kundu3, Valerie Voon1

1Department of Psychiatry, University of Cambridge, Cambridge, CB2 0QQ, United Kingdom; 2Department of Biomedical Engineering, Ulsan National Institute of Science and Technology, Ulsan, South Korea; 3Departments of Radiology and Psychiatry, Icahn School of Medicine at Mount Sinai, New York City, 10029, USA

Correspondence: Kwangyeol Baek - kb567@cam.ac.uk

BMC Neuroscience 2016, 17(Suppl 1):P34

The efficient organization and communication of brain networks underlie cognitive processing, and disruptions of resting-state brain networks have been implicated in various neuropsychiatric conditions, including addiction disorders. However, few studies have focused on whole-brain networks in the maladaptive consumption of natural rewards in obesity and binge-eating disorder (BED). Here we use a novel multi-echo resting state functional MRI (rsfMRI) technique along with a data-driven graph theory approach to assess global and regional network characteristics in obesity and BED.

We collected multi-echo rsfMRI scans from 40 obese subjects (including 20 BED patients) and 40 healthy controls, and used multi-echo independent component analysis (ME-ICA) to remove non-BOLD noise. We estimated the normalized correlation across mean rsfMRI signals in 90 brain regions of the Automated Anatomical Labeling atlas, and computed global and regional network metrics in the binarized connectivity matrix at density thresholds of 5–25 %. In addition, we confirmed the observed alterations in network metrics using the Harvard-Oxford atlas, parcellated into 470 even-sized regions.
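The network construction and the global efficiency metric described above can be sketched as follows; this is a simplified stand-in for the study's pipeline, with function names and the greedy thresholding choice ours:

```python
import numpy as np
from collections import deque

def binarize(corr, density):
    """Keep the top `density` fraction of strongest off-diagonal
    correlations as unweighted, undirected edges."""
    n = corr.shape[0]
    iu = np.triu_indices(n, k=1)
    vals = np.abs(corr[iu])
    k = int(round(density * vals.size))
    thresh = np.sort(vals)[::-1][k - 1]
    adj = (np.abs(corr) >= thresh).astype(int)
    np.fill_diagonal(adj, 0)
    return adj

def global_efficiency(adj):
    """Mean inverse shortest-path length over all node pairs,
    using breadth-first search on the binarized graph."""
    n = adj.shape[0]
    total = 0.0
    for s in range(n):
        dist = np.full(n, -1)
        dist[s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in np.flatnonzero(adj[u]):
                if dist[v] < 0:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(1.0 / d for d in dist if d > 0)
    return total / (n * (n - 1))
```

Local efficiency, modularity, nodal degree, and betweenness would be computed analogously on the same binarized matrix, typically with a dedicated library.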

Obese subjects exhibited significantly reduced global and local efficiency as well as decreased modularity in the whole-brain network compared to healthy controls (Fig. 24). Both BED patients and obese subjects without BED exhibited the same alterations of network metrics compared with healthy controls, but the two obese groups did not differ from each other. In regional network metrics, the bilateral putamen, thalamus and right pallidum exhibited profoundly decreased nodal degree and efficiency in obese subjects, and the left superior frontal gyrus showed decreased nodal betweenness in obese subjects (all p < 0.05, Bonferroni correction). Network-based statistics revealed a cortico-striatal/cortico-thalamic network with significantly decreased functional connectivity, consisting of bilateral putamen, pallidum, thalamus, primary motor cortex, primary somatosensory cortex, supplementary motor area, paracentral lobule, superior parietal lobule, superior temporal cortex and left amygdala. Interestingly, when we examined the same network properties using only single-echo rsfMRI data without ME-ICA, we found no significant differences between groups.
Fig. 24

A Disrupted resting state brain network in obese subjects. B Global network properties and network-based statistics

Therefore, using data-driven graph theory analysis of multi-echo rsfMRI data, we highlight more subtle impairments in cortico-striatal/cortico-thalamic networks in obesity that have previously been associated with substance addictions. We emphasize global impairments in network efficiency in obesity with disrupted local network organization closer to random networks. Mathematically capturing brain network alterations in obesity provides novel insights into potential biomarkers and therapeutic targets.

P35 Dynamics of cooperative excitatory and inhibitory plasticity

Everton J. Agnes1, Tim P. Vogels1

1Centre for Neural Circuits and Behaviour, University of Oxford, Oxford, OX1 3SR, UK

Correspondence: Everton J. Agnes - everton.agnes@cncb.ox.ac.uk

BMC Neuroscience 2016, 17(Suppl 1):P35

Neurons receive balanced excitatory and inhibitory inputs, a phenomenon thought to be essential for a variety of computations [1–3]. Inhibitory synaptic plasticity is an obvious candidate for imposing this balanced input regime [2, 4], leaving excitatory synapses available to learn patterns and memories. Recent experimental work seems to agree with that notion of collaborative excitatory and inhibitory plasticity [4], but recent models do not take direct interactions into consideration. Instead, learning rules are usually tuned to indirectly but constructively interact via the firing-rates they elicit [3, 5]. Without proper parameter tuning, this can be problematic because excitatory and inhibitory synaptic plasticity models may have different homeostatic set points, making synaptic weights fluctuate wildly (Fig. 25A, B; green lines). Here we present a hybrid model of inhibitory synaptic plasticity that combines the simplicity of spike-based models with the addition of an excitatory/inhibitory input dependence. It captures recent experimental findings showing that changes at inhibitory synapses are strongly correlated with the balance between excitation and inhibition, and that inhibitory synapses do not change when excitatory input is blocked [4]. Essentially, our model is a symmetric spike-timing-dependent plasticity (STDP) rule in which the learning rate is controlled by excitatory and inhibitory activities—a spike-timing- and current-dependent plasticity (STCDP) model. Balance is maintained, but the learning rule does not impose fixed-point attractor dynamics on post-synaptic neurons, because there is no change in inhibitory synapses once the total input is balanced. Inhibitory synapses change depending on excitatory synapses, which means that plasticity depends on at least three synaptic participants (trisynaptic) instead of only two (bisynaptic).
We show that when combined with an excitatory synaptic plasticity model, both excitatory and inhibitory weights converge to stable values, as the firing-rate reaches the fixed-point imposed by the excitatory learning rule (Fig. 25B; yellow lines). More importantly, the learning rule allows efficient and stable learning of new weights when the balance is disrupted, opening the door for effective and stable learning of arbitrary synaptic patterns.
Fig. 25

A Schematics representing the neuronal network. A group of 2000 excitatory neurons and 500 inhibitory neurons are recurrently connected with sparse connectivity and the excitatory neurons receive random input from an external pool of neurons. B Excitatory neurons’ mean firing-rate (top), mean excitatory weight onto excitatory neurons (middle) and mean inhibitory weight onto excitatory neurons (connections marked as plastic in A). Simulation of the neuronal network with a spike-based inhibitory learning rule is represented by green lines (STDP) while simulation with our novel spike-timing- and current-dependent learning rule is shown in yellow (STCDP). The dashed lines represent the fixed points imposed by the excitatory (high) and inhibitory (low) learning rules. The low fixed point only exists for the inhibitory STDP model (simulation represented by the green lines)
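A minimal sketch of the current-dependent gating described above, assuming a symmetric exponential STDP kernel whose learning rate is scaled by the momentary excitatory/inhibitory imbalance; the exact gating form and all parameter values are our assumptions, not the authors' model:

```python
import numpy as np

def stcdp_update(w_inh, pre_t, post_t, I_exc, I_inh,
                 eta=1e-3, tau=20.0):
    """Symmetric inhibitory STDP with a current-dependent learning
    rate (our reading of the STCDP idea): a balanced input
    (I_exc == I_inh) or blocked excitation (I_exc == 0) produces
    no change, matching the experimental constraints cited above.

    pre_t, post_t: spike times (ms); I_exc, I_inh: mean input currents.
    """
    imbalance = I_exc - I_inh  # detailed balance drives this to 0
    dw = 0.0
    for tp in pre_t:
        for tq in post_t:
            dw += np.exp(-abs(tq - tp) / tau)  # symmetric kernel
    return w_inh + eta * imbalance * dw * (I_exc > 0)
```

With excess excitation the inhibitory weight grows, restoring balance; once balanced, the weight is frozen rather than pulled toward a firing-rate fixed point.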

Acknowledgements: This work was partially funded by the Brazilian agency CNPq (Grant Agreement Number 235144/2014-2) and the Sir Henry Dale Fellowship (Grant Agreement WT100000).

  1.

    Denève S, Machens CK. Efficient codes and balanced networks. Nat Neurosci. 2016;19:375–85.

  2.

    Vogels TP, Froemke RC, Doyon N, Gilson M, Haas JS, Liu R, Maffei A, Miller P, Wierenga CJ, Woodin MA, et al. Inhibitory synaptic plasticity: spike timing-dependence and putative network function. Front Neural Circuits. 2013;7:119.

  3.

    Zenke F, Agnes EJ, Gerstner W. Diverse synaptic plasticity mechanisms orchestrated to form and retrieve memories in spiking neural networks. Nat Commun. 2015;6:6922.

  4.

    D’amour JA, Froemke RC. Inhibitory and excitatory spike-timing-dependent plasticity in the auditory cortex. Neuron. 2015;86:514–28.

  5.

    Sprekeler H, Clopath C, Vogels TP. Interactions of excitatory and inhibitory synaptic plasticity. Front Comp.


P36 Frequency-dependent oscillatory signal gating in feed-forward networks of integrate-and-fire neurons

William F. Podlaski1, Tim P Vogels1

1Centre for Neural Circuits and Behaviour, University of Oxford, Oxford, UK

Correspondence: William F. Podlaski - william.podlaski@cncb.ox.ac.uk

BMC Neuroscience 2016, 17(Suppl 1):P36

Neural oscillations—the periodic synchronisation of neuronal spiking—are a common feature of brain activity, with several hypothesised functions relating to information flow, attention and brain state [1]. Previous experimental work has shown that oscillatory activity correlates with moments of heightened attention, and that communication between different brain areas is often marked by an increase in oscillatory coherence between the regions [2]. Theoretical and modelling work has helped to explore the mechanisms behind neuronal oscillations, and some of their effects on neural coding and signal propagation [3]. Recently, theoretical studies have explored how resonance might affect signal processing [4, 5] and how information can be propagated along different pathways according to oscillatory phase and frequency [6].

We expand this work here by studying how resonance at the single-neuron level might be used for frequency-dependent gating of information flow in neuronal networks. We show that, in feed-forward spiking network simulations, background oscillations can synchronise or desynchronise the spikes of a propagated signal, changing its content and emphasis from rate code to synfire code or vice versa. Such a mechanism can modulate information flow without rewiring the signal pathways themselves, allowing selection of specific downstream readout targets. Building on this idea, we can create entire pathways that can be selectively (in-)activated by different background oscillatory frequencies without changing the connectivity of the network. We hypothesise that neuronal resonance, combined with resonance in synapses and network motifs, can allow for precise oscillatory gating of information in cortex. Building on previous studies of resonance and oscillatory signal propagation [4–6], we propose a plausible mechanism for how fast and precise frequency-dependent gating might be achieved in the brain.
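As an illustration of the kind of simulation described above, here is a minimal leaky integrate-and-fire neuron driven by an oscillatory background plus a brief input packet; all parameter values are illustrative and not taken from the study:

```python
import numpy as np

def lif_with_oscillation(freq, pulse_t=0.25, T=0.5, dt=1e-4,
                         tau=0.02, v_th=1.0, osc_amp=0.8, pulse_amp=25.0):
    """Leaky integrate-and-fire neuron driven by a subthreshold
    sinusoidal background of frequency `freq` (Hz) plus a 5-ms input
    packet at `pulse_t`; returns spike times (s). Sweeping `freq`
    probes how the background modulates packet transmission."""
    t = np.arange(0.0, T, dt)
    I = osc_amp * np.sin(2 * np.pi * freq * t)
    I[(t >= pulse_t) & (t < pulse_t + 0.005)] += pulse_amp  # packet
    v, spikes = 0.0, []
    for ti, Ii in zip(t, I):
        v += dt / tau * (-v + Ii)  # Euler step of the LIF dynamics
        if v >= v_th:
            spikes.append(ti)
            v = 0.0  # reset on spike
    return np.array(spikes)
```

In a feed-forward chain, the phase of the background at packet arrival determines how synchronously the next layer fires, which is the gating handle the abstract exploits.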

Acknowledgements: Research was supported by a Sir Henry Dale Royal Society and Wellcome Trust Research Fellowship (WT100000).

  1.

    Buzsáki G. Rhythms of the brain. Oxford: Oxford University Press; 2011.

  2.

    Engel AK, Fries P, Singer W. Dynamic predictions: oscillations and synchrony in top-down processing. Nat Rev Neurosci. 2001;2(10):704–16.

  3.

    Wang XJ. Neurophysiological and computational principles of cortical rhythms in cognition. Physiol Rev. 2010;90:1195–1268.

  4.

    Richardson MJE, Brunel N, Hakim V. From subthreshold to firing-rate resonance. J Neurophysiol. 2003;89:2538–54.

  5.

    Hahn G, Bujan AF, Frégnac Y, Aertsen A, Kumar A. Communication through resonance in spiking neuronal networks. PLoS Comput Biol. 2014;10(8):e1003811.

  6.

    Akam T, Kullmann DM. Oscillatory multiplexing of population codes for selective communication in the mammalian brain. Nat Rev Neurosci. 2014;15(2):111–22.


P37 Phenomenological neural model for adaptation of neurons in area IT

Martin Giese1, Pradeep Kuravi2, Rufin Vogels2

1Section Computational Sensomotorics, CIN & HIH, Department of Cognitive Neurology, University Clinic Tübingen, Germany; 2Lab. Neuro en Psychofysiologie, Dept. Neuroscience, KU Leuven, Belgium

Correspondence: Martin Giese - martin.giese@uni-tuebingen.de

BMC Neuroscience 2016, 17(Suppl 1):P37

Under repeated stimulation, neurons in higher-level visual cortex show adaptation effects. Such effects likely influence repetition suppression paradigms in fMRI studies and the formation of high-level after-effects, e.g. for faces [1]. A variety of theoretical explanations have been discussed, which are difficult to distinguish without detailed electrophysiological data [2]. Meanwhile, detailed physiological experiments on the adaptation of shape-selective neurons in inferotemporal cortex (area IT) have provided constraints that help to narrow down possible neural processes. We propose a neurodynamical model that reproduces a number of these experimental observations with biophysically plausible neural circuits. Our model uses the mean-field limit and consists of a neural field of shape-selective dynamic linear-threshold neurons that are augmented by several adaptation processes: (i) spike-rate adaptation; (ii) an input fatigue adaptation process, modeling adaptation in earlier hierarchy levels and of afferent synapses; (iii) a firing-rate fatigue adaptation process that models adaptation dependent on the output firing rates of the neurons. The model with a common parameter set is compared to results from several studies of adaptation in area IT. The model reproduces the following experimentally observed effects: (i) the shape of the typical PSTHs of IT neurons; (ii) the temporal decay of responses over many repetitions of the same stimulus [3] (Fig. 26A); (iii) the dependence of adaptation on effective and ineffective adaptor stimuli, which stimulate the neuron strongly or only moderately [4] (Fig. 26B); (iv) the dependence of the strength of the adaptation effect on the duration of the adaptor (Fig. 26C). A mean-field model with several additional adaptive processes can account for the observed experimental effects, where all introduced processes were necessary to account for the results.
In particular, the observed dependence on the effectiveness of the adaptor cannot be reproduced without an appropriate mixture of an input fatigue and a firing-rate fatigue mechanism. This suggests that adaptation in IT neurons is significantly influenced by several biophysical processes with different spatial and temporal scales.
Fig. 26

Simulation results. A Decay of neural activity for multiple repetitions of the same stimulus. B Experiment adapting with effective and ineffective stimuli. C Dependence of the PSTH on adaptor duration and unadapted response (black)
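The adaptation processes named above can be caricatured in a single rate unit. The sketch below combines only two of them, spike-rate adaptation and input fatigue (omitting firing-rate fatigue); the equations and all parameter values are our illustrative assumptions, not the authors' model:

```python
import numpy as np

def adapting_unit(stim, dt=1e-3, tau=0.02, tau_a=0.5, tau_f=2.0,
                  g_a=1.0, g_f=0.5):
    """Linear-threshold rate unit with spike-rate adaptation a
    (driven by the unit's own rate r) and input fatigue f
    (multiplicatively depressing the afferent drive)."""
    r = a = 0.0
    f = 1.0
    rates = []
    for s in stim:
        drive = f * s - g_a * a
        r += dt / tau * (-r + max(drive, 0.0))   # rate dynamics
        a += dt / tau_a * (-a + r)               # spike-rate adaptation
        f += dt / tau_f * ((1.0 - f) - g_f * f * s)  # input fatigue
        rates.append(r)
    return np.array(rates)

# five repeated 300-ms presentations of the same stimulus,
# separated by 200-ms gaps: the response decays across repetitions
stim = np.tile(np.r_[np.ones(300), np.zeros(200)], 5)
r = adapting_unit(stim)
```

The decaying per-presentation peak mimics the repetition effect of Fig. 26A; the adaptor-effectiveness result additionally requires the firing-rate fatigue term omitted here.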

Acknowledgements: Supported by EC Fp7-PEOPLE-2011-ITN PITN-GA-011-290011 (ABC), FP7-ICT-2013-FET-F/604102 (HBP), FP7-ICT-2013-10/611909 (Koroibot), BMBF, FKZ: 01GQ1002A, DFG GI 305/4-1 + KA 1258/15-1.

  1.

    Leopold DA, O’Toole AJ, Vetter T, Blanz V. Prototype-referenced shape encoding revealed by high-level aftereffects. Nat Neurosci. 2001;4(1):89–94.

  2.

    Grill-Spector K, Henson R, Martin A. Repetition and the brain: neural models of stimulus-specific effects. Trends Cogn Sci. 2006;10(1):14–23.

  3.

    Sawamura H, Orban GA, Vogels R. Selectivity of neuronal adaptation does not match response selectivity: a single-cell study of the FMRI adaptation paradigm. Neuron. 2006;49(2):307–18.

  4.

    De Baene W, Vogels R. Effects of adaptation on the stimulus selectivity of macaque inferior temporal spiking activity and local field potentials. Cereb Cortex. 2010;20(9):2145–65.


P38 ICGenealogy: towards a common topology of neuronal ion channel function and genealogy in model and experiment

Alexander Seeholzer1,†, William Podlaski2,†, Rajnish Ranjan3, Tim Vogels2

1Laboratory of Computational Neuroscience, EPF Lausanne, Switzerland; 2Centre for Neural Circuits and Behaviour, University of Oxford, UK; 3The Blue Brain Project, EPF Lausanne, Switzerland

Correspondence: Alexander Seeholzer - alex.seeholzer@epfl.ch

†These authors contributed equally to this work.

BMC Neuroscience 2016, 17(Suppl 1):P38

Ion channels are fundamental constituents determining the function of single neurons and neuronal circuits. To understand their complex interactions, the field of computational modeling has proven essential: since its emergence, thousands of ion channel models have been created and published as part of detailed neuronal simulations [1]. Faced with this large variety of models, it is difficult to determine how particular models relate to each other, to the interpretability of simulations and, importantly, to experimental data.

Here, we present a framework within which we analyzed a pilot set of 2378 voltage- or calcium-dependent published ion channel models for the NEURON simulator [1]. We extracted annotated metadata from all associated publications, helping identify their use in simulations (e.g. the animal type, neuron type or area of compartmental models) and the provenance of ion channel models as they were derived from other published work. This categorical and relational metadata is combined with quantitative evaluations of all channel models: individual channels are characterized by their responses to voltage clamp protocols. With subsequent cluster analysis, we extract topologies of ion channel similarity and genealogy, identifying redundancy and groups of common channel kinetics.

The result of this large-scale assay of published work is freely accessible through interactive visualizations (see Fig. 27A) on the Ion Channel Genealogy (ICG) web-resource [2], providing a tool for model discovery and comparison. Bridging the gap between model and experiment, our resource allows classification of new channel models and experimental current traces within the topology of all models currently in the database (see Fig. 27B, C). The ICG framework thus allows quantitative comparison of ion channel kinetics, experimental and model alike, with the aim of facilitating field-wide standardization of experimentally constrained modeling.
Fig. 27

A Visualizations available on the web-resource [2] for model browsing. B Schematic of upload and evaluation. Both experimental current traces and mod files can be uploaded to our servers, where they are scored and compared to all models currently in the database. C Exemplary result of automated comparison: Current traces (recorded from “Ramp” and “Activation” voltage clamp protocols) of the uploaded model (red) together with mean (1st, 2nd, 3rd, 4th) and individual (gray) traces of the four most similar clusters of channel models in the database
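The trace-based scoring and clustering can be sketched as follows, using per-trace normalization and a greedy single-linkage grouping as a stand-in for the resource's actual cluster analysis; function names and the distance threshold are ours:

```python
import numpy as np

def score_traces(traces):
    """Mean-center and unit-normalize each voltage-clamp response
    trace, so that distances compare kinetics rather than absolute
    conductance scale."""
    T = np.asarray(traces, dtype=float)
    T -= T.mean(axis=1, keepdims=True)
    norms = np.linalg.norm(T, axis=1, keepdims=True)
    return T / np.where(norms > 0, norms, 1.0)

def cluster(traces, thresh=0.5):
    """Greedy single-linkage grouping on pairwise Euclidean distance
    between normalized traces: models whose protocol responses are
    close end up with the same label."""
    X = score_traces(traces)
    n = len(X)
    labels = list(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(X[i] - X[j]) < thresh:
                old, new = labels[j], labels[i]
                labels = [new if l == old else l for l in labels]
    return labels
```

An uploaded trace would be scored the same way and assigned to the nearest existing cluster, as in Fig. 27B, C.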

Acknowledgements: Research was supported by a Sir Henry Dale Royal Society & Wellcome Trust Research Fellowship (WT100000). A.S. was supported by the Swiss National Science Foundation (200020_147200). R.R. was supported by the EPFL Blue Brain Project Fund and the ETH Board funding to the Blue Brain Project.

  1.

    Hines ML, Morse T, Migliore M, Carnevale NT, Shepherd GM. ModelDB: a database to support computational neuroscience. J Comput Neurosci. 2004;17:7–11.

  2.

    ICGenealogy Project Website. http://icg.neurotheory.ox.ac.uk.


P39 Temporal input discrimination from the interaction between dynamic synapses and neural subthreshold oscillations

Joaquin J. Torres1, Fabiano Baroni2, Roberto Latorre3, Pablo Varona3

1Departamento de Electromagnetismo y Física de la Materia, and Institute “Carlos I” for Theoretical and Computational Physics, University of Granada, Granada, Spain; 2School of Psychological Sciences, Faculty of Biomedical and Psychological Sciences, Monash University, Australia; 3Grupo de Neurocomputación Biológica, Dpto. de Ingeniería Informática, Escuela Politécnica Superior, Universidad Autónoma de Madrid, Spain

Correspondence: Pablo Varona - pablo.varona@uam.es

BMC Neuroscience 2016, 17(Suppl 1):P39

Neuronal subthreshold oscillations underlie key mechanisms of information discrimination in single cells while dynamic synapses provide channel-specific input modulation. Previous studies have shown that intrinsic neuronal properties, in particular subthreshold oscillations, constitute a biophysical mechanism for the emergence of non-trivial single-cell input/output preferences (e.g., preference towards decelerating vs. accelerating input trains of the same average rate) [1, 2]. It has also been shown that short-term synaptic dynamics, in the form of short-term depression and/or short-term facilitation, can provide a channel-specific mechanism for the enhancement of the post-synaptic effects of temporally specific input sequences [3, 4]. While intrinsic oscillations and synaptic dynamics are typically studied independently, it is reasonable to hypothesize that their interplay can lead to more selective and complex temporal input processing.

Here, we extend and refine our previous computational study on the interaction between subthreshold oscillations and synaptic depression [5]. In particular, we investigated whether, and under which conditions, the combination of intrinsic subthreshold oscillations and short-term synaptic dynamics can act synergistically to enable the emergence of robust and channel-specific selectivity in neuronal input–output transformations. We calculated analytically the voltage trajectories and spike output of generalized integrate-and-fire (GIF) model neurons in response to temporally distinct trains of input EPSPs. In particular, we considered triplets of input EPSPs in a range that covers intrinsic and synaptic time scales, and analyzed the model output as intrinsic and synaptic parameters were varied.
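The channel-specific synaptic dynamics discussed above can be illustrated with a Tsodyks–Markram-style short-term depression model applied to EPSP triplets: temporally distinct triplets of the same average rate produce different amplitude sequences. Parameter values are illustrative:

```python
import numpy as np

def tm_depression(spike_times, U=0.5, tau_rec=0.3):
    """Tsodyks-Markram-style short-term depression: each presynaptic
    spike uses a fraction U of the available resource x, which
    recovers toward 1 with time constant tau_rec (s); returns the
    relative EPSP amplitude of each spike."""
    x, last, amps = 1.0, None, []
    for t in spike_times:
        if last is not None:
            x = 1.0 - (1.0 - x) * np.exp(-(t - last) / tau_rec)
        amps.append(U * x)
        x -= U * x  # resource consumed by this spike
        last = t
    return amps

# decelerating vs accelerating triplets with the same mean rate
dec = tm_depression([0.0, 0.02, 0.08])
acc = tm_depression([0.0, 0.06, 0.08])
```

Feeding such amplitude sequences into a GIF neuron with subthreshold oscillations is what produces the combined intrinsic-plus-synaptic selectivity analyzed in the abstract.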

Our results show that intrinsic and synaptic dynamics interact in a complex manner for the emergence of specific input–output transformations. In particular, precise non-trivial preferences emerge from synergistic intrinsic and synaptic preferences, while broader selectivity is observed for mismatched intrinsic and synaptic dynamics. We discuss the conditions for robustness of the observed input/output relationships.

We conclude that the interaction of intrinsic and synaptic properties can enable the biophysical implementation of complex and channel-specific mechanisms for the emergence of selective neuronal responses. We further interpret our results in the light of experimental evidence describing distinct short-term synaptic dynamics in different afferents converging onto the same neuron, as in the case of parallel and climbing fiber inputs to cerebellar Purkinje cells, and advance specific hypotheses that link heterogeneous synaptic dynamics of distinct pathways onto the same post-synaptic target to their distinct computational function. We also discuss the impact of single-channel/single-neuron temporal input discrimination in the context of information processing based on heterogeneous elements.

Acknowledgements: We acknowledge support from MINECO FIS2013-43201-P, DPI2015-65833-P, TIN-2012-30883 and ONRG Grant N62909-14-1-N279.

1. Baroni F, Varona P. Subthreshold oscillations and neuronal input–output relationships. Neurocomputing. 2007;70:1611–14.
2. Baroni F, Torres JJ, Varona P. History-dependent excitability as a single-cell substrate of transient memory for information discrimination. PLoS One. 2010;5:e15023.
3. O’Donnell C, Nolan MF. Tuning of synaptic responses: an organizing principle for optimization of neural circuits. Trends Neurosci. 2011;34:51–60.
4. Torres JJ, Kappen HJ. Emerging phenomena in neural networks with dynamic synapses and their computational implications. Front Comp Neurosci. 2013;7.
5. Latorre R, Torres JJ, Varona P. Interplay between subthreshold oscillations and depressing synapses in single neurons. PLoS One. 2016;11:e0145830.


P40 Different roles for transient and sustained activity during active visual processing

Bart Gips1,†, Eric Lowet1,2,†, Mark J Roberts1,2, Peter de Weerd2, Ole Jensen1, Jan van der Eerden1

1Radboud University, Donders Institute for Brain, Cognition and Behaviour, 6525 EN Nijmegen, The Netherlands; 2Faculty of Psychology and Neuroscience, Maastricht University, 6200 MD Maastricht, the Netherlands

Correspondence: Bart Gips - bart.gips@donders.ru.nl

† Authors have made equal contribution

BMC Neuroscience 2016, 17(Suppl 1):P40

Neural activity in awake primate early visual cortex exhibits transients with intervals of 250–300 ms. Experimental work by us and others has shown that these transients are related to microsaccadic eye movements [1, 2]. These short transients are followed by periods of steady activity that last until the next microsaccade (Fig. 28A).
Fig. 28

A Time–frequency representation of local field potential (LFP) locked to a microsaccade (MS) recorded in primate V1. B Time–frequency representation of simulated LFP. C Schematic representation of the model network illustrating input (injection current), recurrent connection pattern and output (spike trains). D The input to the neurons is best reflected in the simulated spike trains (output) during phase I, quantified by mutual information (MI). E Recurrent connection pattern is best reflected in the output during phase II

We found that computational models of excitatory-inhibitory spiking networks organized in a structure of columns and hypercolumns are able to represent relevant stimulus information when subjected to 3–4 Hz saccade-like transients. The simulated networks expressed evoked responses with power in the alpha–beta band (~8–25 Hz) as well as gamma rhythmic activity (~25–80 Hz), similar to in vivo local field recordings in monkey V1 (Fig. 28A, B).

We show that in phase I the model produces large-scale spatial synchrony and pronounced alpha–beta power. In phase II the model exhibits narrow-band gamma oscillations with spatially local synchrony. The activity in the model network (rate and timing coding) in phase I mainly reflects feedforward input (Fig. 28C, D), whereas the network activity in phase II is dominated by recurrent connections (Fig. 28C, E).
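Figure 28D quantifies how well the input is reflected in the output using mutual information (MI). A generic plug-in histogram estimator of MI between two signals, our own illustrative implementation and not necessarily the estimator used by the authors, looks like this:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in MI estimate (bits) between two signals via a joint histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)          # marginal of x
    py = pxy.sum(axis=0, keepdims=True)          # marginal of y
    nz = pxy > 0                                 # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
mi_same = mutual_information(x, x)                        # high: y is determined by x
mi_indep = mutual_information(x, rng.normal(size=5000))   # near zero
```

Applied to input currents and output spike counts per time bin, this is the kind of statistic behind the MI comparison of phases I and II.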

The model network activity closely matches that found in experiments. The simulation results suggest that the transient phase (phase I) allows for resetting of the network and rapid feedforward processing of novel information, whereas detailed processing and contextualization by recurrent activity take place in the period of steady gamma activity (phase II). We therefore arrived at hypotheses on the functional interpretation of phases I and II that could be tested experimentally. First, because network activity is reset by a microsaccade, phase I is the optimal time window to switch information flow among competing networks through a top-down signal. This indicates that signals related to visual attention are most likely to occur just after a saccade. Second, the increased efficacy of recurrent connections during phase II indicates that contextualization operations such as figure-ground segregation [3] and contour completion occur in the steady phase ~100 ms after the onset of a (micro)saccade.

1. Lowet E, Roberts MJ, Bosman CA, Fries P, de Weerd P. Areas V1 and V2 show microsaccade-related 3–4 Hz covariation in gamma power and frequency. Eur J Neurosci. 2015.
2. Martinez-Conde S, Otero-Millan J, Macknik SL. The impact of microsaccades on vision: towards a unified theory of saccadic function. Nat Rev Neurosci. 2013;14:83–96.
3. Self MW, van Kerkoerle T, Supèr H, Roelfsema PR. Distinct roles of the cortical layers of area V1 in figure-ground segregation. Curr Biol. 2013:1–9.


P41 Scale-free functional networks of 2D Ising model are highly robust against structural defects: neuroscience implications

Abdorreza Goodarzinick1, Mohammad D. Niry1,2, Alireza Valizadeh1,3

1Department of Physics, Institute for Advanced Studies in Basic Sciences (IASBS), Zanjan 45137-66731, Iran; 2Center for Research in Climate Change and Global Warming (CRCC), Institute for Advanced Studies in Basic Sciences (IASBS), Zanjan 45137-66731, Iran; 3School of Cognitive Sciences, Institute for Research in Fundamental Sciences (IPM), Tehran - Iran

Correspondence: Abdorreza Goodarzinick - a.goodarzinick@iasbs.ac.ir

BMC Neuroscience 2016, 17(Suppl 1):P41

In recent years, several experimental observations have confirmed the emergence of self-organized criticality (SOC) in the brain at different scales [1]. At large scales, functional brain networks obtained from fMRI data have shown that node-degree distributions and the probability of finding a link as a function of distance are indicative of scale-free and small-world networks, regardless of the tasks in which the subjects were engaged [2]. At small scales, the study of neuronal avalanches in networks of living neurons revealed power-law behavior on both spatial and temporal scales [3]. It has also been shown that functional networks of the brain are strikingly similar to those derived from the 2D Ising model at the critical temperature [4] and from the 2D Abelian sandpile model [5].

The questions of whether the scaling properties of brain networks associated with healthy conditions are altered under various pathologies, and of how structural defects in a system at criticality affect its functional connectivity, motivated us to study the robustness of the functional networks of the 2D Ising model at the critical point against the elimination of structural sites. The results showed that the statistics of the functional network indicative of criticality (evident in healthy brain controls), such as power-law behavior and small-worldness, remained robust against random elimination of structural sites up to the percolation limit (see Fig. 29). The resulting functional network maintained its key properties at defect densities orders of magnitude higher than those tolerated by the same system poised in a super-critical or sub-critical state. These results show that self-organized critical behavior, besides having unique advantages such as facilitating the alteration of functional patterns, optimizing information transfer and maximizing correlation length, is strikingly robust against structural deficits. Taking into account the brain’s long-range anatomical connections and compensatory mechanisms such as neuroplasticity, if the results of this study generalize to the brain, they may help to explain the delay in clinical diagnosis of several neurodegenerative diseases in which deficits in functional connectivity among brain regions contribute to the cognitive dysfunctions.
Fig. 29

Relevant parameters of the functional network of the 2D Ising model at the critical point versus the fraction of defective structural cells. A Power-law exponent of the degree distribution, B small-worldness measure, C average degree
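A toy version of this pipeline, assuming Metropolis dynamics on a small lattice and an arbitrary correlation threshold of 0.5 (none of which are claimed to match the study's settings), can be sketched as:

```python
import numpy as np

def ising_functional_network(L=12, T=2.269, defect_frac=0.0,
                             n_sweeps=150, n_samples=100, seed=0):
    """Metropolis dynamics of a (possibly site-diluted) 2D Ising model near
    T_c, followed by thresholding of pairwise spin-spin correlations to obtain
    a functional network. Returns the degree of each surviving site."""
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=(L, L))
    alive = rng.random((L, L)) >= defect_frac    # False = structural defect
    spins[~alive] = 0                            # defects carry no spin
    def sweep():
        for _ in range(L * L):
            i, j = rng.integers(L), rng.integers(L)
            if not alive[i, j]:
                continue
            nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2.0 * spins[i, j] * nb
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                spins[i, j] = -spins[i, j]
    for _ in range(n_sweeps):                    # equilibration
        sweep()
    samples = np.empty((n_samples, L * L))
    for k in range(n_samples):
        sweep()
        samples[k] = spins.ravel()
    C = np.corrcoef(samples[:, alive.ravel()].T)
    C = np.nan_to_num(C)                         # guard against frozen sites
    adj = (np.abs(C) > 0.5) & ~np.eye(C.shape[0], dtype=bool)
    return adj.sum(axis=1)                       # functional degree per live site

degrees = ising_functional_network()
```

Calling the function with defect_frac > 0 and comparing the resulting degree statistics is the toy analogue of the structural-dilution experiment described above.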

1. Chialvo DR. Emergent complex neural dynamics. Nat Phys. 2010;6:744–50.
2. Eguíluz VM, Chialvo DR, Cecchi GA, Baliki M, Apkarian AV. Scale-free brain functional networks. Phys Rev Lett. 2005;94:018102.
3. Beggs J, Plenz D. Neuronal avalanches in neocortical circuits. J Neurosci. 2003;23:11167–77.
4. Fraiman D, Balenzuela P, Foss J, Chialvo D. Ising-like dynamics in large-scale functional brain networks. Phys Rev E. 2009;79:061922.
5. Zarepour M, Niry MD, Valizadeh A. Functional scale-free networks in the two-dimensional Abelian sandpile model. Phys Rev E. 2015;92:012822.


P42 High-frequency neurons can facilitate signal propagation in neural networks

Aref Pariz1, Shervin S Parsi1, Alireza Valizadeh1,2

1Department of Physics, Institute for advanced studies in basic sciences, Zanjan, Iran; 2School of Cognitive Sciences, Institute for Studies in Theoretical Physics and Mathematics, Niavaran, Tehran, Iran

Correspondence: Aref Pariz - a.pariz@iasbs.ac.ir

BMC Neuroscience 2016, 17(Suppl 1):P42

Signal transmission is of interest from both fundamental and clinical perspectives and has been well studied in nonlinear science and complex networks [1, 2]. In nervous systems in particular, cognitive processing involves signal propagation through multiple brain regions and the activation of large numbers of specific neurons [3–6]. During information propagation through brain regions, each part, known as a generator, is activated locally as information arrives from neighboring generators. Although the problem is well studied in the context of complex networks, our focus here is on the effect of the intrinsic dynamical properties of the reciprocal generators on signal propagation.

In this study we explored the propagation of information along a chain of neurons and a chain of networks. As a signal propagates through the chain of networks, the firing rate of each network fluctuates in response to the host network (the network that receives the signal). Here the response is quantified as the amplitude of the fast Fourier transform of each network's firing rate. If the host network has a sufficiently higher intrinsic firing rate than the others, the signal is transferred with higher amplitude; otherwise, the other networks are not affected. In the former case, all networks show a peak in the frequency domain at exactly the input signal frequency (Fig. 30A), but with different amplitudes reflecting the efficacy of the transmitted information. The same result can be obtained with a chain of single LIF neurons (Fig. 30B). As the phase response curve of the chain and its response to the signal show, if the host neuron has a higher firing rate (the leader neuron), propagation of information is enhanced. This higher firing rate has a limit, however, beyond which the whole chain fires asynchronously and the information that was meant to propagate is lost.
Fig. 30

Inhomogeneity of the input current to the host network increases the network response. A, B Response of the chain of networks and of the chain of single neurons, respectively, for different degrees of inhomogeneity at the host network or host neuron
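The single-neuron version of this setup can be sketched as below; the chain length, coupling strength and LIF parameters are illustrative stand-ins, not the values used in the study.

```python
import numpy as np

def chain_response(bias, f_sig=10.0, n_cells=4, t_max=2000.0, dt=0.1):
    """Feedforward chain of LIF neurons; a weak sinusoid at f_sig (Hz) drives
    the first ('host') cell, whose intrinsic rate is set by `bias`. Returns the
    Fourier amplitude of the last cell's spike train at f_sig."""
    tau, v_th, w_ff = 20.0, 1.0, 0.05     # membrane tau (ms), threshold, coupling
    n = int(round(t_max / dt))
    v = np.zeros(n_cells)
    spikes = np.zeros((n, n_cells))
    for i in range(n):
        t = i * dt
        drive = np.full(n_cells, 1.05)    # common suprathreshold drive
        drive[0] = bias + 0.1 * np.sin(2 * np.pi * f_sig * t / 1000.0)
        v += dt * (-v + drive) / tau
        if i > 0:                          # feedforward kick from upstream spikes
            v[1:] += w_ff * spikes[i - 1, :-1]
        fired = v >= v_th
        spikes[i, fired] = 1.0
        v[fired] = 0.0                     # reset after a spike
    out = spikes[:, -1] - spikes[:, -1].mean()
    spec = np.abs(np.fft.rfft(out))
    freqs = np.fft.rfftfreq(n, d=dt / 1000.0)
    return spec[np.argmin(np.abs(freqs - f_sig))]

amp_fast_host = chain_response(bias=1.3)   # host firing faster than the chain
amp_slow_host = chain_response(bias=1.02)  # host near threshold
```

Comparing the readout amplitude for a fast versus a slow host cell is the toy analogue of the leader-neuron comparison described above.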

1. Liang X, Liu Z, Li B. Weak signal transmission in complex networks and its application in detecting connectivity. Phys Rev E. 2010;80:046102.
2. Perc M. Stochastic resonance on weakly paced scale-free networks. Phys Rev E. 2008;78:036105.
3. Abeles M. Corticonics: neural circuits of the cerebral cortex. Cambridge: Cambridge UP; 1991.
4. Aertsen A, Diesmann M, Gewaltig MO. Propagation of synchronous spiking activity in feedforward neural networks. J Physiol. 1996;90:243–247.
5. van Rossum MC, Turrigiano GG, Nelson SB. Fast propagation of firing rates through layered networks of noisy neurons. J Neurosci. 2002;22:1956–66.
6. Vogels TP, Abbott LF. Signal propagation and logic gating in networks of integrate-and-fire neurons. J Neurosci. 2005;25(46):10786–95.


P43 Investigating the effect of Alzheimer’s disease related amyloidopathy on gamma oscillations in the CA1 region of the hippocampus

Julia M. Warburton1, Lucia Marucci2, Francesco Tamagnini3,4, Jon Brown3,4, Krasimira Tsaneva-Atanasova5

1Bristol Centre for Complexity Sciences, University of Bristol, Bristol, BS8 1TR, UK; 2Department of Engineering Mathematics, University of Bristol, Bristol, BS8 1UB, UK; 3School of Physiology and Pharmacology, University of Bristol, Bristol, BS8 1TD, UK; 4Medical School, University of Exeter, Exeter, EX4 4PE, UK; 5Department of Mathematics, University of Exeter, Exeter, EX4 4QF, UK

Correspondence: Julia M. Warburton - julia.warburton@bristol.ac.uk

BMC Neuroscience 2016, 17(Suppl 1):P43

Alzheimer’s disease (AD) is the main form of dementia and is characterised clinically by cognitive decline and impairments to memory function. One of the key histopathological features of AD thought to cause this neurodegeneration is the abnormal aggregation of the protein amyloid-β (Aβ) [1]. Transgenic mouse models that overexpress Aβ are used to investigate the potential functional consequences of this amyloidopathy in AD. In this study we use in vitro electrophysiology data recorded from PDAPP transgenic mice (a mouse model of amyloidopathy) and their wild-type littermates to parameterise a hippocampal network model [2]. The aim of the study is to investigate how amyloidopathy alters gamma frequency oscillations within the hippocampus, which is one of the regions first affected in AD.

We use a synaptically connected network of excitatory pyramidal neurons and inhibitory interneurons to simulate the gamma frequency activity [3]. Each cell is described by a single-compartment Hodgkin–Huxley-type equation, with the properties of the voltage-gated channels fit to the intrinsic properties measured experimentally, which included stimulated firing-frequency data and the associated action potentials from CA1 pyramidal neurons and three types of CA1 interneuron. Network activity is driven either deterministically, via a direct stimulus such as a step pulse or a theta wave, or via a stochastic input. We perform power spectral density analysis to analyse the oscillatory activity.
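As a minimal illustration of the power-spectral-density step, the sketch below (a plain NumPy periodogram; the band edges and the synthetic 40 Hz test signal are our own choices, not the study's data) locates the dominant gamma-band frequency of a signal:

```python
import numpy as np

def gamma_peak_frequency(sig, fs):
    """Dominant frequency in the 30-100 Hz gamma band of a signal, from a
    plain FFT periodogram (a stand-in for the PSD analysis in the abstract)."""
    sig = np.asarray(sig) - np.mean(sig)
    psd = np.abs(np.fft.rfft(sig)) ** 2 / len(sig)
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
    band = (freqs >= 30.0) & (freqs <= 100.0)
    return freqs[band][np.argmax(psd[band])]

# synthetic check: a noisy 40 Hz oscillation, 2 s sampled at 1 kHz
fs = 1000.0
t = np.arange(0.0, 2.0, 1.0 / fs)
lfp = (np.sin(2 * np.pi * 40.0 * t)
       + 0.5 * np.random.default_rng(1).normal(size=t.size))
```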

Our model focuses on gamma frequency oscillations, which lie in the 30–100 Hz range, because of their associations with attention, sensory processing and, of most potential relevance to AD, learning and memory. It has been shown that within the hippocampus gamma oscillations enable cross-talk between distributed cell assemblies, with low-frequency gamma associated with coupling between the CA1 and CA3 regions and fast-frequency gamma associated with coupling between CA1 and the medial entorhinal cortex [4]. EEG measurements from AD mouse models have identified network hypersynchrony alongside decreased gamma activity, highlighting the role of interneurons in this process [5]. By incorporating the pyramidal neuron and interneuron data in our model we aim to learn more about which parameters are most significant in these effects and to further our understanding of the effects of amyloidopathy on oscillatory activity.

Acknowledgements: This work was supported by funding from the EPSRC.

1. Hardy J, Selkoe DJ. The amyloid hypothesis of Alzheimer’s disease: progress and problems on the road to therapeutics. Science. 2002;297:353–6.
2. Kerrigan TL, Brown JT, Randall TL. Characterization of altered intrinsic excitability in hippocampal CA1 pyramidal cells of the Aβ-overproducing PDAPP mouse. Neuropharmacology. 2014;79:515–24.
3. Kopell NJ, Borgers C, Pervouchine D, Malerba P, Tort A. Gamma and theta rhythms in biophysical models of hippocampal circuits. In: Hippocampal microcircuits: a computational modeler’s resource book, chap 15; p. 423–57.
4. Colgin LL, Denninger T, Fyhn M, Hafting T, Bonnevie T, Jensen O, Moser M-B, Moser EI. Frequency of gamma oscillations routes flow of information in the hippocampus. Nature. 2009;462:353–7.
5. Verret L, et al. Inhibitory interneuron deficit links altered network activity and cognitive dysfunction in Alzheimer model. Cell. 2012;149:708–21.


P44 Long-tailed distributions of inhibitory and excitatory weights in a balanced network with eSTDP and iSTDP

Florence I. Kleberg1, Jochen Triesch1

1Frankfurt Institute for Advanced Studies, Frankfurt am Main, Hessen, Germany, 60438

Correspondence: Florence I. Kleberg - kleberg@fias.uni-frankfurt.de

BMC Neuroscience 2016, 17(Suppl 1):P44

The strengths of excitatory synapses in cortex and hippocampus have been shown to follow a rightward-skewed, or long-tailed, distribution [1, 2]. Such distributions can be achieved in recurrent balanced networks [3, 4] after synaptic modification by spike-timing dependent plasticity (STDP) [5] and synaptic scaling [6]. Recently, long-tailed distributions have also been observed for inhibitory synapses in cultured cortical neurons [7], confirming early findings in hippocampal slices [8]. However, the conditions and plasticity mechanisms necessary for achieving long-tailed distributions of inhibitory synapses are unknown. Furthermore, different forms of inhibitory STDP have been reported, but their effects on the distribution of inhibitory synaptic efficacies remain largely unknown [9–11].

Here we investigate how plasticity of the inhibitory synapses in a self-organised recurrent neural network (SORN [12]) with leaky integrate-and-fire neurons can lead to long-tailed distributions of synaptic weights. We examine different inhibitory STDP (iSTDP) rules and characterize the conditions under which right-skewed inhibitory synaptic weight distributions are obtained while a balance between excitation and inhibition is maintained. While the ratio of long-term potentiation to long-term depression in iSTDP affects the shape of the distribution, a variety of iSTDP window shapes can each achieve long-tailed distributions of inhibitory weights. We find that a precise balance of excitation and inhibition can be achieved with a strongly right-skewed distribution of inhibitory weights. Our results suggest that long-tailed distributions of inhibitory weights could be a ubiquitous feature of neural circuits that employ different plasticity mechanisms.
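The emergence of right-skewed weight distributions from repeated potentiation and depression events can be illustrated with a deliberately simplified multiplicative toy update (not the SORN model or any specific iSTDP window):

```python
import numpy as np

def toy_weight_distribution(n_syn=2000, n_events=4000, ltp_ltd_ratio=1.05,
                            seed=0):
    """Repeated multiplicative potentiation/depression events drive weights
    toward a right-skewed, roughly log-normal shape. Illustrative only."""
    rng = np.random.default_rng(seed)
    w = np.full(n_syn, 0.5)
    for _ in range(n_events):
        ltp = rng.random(n_syn) < 0.5             # potentiate or depress
        w *= np.where(ltp, 1.0 + 0.02 * ltp_ltd_ratio, 1.0 - 0.02)
        w = np.clip(w, 1e-4, 10.0)                # hard bounds on efficacy
    return w

w = toy_weight_distribution()
skew = float(((w - w.mean()) ** 3).mean() / w.std() ** 3)  # positive => right-skewed
```

Because the updates are multiplicative, the log-weights perform a random walk, which is why the resulting distribution is approximately log-normal; the ltp_ltd_ratio knob is the toy counterpart of the LTP/LTD ratio discussed above.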

1. Bekkers JM, Stevens CF. NMDA and non-NMDA receptors are co-localized at individual excitatory synapses in cultured rat hippocampus. Nature. 1989;341:230–3.
2. Loewenstein Y, Kuras A, Rumpel S. Multiplicative dynamics underlie the emergence of the log-normal distribution of spine sizes in the neocortex in vivo. J Neurosci. 2011;31(26):9481–8.
3. Effenberger F, Jost J, Levina A. Self-organization in balanced state networks by STDP and homeostatic plasticity. PLoS Comput Biol. 2015;11(9):e1004420.
4. Miner D, Triesch J. Plasticity-driven self-organization under topological constraints accounts for non-random features of cortical synaptic wiring. PLoS Comput Biol. 2016;12(2):e1004759.
5. Bi G, Poo M. Synaptic modifications in cultured hippocampal neurons: dependence on spike timing, synaptic strength, and postsynaptic cell type. J Neurosci. 1998;18(24):10464–72.
6. Turrigiano GG, Leslie KR, Desai NS, Rutherford LC, Nelson SB. Activity-dependent scaling of quantal amplitude in neocortical neurons. Nature. 1998;391(6670):892–6.
7. Rubinski A, Ziv NE. Remodeling and tenacity of inhibitory synapses: relationships with network activity and neighboring excitatory synapses. PLoS Comput Biol. 2015;11(11):e1004632.
8. Miles R. Variation in strength of inhibitory synapses in the CA3 region of guinea-pig hippocampus in vitro. J Physiol. 1990;431:659–76.
9. Woodin MA, Ganguly K, Poo M. Coincident pre- and postsynaptic activity modifies GABAergic synapses by postsynaptic changes in Cl− transporter activity. Neuron. 2003;39(5):807–20.
10. Haas JS, Nowotny T, Abarbanel HDI. Spike-timing-dependent plasticity of inhibitory synapses in the entorhinal cortex. J Neurophysiol. 2006;96(6):3305–13.
11. D’Amour JA, Froemke RC. Inhibitory and excitatory spike-timing-dependent plasticity in the auditory cortex. Neuron. 2015;86(2):514–28.
12. Lazar A, Pipa G, Triesch J. SORN: a self-organizing recurrent neural network. Front Comp Neurosci. 2009;3:23.


P45 Simulation of EMG recording from hand muscle due to TMS of motor cortex

Bahar Moezzi1, Nicolangelo Iannella1,4, Natalie Schaworonkow2, Lukas Plogmacher2, Mitchell R. Goldsworthy3, Brenton Hordacre3, Mark D. McDonnell1, Michael C. Ridding3, Jochen Triesch2

1Computational and Theoretical Neuroscience Laboratory, School of Information Technology and Mathematical Sciences, University of South Australia, Australia; 2Frankfurt Institute for Advanced Studies, Goethe-Universität, Germany; 3Robinson Research Institute, School of Medicine, University of Adelaide, Australia; 4School of Mathematical Sciences, University of Nottingham, UK

Correspondence: Bahar Moezzi - bahar.moezzi@unisa.edu.au

BMC Neuroscience 2016, 17(Suppl 1):P45

Single-pulse transcranial magnetic stimulation (TMS) is a technique which (at moderate intensities) activates corticomotor neuronal output cells trans-synaptically and evokes a complex descending volley in the corticospinal tract. Rusu et al. developed a computational model of TMS-induced I-waves that reproduced epidural recordings observed in conscious humans [1]. In humans, epidural responses can be recorded in anaesthetized subjects during surgery or in conscious subjects with electrodes implanted for the treatment of chronic pain. Such opportunities are uncommon and the recordings are invasive. The effects of TMS can instead be studied non-invasively using surface electromyography (EMG) recordings from the first dorsal interosseous (FDI) hand muscle.

We simulated the surface EMG signal due to TMS of motor cortex in the hand FDI muscle. Our model comprises a population of cortical layer 2/3 cells, which drive layer 5 cortico-motoneuronal cells with excitatory and inhibitory synaptic inputs as in [1]. The layer 5 cells in turn project to a pool of motoneurons, which are modeled as an inhomogeneous population of integrate-and-fire neurons to simulate motor unit recruitment and rate coding. The input to motoneurons from cortical layer 5 consists of TMS-induced spikes and baseline firing. We modeled baseline firing with a Poisson drive to layer 2/3 cells. Hermite-Rodriguez functions were used to simulate motor unit action potential shape. The EMG signal was obtained from the summation of motor unit action potentials of active motor units. Parameters were tuned to simulate recordings from the FDI muscle.
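The MUAP-summation step can be sketched with a first-order Hermite-Rodriguez template, as below; the amplitudes, time scales and firing times are illustrative, not parameters fitted to FDI recordings.

```python
import numpy as np

def hermite_rodriguez_1(t, scale=2.0):
    """First-order Hermite-Rodriguez function, a standard biphasic template
    for motor unit action potential (MUAP) shapes (time in ms)."""
    x = t / scale
    return x * np.exp(-x ** 2)

def simulated_emg(firing_times, amplitudes, t_max=100.0, dt=0.1):
    """Surface EMG as the sum of MUAP templates placed at each motor unit's
    firing times."""
    t = np.arange(0.0, t_max, dt)
    emg = np.zeros_like(t)
    for times, amp in zip(firing_times, amplitudes):
        for ts in times:
            emg += amp * hermite_rodriguez_1(t - ts)
    return t, emg

# two motor units with different sizes and firing times
t, emg = simulated_emg([[20.0, 55.0], [30.0, 70.0]], [1.0, 0.6])
```

In the full model, the firing times would come from the motoneuron pool driven by layer 5 output, and the amplitudes would reflect motor unit recruitment order.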

Our simulated EMG signals match experimental surface EMG recordings obtained following TMS of the motor cortex in the hand FDI muscle in shape, size and time scale, both at rest and during voluntary contraction (see Fig. 31). The simulated EMG traces exhibit cortical silent periods (CSPs) that lie within the biological range.
Fig. 31

Comparison of simulated and experimental EMG during A rest, B 10 % maximum voluntary contraction

1. Rusu CV, Murakami M, Ziemann U, Triesch J. A model of TMS-induced I-waves in motor cortex. Brain Stimul. 2014;7(3):401–14.


P46 Structure and dynamics of axon network formed in primary cell culture

Martin Zapotocky1,2, Daniel Smit1,2,3, Coralie Fouquet3, Alain Trembleau3

1Institute of Physiology of the Czech Academy of Sciences, Prague, Czech Republic; 2Institute of Biophysics and Informatics, First Faculty of Medicine, Charles University in Prague, Czech Republic; 3IBPS, Neuroscience Paris Seine, CNRS UMR8246, Inserm U1130, UPMC UM 119, Université Pierre et Marie Curie, Paris, France

Correspondence: Martin Zapotocky - zapotocky@biomed.cas.cz

BMC Neuroscience 2016, 17(Suppl 1):P46

Axons growing in vivo or in culture may adhere to each other and form a connected network, which subsequently guides the paths of newly arriving axons. We investigated the development of such a network formed by growing axons in primary cell culture.

Olfactory epithelium explants from mouse embryos (embryonic day 13–14) were cultured on a laminin substrate for 2 days and then recorded using DIC or phase-contrast videomicroscopy for up to 24 h. The growing axons established a dense network within which large fascicles of axons progressively formed. Within the recorded time period the network remained stable, with limited further growth of the axons but ongoing rearrangement of the network structure. Based on segmentation of the recorded images, we determined the principal network characteristics (including the total length, the total number of vertices, and the network anisotropy) and their evolution in time.
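Characteristics like these can be computed directly from a segmented graph. A minimal sketch follows; the node/edge format and the length-weighted nematic anisotropy index are our own illustrative choices, not necessarily the measures used in the study.

```python
import numpy as np

def network_stats(nodes, edges):
    """Total cable length, number of connected vertices, and an anisotropy
    index (length-weighted nematic order parameter of edge orientations).
    `nodes` is an (N, 2) array of xy positions; `edges` a list of index pairs."""
    total_len, c2, s2 = 0.0, 0.0, 0.0
    deg = np.zeros(len(nodes), dtype=int)
    for a, b in edges:
        d = nodes[b] - nodes[a]
        seg = float(np.hypot(d[0], d[1]))
        theta = np.arctan2(d[1], d[0])
        total_len += seg
        c2 += seg * np.cos(2.0 * theta)   # direction counted modulo pi
        s2 += seg * np.sin(2.0 * theta)
        deg[a] += 1
        deg[b] += 1
    anisotropy = float(np.hypot(c2, s2)) / total_len
    return total_len, int((deg > 0).sum()), anisotropy

# toy network: two horizontal segments and one vertical segment
nodes = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [2.0, 1.0]])
total_len, n_vertices, anisotropy = network_stats(nodes, [(0, 1), (1, 2), (2, 3)])
```

The anisotropy index is 1 for perfectly aligned segments and 0 for an orientation-balanced network, which makes it convenient for tracking coarsening by fasciculation over time.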

This quantitative characterization permitted an analysis of the mechanisms of the observed network coarsening. We relate the network dynamics to the elementary processes of zippering, during which two axons or axon fascicles progressively adhere to each other [1]. We compare the structural features of the network (such as the distribution of vertex angles) with those reported in an electron microscopy investigation of a plexus of sensory neurites in Xenopus embryo [2]. We show that both our ex vivo study and the in vivo study of Ref. [2] support a similar underlying mechanism of the formation of the axon network.

Acknowledgements: Work supported by GAČR 14-16755S, GAUK 396213, MŠMT 7AMB12FR002, NIH 1RO1DCO12441 and ANR 2010-BLAN-1401-01.

1. Smit D, Fouquet C, Pincet F, Trembleau A, Zapotocky M. Axon zippering in neuronal cell culture and its biophysical modeling. BMC Neurosci. 2015;16(Suppl. 1):P298.
2. Roberts A, Taylor JSH. A scanning electron microscope study of the development of a peripheral sensory neurite network. J Embryol Exp Morph. 1982;69:237–50.


P47 Efficient signal processing and sampling in random networks that generate variability

Sakyasingha Dasgupta1,2, Isao Nishikawa3, Kazuyuki Aihara3, Taro Toyoizumi2

1IBM Research - Tokyo, Tokyo, Japan; 2RIKEN Brain Science Institute, Tokyo, Japan; 3The University of Tokyo, Tokyo, Japan

Correspondence: Sakyasingha Dasgupta - sdasgup@jp.ibm.com

BMC Neuroscience 2016, 17(Suppl 1):P47

The source of cortical variability and its influence on signal processing remain open questions. We address the latter by studying two types of randomly connected networks of quadratic integrate-and-fire (QIF) neurons with balanced excitation and inhibition that produce irregular spontaneous activity patterns (Fig. 32A): (a) a deterministic network with strong synaptic interactions that actively generates variability through chaotic dynamics (internal noise) and (b) a stochastic network with weak synaptic interactions that receives noisy input (external noise), e.g. from stochastic vesicle release. These networks of spiking neurons are analytically tractable in the limit of large network size and slow synaptic time constants. Despite the difference in their sources of variability, the spontaneous (baseline) activity patterns of the two models are indistinguishable unless a majority of neurons are recorded simultaneously. We characterize the network behavior with dynamic mean-field analysis and reveal a single-parameter family that interpolates between the two networks while sharing nearly identical spontaneous activity (Fig. 32B). Despite this close similarity, the two networks exhibit remarkably different sensitivity to external stimuli. Input to the former network reverberates internally and can be read out successfully over long times. In contrast, input to the latter network decays rapidly and can be read out only for a short time. This is also observed in the significant changes in the spiking probability of evoked responses across this family (Fig. 32C). The difference between the two networks is further enhanced if input synapses undergo activity-dependent plasticity, producing a significant difference in the ability to decode external input from neural activity. We show that this difference naturally leads to distinct performance of the two networks in integrating spatio-temporally distinct signals from multiple sources. Unlike its stochastic counterpart, the activity of the deterministic chaotic network can serve as a reservoir to perform near-optimal Bayesian integration and Monte Carlo sampling from the posterior distribution. We describe the implications of these differences between deterministic and stochastic neural computation for population coding and neural plasticity.
Fig. 32

A Schematic illustrations of the two balanced QIF network models considered in the present study. The left network consists of strongly coupled neurons without noise, while the right network consists of weakly coupled neurons with noisy input. B Nearly identical rate autocorrelation functions in the two networks. The red line (C_0) marks the value of the autocorrelation at time 0 and the cyan line (C_∞) its value in the limit of large t. C Change in spiking probability for different network connectivity strengths \(\tilde{g}\), after stimulation by a brief input at time t = 0
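The rate autocorrelation statistic compared in Fig. 32B can be computed as follows; the smoothed-noise trace is a generic stand-in for the networks' firing rates, not output of either model.

```python
import numpy as np

def rate_autocorrelation(r, max_lag):
    """Autocovariance C(tau) of a firing-rate trace; C(0) and the large-lag
    value correspond to the C_0 and C_inf lines in the figure."""
    r = r - r.mean()
    n = len(r)
    return np.array([np.dot(r[:n - k], r[k:]) / (n - k) for k in range(max_lag)])

# stand-in rate trace: white noise smoothed with a 50-sample boxcar
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
r = np.convolve(x, np.ones(50) / 50, mode="same")
C = rate_autocorrelation(r, 200)
```

For this trace the autocovariance decays to near zero beyond the smoothing window, mirroring the C_0 versus C_∞ comparison in the caption.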

P48 Modeling the effect of riluzole on bursting in respiratory neural networks

Daniel T. Robb1, Nick Mellen2, and Natalia Toporikova3

1Department of Mathematics, Computer Science and Physics, Roanoke College, Salem, VA 24153, USA; 2Department of Pediatrics, University of Louisville, Louisville, KY 40208, USA; 3Department of Biology, Washington and Lee University, Lexington, VA 24450, USA

Correspondence: Daniel T. Robb - robb@roanoke.edu

BMC Neuroscience 2016, 17(Suppl 1):P48

To accommodate constantly changing environmental and metabolic demands, breathing should be able to vary flexibly within a range of frequencies. The respiratory neural network in the pre-Botzinger complex of the ventrolateral medulla controls and flexibly maintains the breathing rhythm, coordinating network-wide bursting to signal the inspiratory phase of the breath. The frequency of this rhythmic activity is controlled by a number of neuromodulators, the majority of which are excitatory. Therefore, the central pattern generator for rhythmic respiratory activity should possess two seemingly contradictory properties: it has to be able to change frequency in response to excitatory input, but it also has to preserve stable rhythmic activity under a wide range of conditions.

A persistent sodium current (I_NaP) has been identified as one of the key currents for the generation of inspiratory activity [1]. It has been shown that some of the neurons in the pre-BötC possess an intrinsic bursting mechanism that relies on inactivation of this current. Higher expression of I_NaP correlates with a higher burst frequency of a single pacemaker neuron [2]. However, the I_NaP pacemaker mechanism can only function within a very narrow range of external excitation: an I_NaP-dependent pacemaker tends to switch to tonic firing after a small increase in depolarizing current [3].

In this combined experimental and computational study, we tested the effect of the persistent sodium blocker riluzole (RIL) at several different levels of continuous depolarization, induced by application of K+. Whereas increased potassium raises the bursting frequency of the control network, in the presence of RIL increased potassium does not alter the bursting frequency (Fig. 33). These findings indicate that I_NaP is responsible for flexible modulation of the respiratory rhythm, but that another mechanism can sustain rhythmic activity in its absence. We developed a computational model that incorporates a calcium-sensitive non-specific cation current (I_CAN) in addition to I_NaP. Our simulations indicate that I_CAN and I_NaP can maintain the rhythm in respiratory neurons in the presence of RIL, and are capable of providing stable oscillations in the presence of tonic excitation by K+.
Fig. 33

Summary of experiment on the effect of riluzole on the dependence of burst frequency on potassium concentration. Without riluzole (left), the frequency increases steadily with increasing potassium concentration. With riluzole present (right), the frequency remains essentially constant with increasing potassium concentration
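A minimal two-variable sketch of the I_NaP mechanism (persistent sodium with slow inactivation plus a leak current, in the spirit of the Butera model [1]) is given below; the parameter values are illustrative, and varying the leak reversal potential only loosely mimics K+-induced depolarization.

```python
import numpy as np

def nap_slow_wave(e_leak=-60.0, t_max=10000.0, dt=0.1):
    """Two-variable sketch of the I_NaP mechanism: persistent sodium with slow
    inactivation h, plus a leak current. Not a calibrated pre-BotC model."""
    g_nap, g_leak, e_na, cap = 2.8, 2.8, 50.0, 21.0       # nS, nS, mV, pF
    v, h = -60.0, 0.6
    vs = np.empty(int(round(t_max / dt)))
    for i in range(vs.size):
        m_inf = 1.0 / (1.0 + np.exp(-(v + 40.0) / 6.0))    # fast activation
        h_inf = 1.0 / (1.0 + np.exp((v + 48.0) / 6.0))     # slow inactivation
        tau_h = 10000.0 / np.cosh((v + 48.0) / 12.0)       # ms
        i_nap = g_nap * m_inf * h * (v - e_na)
        i_leak = g_leak * (v - e_leak)
        v += dt * (-i_nap - i_leak) / cap
        h += dt * (h_inf - h) / tau_h
        vs[i] = v
    return vs

vs = nap_slow_wave(t_max=3000.0)
```

Blocking I_NaP in this sketch (setting g_nap to zero) removes the only regenerative current, which is why the full model discussed above needs I_CAN to keep bursting under RIL.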

  1.

    Butera RJ Jr, Rinzel J, Smith JC. Models of respiratory rhythm generation in the pre-Bötzinger complex. I. Bursting pacemaker neurons. J Neurophysiol. 1999;82:382–97.

  2.

    Purvis LK, Smith JC, Koizumi H, Butera RJ. Intrinsic bursters increase the robustness of rhythm generation in an excitatory network. J Neurophysiol. 2007;97:1515–26.

  3.

    Del Negro CA, Morgado-Valle C, Hayes JA, Mackay DD, Pace RW, Crowder EA, Feldman JL. Sodium and calcium current-mediated pacemaker neurons and respiratory rhythm generation. J Neurosci Off J Soc Neurosci. 2005;25:446–53.


P49 Mapping relaxation training using effective connectivity analysis

Rongxiang Tang1, Yi-Yuan Tang2

1Department of Psychology, Washington University in St. Louis, St. Louis, MO 63130, USA; 2Department of Psychological Sciences, Texas Tech University, TX 79409, USA

Correspondence: Yi-Yuan Tang - yiyuan.tang@ttu.edu

BMC Neuroscience 2016, 17(Suppl 1):P49

Relaxation training (RT) is a behavioral therapy that has been applied to stress management, muscle relaxation, and other health benefits. However, compared to short-term meditation training, previous studies did not show significant differences in brain changes following the same amount of RT [1, 2]. One possible reason is that routine correlation-based functional connectivity methods are too insensitive to reveal training-related changes in effective connectivity (directed information flow) among the distributed brain regions involved. Here, we applied spectral dynamic causal modeling (spDCM) to resting-state fMRI to characterize changes in effective connectivity following RT.

Twenty-three healthy college students were recruited through campus advertisements and received 4 weeks of RT (10 h in total), as previously reported in our randomized studies [1, 2]. All neuroimaging data were collected using an Allegra 3-Tesla Siemens scanner and processed using the Data Processing Assistant for Resting-State fMRI, which is based on SPM and the Resting-State fMRI Data Analysis Toolkit [3]. For each participant, the standard preprocessing steps included slice timing correction, motion correction, regression of WM/CSF signals, and spatial normalization [3]. Based on previous literature, we specified four regions of interest within the default mode network (DMN): medial prefrontal cortex (mPFC), posterior cingulate cortex (PCC), and bilateral inferior parietal lobule (left IPL and right IPL), using the same coordinates as previous spDCM studies [4]. A standard DCM analysis involves specifying a set of plausible models, estimating their parameters, and comparing the models using Bayesian model selection. In both pre- and post-RT conditions, this procedure selected the fully connected model as the best model, with a posterior probability of almost 1. The fully connected model had 24 parameters describing the extrinsic connections between nodes, the intrinsic (self-) connections within nodes, and the neuronal fluctuations within each node. We used Bayesian parameter averaging to quantify the differences between pre- and post-RT, and a classical multivariate test, canonical variate analysis, to test for the significance of these differences [4]. Our results showed no significant differences in causal relationships among the above nodes following RT (all P > 0.05).
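The Bayesian model selection step can be illustrated with a toy calculation (this is not the SPM spDCM implementation): given approximate log-evidences (free energies) for competing models and flat priors over models, posterior model probabilities follow from a softmax. The free-energy values below are invented for illustration:

```python
import math

def model_posteriors(log_evidences):
    """Posterior model probabilities from log-evidences, flat model priors."""
    m = max(log_evidences)
    w = [math.exp(F - m) for F in log_evidences]  # subtract max for stability
    z = sum(w)
    return [x / z for x in w]

# Hypothetical free energies for four candidate connectivity models;
# the last model (e.g. the fully connected one) wins decisively because
# its log-evidence exceeds the others by far more than ln(20) ~ 3.
F = [-1205.0, -1198.0, -1210.0, -1180.0]
p = model_posteriors(F)
```

A log-evidence advantage of 18 or more over the runner-up, as here, yields a posterior probability of essentially 1 for the winning model, which is the situation reported above for the fully connected model.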

Conclusions Four weeks of RT did not induce significant changes in effective connectivity among DMN nodes. The effects of long-term RT on the brain warrant further investigation.

Acknowledgements: This work was supported by the Office of Naval Research.

  1.

    Tang YY, Holzel BK, Posner MI. The neuroscience of mindfulness meditation. Nat Rev Neurosci. 2015;16:213–25.

  2.

    Tang YY, Lu Q, Geng X, Stein EA, Yang Y, Posner MI. Short-term meditation induces white matter changes in the anterior cingulate. Proc Natl Acad Sci USA. 2010;107:15649–52.

  3.

    Tang YY, Tang R, Posner MI. Brief meditation training induces smoking reduction. Proc Natl Acad Sci USA. 2013;110:13971–75.

  4.

    Razi A, Kahan J, Rees G, Friston KJ. Construct validation of a DCM for resting state fMRI. Neuroimage. 2015;106:1–14.


P50 Modeling neuron oscillation of implicit sequence learning

Guangsheng Liang1, Seth A. Kiser2,3, James H. Howard, Jr.3, Yi-Yuan Tang1

1Department of Psychological Sciences, Texas Tech University, TX 79409, USA; 2The Department of Veteran Affairs, District of Columbia VA Medical Center, Washington, DC 20420, USA; 3Department of Psychology, The Catholic University of America, Washington, DC 20064, USA

Correspondence: Yi-Yuan Tang - yiyuan.tang@ttu.edu

BMC Neuroscience 2016, 17(Suppl 1):P50

Implicit learning (IL) occurs without goal-directed intent or conscious awareness, but it has important influences on everyday functioning and overall health, such as environmental adaptation and the development of habits and aversions. Most IL studies have used event-related potentials (ERPs) to study the brain response, taking the grand average of all event-related brain signals. How neural oscillations (EEG frequency bands) are involved in IL remains unknown. Moreover, ERP analysis requires brain signals that are not only time-locked but also phase-locked to the event; information carried by non-phase-locked signals is therefore lost and not represented in the averaged potentials. To address this issue, we applied time-frequency analysis and a cluster-based permutation test in this study.

Fifteen healthy participants were recruited to perform three sessions of the triplets learning task (TLT), an IL task commonly used in the field [1]. Three successive cues were presented; participants were asked to observe the first two cues and respond only to the third cue (the target) by pressing the corresponding key. EEG signals were recorded during the task. A cluster-based permutation test on the alpha and theta bands was used to control the family-wise error rate and, at the same time, to localize when and where differences between triplet types occurred.
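As a simplified stand-in for the time-frequency analysis described above (the study itself used a full time-frequency decomposition with cluster-based permutation statistics), band power in the alpha and theta ranges can be estimated from a single channel with a plain discrete Fourier transform. The signal and sampling rate here are synthetic assumptions:

```python
import math

fs = 250.0                        # assumed sampling rate (Hz)
t = [i / fs for i in range(500)]  # 2 s of data
# synthetic trace: strong 10 Hz (alpha) plus weaker 6 Hz (theta) component
x = [1.0 * math.sin(2 * math.pi * 10 * ti)
     + 0.3 * math.sin(2 * math.pi * 6 * ti) for ti in t]

def power_at(signal, f, fs):
    """Power of `signal` at frequency f from a single DFT coefficient."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * f * i / fs) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * f * i / fs) for i, s in enumerate(signal))
    return (re * re + im * im) / n**2

def band_power(signal, f_lo, f_hi, fs):
    """Average power over integer frequencies in [f_lo, f_hi] Hz."""
    freqs = range(int(f_lo), int(f_hi) + 1)
    return sum(power_at(signal, f, fs) for f in freqs) / len(freqs)

alpha = band_power(x, 8, 13, fs)  # alpha band, 8-13 Hz
theta = band_power(x, 4, 8, fs)   # theta band, 4-8 Hz
```

For this synthetic trace the alpha estimate dominates, reflecting the stronger 10 Hz component; in the actual study the analogous per-trial band-power estimates feed into the cluster-based permutation test across time points and electrodes.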

Based on the behavioral results, overall learning occurred in session 1, while triplet-specific learning took place in session 2. We found significant differences in both the alpha (8–13 Hz) and theta (4–8 Hz) bands. For the alpha band, power modulation differed significantly between the high- and low-frequency triplet groups in session 2 over frontal cortex. For the theta band, power differed significantly between sessions 1 and 3 over frontal cortex. In the high-frequency triplet group this difference started as early as target onset and lasted until the end of the trial, whereas in the low-frequency triplet group the power difference emerged later, from around 1000 ms, and lasted until the end of the next trial.

Conclusions The behavioral results showed that the brain learned the regularity of the sequence implicitly. Alpha power modulation indicated differences in attentional resource allocation between the two triplet types, while theta power modulation reflected differences in memory processing and retrieval. Our results indicate that although participants did not consciously detect the regularity of the triplet types by the end of the study, the brain nevertheless responded differently to the two triplet types.

Acknowledgements: This work was supported by the Office of Naval Research.

  1.

    Howard JH, Howard DV, Dennis N, Kelly AJ. Implicit learning of predictive relationships in three-element visual sequences by young and old adults. J Exp Psychol Learn Mem Cogn. 2008;34:1139–57.


P51 The role of cerebellar short-term synaptic plasticity in the pathology and medication of downbeat nystagmus

Julia Goncharenko1, Neil Davey1, Maria Schilstra1, Volker Steuber1

1Centre for Computer Science and Informatics Research, University of Hertfordshire, Hatfield, AL10 9EJ, UK

Correspondence: Julia Goncharenko - i.goncharenko@herts.ac.uk

BMC Neuroscience 2016, 17(Suppl 1):P51

Downbeat nystagmus (DBN) is a common eye fixation disorder that is linked to cerebellar pathology. DBN patients are treated with 4-aminopyridine (4-AP), a K+ channel blocker, but the underlying mechanism is unclear. DBN is associated with increased activity of floccular target neurons (FTNs) in the vestibular nuclei. It was previously believed that the increased activity of FTNs in DBN results from a pathological decrease in the spike rate of their inhibitory Purkinje cell inputs, and that the effect of 4-AP in treating DBN is mediated by increased Purkinje cell activity, which would restore the inhibition of FTNs and bring their activity back to normal [1]. This assumption, however, has been questioned by in vitro recordings of Purkinje cells from tottering (tg/tg) mice, a mouse model of DBN: therapeutic concentrations of 4-AP did not increase the spike rate of the Purkinje cells, but instead restored the regularity of their spiking, which is impaired in tg/tg mice [2].

Prompted by these experiments, Glasauer and colleagues performed computer simulations to investigate the effect of the regularity of Purkinje cell spiking on the activity of FTNs [3]. Using a conductance based FTN model, they found that changes in the regularity of the Purkinje cell input only affected the FTN spike rate when the input was synchronized. In this case, increasing the regularity of the Purkinje cell spiking resulted in larger gaps in the inhibitory input to the FTN and an increased FTN spike rate. These results predict that the increased irregularity in the Purkinje cell activity in DBN should lead to a decreased activity of the FTNs, rather than the increased activity that is found in experiments, and they are therefore unable to explain the therapeutic effect of 4-AP.

However, the model by Glasauer and colleagues does not take short-term depression (STD) at the Purkinje cell to FTN synapses into account. We hypothesized that this absence of STD could explain the apparent contradiction between the experimental [2] and computational [3] results. To study the role of STD in the pathology and 4-AP treatment of DBN, we used a morphologically realistic conductance-based model of a cerebellar nucleus (CN) neuron [4, 5] as an FTN model to simulate the effect of irregular versus regular Purkinje cell input. The coefficients of variation of the irregular and regular Purkinje cell spike trains, during DBN and after 4-AP treatment respectively, were taken from recordings from wild-type and tg/tg mice [6], which served as a model system for DBN. We presented the FTN model with synchronized and unsynchronized input and found that, for both conditions, irregular (DBN) input trains resulted in higher FTN spike rates than regular (4-AP) ones. In the presence of unsynchronized Purkinje cell input, the acceleration of the FTN spike output during simulated DBN and the deceleration during simulated 4-AP treatment depended on STD at the Purkinje cell synapses. Our results provide a potential explanation for the pathology and 4-AP treatment of pathological nystagmus.
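The intuition for why STD makes the postsynaptic neuron sensitive to input regularity can be sketched with a simple resource-depression model (in the spirit of, but much simpler than, the synapse model of [5]); all parameter values are illustrative:

```python
import math
import random

# Depressing synapse: each spike releases a fraction U of the available
# resource x; x recovers toward 1 with time constant tau_rec between spikes.
U, tau_rec = 0.3, 50.0   # release fraction, recovery time constant (ms)
rate_hz = 50.0           # presynaptic firing rate
T = 1000.0 / rate_hz     # mean interspike interval (ms)

def mean_efficacy(isis):
    """Average per-spike transmitted efficacy U*x over an ISI sequence."""
    x, total = 1.0, 0.0
    for isi in isis:
        total += U * x   # efficacy of this spike
        # resource drops to x*(1-U), then recovers over the interval
        x = 1.0 - (1.0 - x * (1.0 - U)) * math.exp(-isi / tau_rec)
    return total / len(isis)

random.seed(1)
n = 20000
regular = [T] * n                                            # CV = 0 (cf. 4-AP)
irregular = [random.expovariate(1.0 / T) for _ in range(n)]  # CV = 1 (cf. DBN)

e_reg = mean_efficacy(regular)
e_irr = mean_efficacy(irregular)
```

Because the recovery between spikes is a concave function of the interspike interval, irregular trains at the same mean rate transmit less average inhibition per spike than regular ones; a depressing inhibitory synapse therefore effectively disinhibits its target when the input becomes irregular, consistent with the direction of the effect described above.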

  1.

    Glasauer S, Kalla R, Buttner U, Strupp M, Brandt T. 4-aminopyridine restores visual ocular motor function in upbeat nystagmus. J Neurol Neurosurg Psychiatry. 2005;76:451–3.

  2.

    Alvina K, Khodakhah K. The therapeutic mode of action of 4-aminopyridine in cerebellar ataxia. J Neurosci. 2010;30:7258–68.

  3.

    Glasauer S, Rössert C, Strupp M. The role of regularity and synchrony of cerebellar Purkinje cells for pathological nystagmus. Ann NY Acad Sci. 2011;1233:162–7.

  4.

    Steuber V, Schultheiss NV, Silver RA, de Schutter E, Jaeger D. Determinants of synaptic integration and heterogeneity in rebound firing explored with data-driven models of deep cerebellar nucleus cells. J Comp Neurosci. 2011;30:633–58.

  5.

    Luthman J, Hoebeek FE, Maex R, Davey N, Adams R, de Zeeuw CI, Steuber V. STD-dependent and independent encoding of input irregularity as spike rate in a computational model of a cerebellar nucleus neuron. Cerebellum. 2011;10:667–82.

  6.

    Hoebeek FE, Stahl JS, van Alphen AM, Schonewille M, Luo C, Rutteman M, van den Maagdenberg AM, Molenaar PC, Goossens HH, Frens MA, et al. Increased noise level of Purkinje cell activities minimizes impact of their modulation during sensorimotor control. Neuron. 2005;45:953–65.


P52 Nonlinear response of noisy neurons

Sergej O. Voronenko1,2, Benjamin Lindner1,2

1Department of Physics, Humboldt University, Berlin, 10099, Germany; 2Bernstein Center for Computational Neuroscience, Berlin, 10115, Germany

Correspondence: Sergej O. Voronenko - sergej@physik.hu-berlin.de

BMC Neuroscience 2016, 17(Suppl 1):P52

In many neuronal systems that exhibit high trial-to-trial variability, the time-dependent firing rate is thought to be the main information channel for time-dependent signals. However, for nerve cells with low intrinsic noise and highly oscillatory activity, synchronization, mode locking, and frequency locking seem to be of major importance. Here, we present an extension of the linear response theory [1, 2] for the leaky integrate-and-fire neuron model to second order and demonstrate how the time-dependent firing rate can exhibit features reminiscent of mode locking and frequency locking. Although our theory allows us to predict the response to general weak time-dependent signals, the second-order effects are best demonstrated using cosine signals, as in Fig. 34A. We consider a leaky integrate-and-fire model in which the subthreshold voltage, Fig. 34B, is subject to the signal and to Gaussian white noise. Whenever the voltage hits the threshold, it is reset to zero and a spike time is recorded in the raster plot, Fig. 34C. The firing rate can be obtained numerically by averaging over the spike trains, or via a perturbation approach similar to the weakly nonlinear analysis in [3]. We find that the firing rate can exhibit pronounced nonlinear behavior, as can be seen from the excitation of a harmonic oscillation in Fig. 34D. Further effects that are not shown in Fig. 34 but are revealed by our analysis are a signal-dependent change of the mean firing rate and a pronounced nonlinear response to the sum of two cosine signals.
Fig. 34

Nonlinear modulation of the firing rate by a cosine signal. A Signal, B subthreshold voltage, C raster plot, D the time-dependent firing rate (red, noisy trace) is significantly different from the linear theory (dashed line) but is accurately described by the second-order response (solid line)
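The numerical rate estimate mentioned above (trial averaging of spike trains) can be reproduced in a few lines for a noisy leaky integrate-and-fire neuron driven by a weak cosine signal. Time is measured in units of the membrane time constant, and all parameter values are illustrative rather than those of the study:

```python
import math
import random

random.seed(2)
dt, t_stop = 0.001, 2.0   # time step and duration (membrane time constants)
mu, D = 0.8, 0.1          # mean input (subthreshold) and noise intensity
eps, f = 0.2, 2.0         # cosine signal amplitude and frequency
n_trials = 300
steps = int(t_stop / dt)
sigma = math.sqrt(2.0 * D * dt)  # per-step noise standard deviation

counts = [0] * steps      # spike counts per time bin, across trials
for _ in range(n_trials):
    v = 0.0
    for i in range(steps):
        s = eps * math.cos(2.0 * math.pi * f * i * dt)  # weak cosine signal
        # Euler-Maruyama step of dv = (-v + mu + s) dt + sqrt(2D) dW
        v += dt * (-v + mu + s) + sigma * random.gauss(0.0, 1.0)
        if v >= 1.0:      # threshold crossing: record spike, reset to zero
            counts[i] += 1
            v = 0.0

# instantaneous firing rate (spikes per unit time per trial) and its mean
rate = [c / (n_trials * dt) for c in counts]
mean_rate = sum(counts) / (n_trials * t_stop)
```

Comparing such a trial-averaged rate against the first- and second-order analytical predictions is what Fig. 34D shows; at larger signal amplitudes the harmonic at twice the driving frequency becomes visible in the estimate.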

Summary and conclusions Here we demonstrate that the time-dependent firing rate (equivalent to the instantaneous population rate for neurons driven by a common stimulus) can exhibit pronounced nonlinearities even for weak signal amplitudes. The linear theory not only gives quantitatively wrong predictions but also fails to capture the timing of the modulation peaks. Hence, our theory not only has implications for the sinusoidal stimulation commonly used to study the dynamic properties of nerve cells, but also demonstrates the relevance of the nonlinear response for the encoding of complex time-dependent signals.

Acknowledgements: This work was supported by the BMBF (FKZ: 01GQ1001A) and the DFG (research training group GRK1589/2).