Working Paper
Demba Ba. Working Paper. “Deeply-sparse signal representations.” arXiv Version
Andrew H. Song, Leon Chlon, Hugo Soulat, John Tauber, Sandya Subramanian, Demba Ba, and Michael J. Prerau. Submitted. “Multitaper Infinite Hidden Markov Model for EEG.” In International Engineering in Medicine and Biology Conference (EMBC) 2019. Abstract

Electroencephalogram (EEG) monitoring of neural activity is widely used for identifying underlying brain states. For inference of brain states, researchers have often used Hidden Markov Models (HMMs) with a fixed number of hidden states and an observation model linking the temporal dynamics embedded in EEG to the hidden states. The use of fixed states may be limiting, in that 1) pre-defined states might not capture the heterogeneous neural dynamics across individuals and 2) the oscillatory dynamics of the neural activity are not directly modeled. To this end, we use a Hierarchical Dirichlet Process Hidden Markov Model (HDP-HMM), which discovers the set of hidden states that best describes the EEG data, without a-priori specification of the number of states. In addition, we introduce an observation model based on classical asymptotic results on the frequency-domain properties of stationary time series, along with a description of the conditional distributions for Gibbs sampler inference. We then combine this with multitaper spectral estimation to reduce the variance of the spectral estimates. By applying our method to simulated data inspired by sleep EEG, we arrive at two main results: 1) the algorithm faithfully recovers the spectral characteristics of the true states, as well as the correct number of states, and 2) the incorporation of the multitaper framework produces a more stable estimate than traditional periodogram spectral estimates.

PDF Version
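The variance reduction that the abstract above attributes to the multitaper framework can be illustrated with a minimal sketch: instead of a single periodogram, average eigenspectra computed with orthonormal DPSS (Slepian) tapers. This is a generic illustration of multitaper estimation using SciPy, not the paper's implementation; the signal, `nw`, and `k` values are arbitrary choices for the example.

```python
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, nw=4.0, k=7):
    """Average periodograms over k orthonormal DPSS tapers.

    Each taper yields an approximately independent spectral estimate,
    so averaging reduces the variance of the final estimate roughly
    by a factor of k relative to a single periodogram.
    """
    n = len(x)
    tapers = dpss(n, nw, k)                       # shape (k, n)
    eigenspectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    return eigenspectra.mean(axis=0)

rng = np.random.default_rng(0)
n = 1024
t = np.arange(n)
x = np.sin(2 * np.pi * 0.1 * t) + rng.standard_normal(n)  # tone at bin ~102
psd = multitaper_psd(x)
```

The trade-off is the usual one: averaging over k tapers with time-bandwidth product `nw` lowers variance at the cost of smearing spectral peaks over a bandwidth of roughly 2·nw frequency bins.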
Andrew H. Song, Francisco Flores, and Demba Ba. Submitted. “Spike sorting by convolutional dictionary learning.” Advances in Neural Information Processing Systems 31. arXiv Version
Bahareh Tolooshams, Sourav Dey, and Demba Ba. Submitted. “Deep Residual Auto-Encoders for Expectation Maximization-based Dictionary Learning.” IEEE Transactions on Neural Networks and Learning Systems. arXiv Version Abstract
Convolutional dictionary learning (CDL) has become a popular method for learning sparse representations from data. State-of-the-art algorithms perform dictionary learning (DL) through an optimization-based alternating-minimization procedure that alternates between a sparse coding step and a dictionary update step. Here, we draw connections between CDL and neural networks by proposing an architecture for CDL termed the constrained recurrent sparse auto-encoder (CRsAE). We leverage the interpretation of the alternating-minimization algorithm for DL as an Expectation-Maximization algorithm to develop auto-encoders (AEs) that, for the first time, enable the simultaneous training of the dictionary and regularization parameter. The forward pass of the encoder, which performs sparse coding, solves the E-step using an encoding matrix and a soft-thresholding non-linearity imposed by the FISTA algorithm. The encoder in this regard is a variant of residual and recurrent neural networks. The M-step is implemented via a two-stage back-propagation. In the first stage, we perform back-propagation through the AE formed by the encoder and a linear decoder whose parameters are tied to the encoder. This stage parallels the dictionary update step in DL. In the second stage, we update the regularization parameter by performing back-propagation through the encoder using a loss function that includes a prior on the parameter motivated by Bayesian statistics. We leverage GPUs to achieve significant computational gains relative to state-of-the-art optimization-based approaches to CDL. We apply CRsAE to spike sorting, the problem of identifying the time of occurrence of neural action potentials in recordings of electrical activity from the brain. We demonstrate on recordings lasting hours that CRsAE speeds up spike sorting by 900x compared to notoriously slow classical algorithms based on convex optimization.
PDF Version
Alexander Lin, Yingzhou Zhang, Jeremy Heng, Stephen A. Allsop, Kay M. Tye, Pierre E. Jacob, and Demba Ba. 2019. “Clustering Time Series with Nonlinear Dynamics: A Bayesian Non-Parametric and Particle-Based Approach.” In International Conference on Artificial Intelligence and Statistics (AISTATS) 2019. Abstract

We propose a general statistical framework for clustering multiple time series that exhibit nonlinear dynamics into an a-priori unknown number of sub-groups. Our motivation comes from neuroscience, where an important problem is to identify, within a large assembly of neurons, subsets that respond similarly to a stimulus or contingency. Upon modeling the multiple time series as the output of a Dirichlet process mixture of nonlinear state-space models, we derive a Metropolis-within-Gibbs algorithm for full Bayesian inference that alternates between sampling cluster assignments and sampling parameter values that form the basis of the clustering. The Metropolis step employs recent innovations in particle-based methods. We apply the framework to clustering time series acquired from the prefrontal cortex of mice in an experiment designed to characterize the neural underpinnings of fear.

PDF Version
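The "a-priori unknown number of sub-groups" in the abstract above comes from the Dirichlet process mixture prior. A minimal sketch of that prior, via its Chinese restaurant process representation, shows how cluster assignments can be sampled without fixing the number of clusters in advance; the concentration value `alpha=2.0` is an arbitrary choice for the example, and this is only the prior, not the paper's Metropolis-within-Gibbs inference.

```python
import numpy as np

def crp_assignments(n, alpha, rng):
    """Sample cluster labels from a Chinese restaurant process prior.

    Item i joins an existing cluster with probability proportional to
    that cluster's current size, or starts a new cluster with
    probability proportional to alpha.
    """
    assign = [0]                               # first item seeds cluster 0
    for i in range(1, n):
        counts = np.bincount(assign)           # sizes of existing clusters
        probs = np.append(counts, alpha) / (i + alpha)
        assign.append(int(rng.choice(len(probs), p=probs)))
    return np.array(assign)

rng = np.random.default_rng(1)
z = crp_assignments(100, alpha=2.0, rng=rng)   # labels for 100 time series
```

In the full model, each cluster drawn this way would index the parameters of a nonlinear state-space model, and a Gibbs sweep would resample these assignments conditioned on the time series.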
Noa Malem-Shinitski, Yingzhuo Zhang, Daniel T. Gray, Sarah N. Burke, Anne C. Smith, Carol A. Barnes, and Demba Ba. 9/1/2018. “A separable two-dimensional random field model of binary response data from multi-day behavioral experiments.” Journal of Neuroscience Methods, 307, Pp. 175-187. Publisher's Version
Bahareh Tolooshams, Sourav Dey, and Demba Ba. 9/2018. “Scalable convolutional dictionary learning with constrained recurrent sparse auto-encoders.” 2018 IEEE 28th International Workshop on Machine Learning for Signal Processing. Aalborg, Denmark: IEEE. arXiv Version
Yingzhuo Zhang, Noa Malem-Shinitski, Stephen A. Allsop, Kay Tye, and Demba Ba. 4/2018. “Estimating a separably-Markov random field (SMuRF) from binary observations.” Neural Computation, 30, 4, Pp. 1046-1079. Publisher's Version yingzhuo_zhang.pdf
Stephen A Allsop, Romy Wichmann, Fergil Mills, Anthony Burgos-Robles, Chia-Jung Chang, Ada C. Felix-Ortiz, Alienor Vienne, Anna Beyeler, Ehsan M. Izadmehr, Gordon Glober, Meghan I. Cum, Johanna Stergiadou, Kavitha K. Anandalingham, Kathryn Farris, Praneeth Namburi, Christopher A. Leppla, Javier C. Weddington, Edward H. Nieh, Anne C. Smith, Demba Ba, Emery N. Brown, and Kay M. Tye. 3/3/2018. “Corticoamygdala transfer of socially derived information gates observational learning.” Cell, 173, 6, Pp. 1329-1342. Publisher's Version
Gabriel Schamberg, Demba Ba, and Todd P Coleman. 2/15/2018. “A modularized efficient framework for non-Markov time-series estimation.” IEEE Transactions on Signal Processing, 66, 12. Publisher's Version
Seong-Eun Kim, Michael Behr, Demba Ba, and Emery N. Brown. 1/2/2018. “State-space multitaper time-frequency analysis.” Proceedings of the National Academy of Sciences, 115, 1. Publisher's Version
Noa Shinitski, Yingzhuo Zhang, Daniel T Gray, Sarah N Burke, Anne C Smith, Carol A Barnes, and Demba Ba. 2017. “Can you teach an old monkey a new trick?” Cosyne 2017. shinitski_cosyne_2017.pdf
Yingzhuo Zhang, Noa Shinitski, Stephen Allsop, Kay Tye, and Demba Ba. 2017. “A Two-Dimensional Separable Random Field Model of Within and Cross-Trial Neural Spiking Dynamics.” Cosyne 2017. zhang_cosyne_2017.pdf
Gabriel Schamberg, Demba Ba, Mark Wagner, and Todd Coleman. 2016. “Efficient low-rank spectrotemporal decomposition using ADMM.” In Statistical Signal Processing Workshop (SSP), 2016 IEEE, Pp. 1–5. IEEE.
Jonathan D Kenny, Jessica J Chemali, Joseph F Cotten, Christa J Van Dort, Seong-Eun Kim, Demba Ba, Norman E Taylor, Emery N Brown, and Ken Solt. 2016. “Physostigmine and Methylphenidate Induce Distinct Arousal States During Isoflurane General Anesthesia in Rats.” Anesthesia and analgesia.
Gabriela Czanner, Sridevi V Sarma, Demba Ba, Uri T Eden, Wei Wu, Emad Eskandar, Hubert H Lim, Simona Temereanca, Wendy A Suzuki, and Emery N Brown. 2015. “Measuring the signal-to-noise ratio of a neuron.” Proceedings of the National Academy of Sciences, 112, 23, Pp. 7141–7146.
Emery Neal Brown, Demba Ba, and Anne Caroline Smith. 2015. “System And Method For Real-Time Analysis And Guidance Of Learning”.
Demba Ba, Simona Temereanca, and Emery N Brown. 2014. “Algorithms for the analysis of ensemble neural spiking activity using simultaneous-event multivariate point-process models.” Frontiers in computational neuroscience, 8, Pp. 6.
Demba Ba, Behtash Babadi, Patrick L Purdon, and Emery N Brown. 2014. “Convergence and stability of iteratively re-weighted least squares algorithms.” IEEE Transactions on Signal Processing, 62, 1, Pp. 183–195.