Greatest Hits

Hierarchical Recurrent State Space Models of Neural Activity

With Annika Nichols, David Blei, Manuel Zimmer, and Liam Paninski. bioRxiv 2019

We develop hierarchical and recurrent state space models for whole-brain recordings of neural activity in C. elegans. We find brain states that correspond to discrete elements of worm behavior, as well as dynamics that are modulated by brain state and sensory input.
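
As a rough illustration of the model class (not the parameterization used in the paper), here is a minimal switching linear dynamical system in which a scalar sensory input modulates the discrete brain-state transitions. All dimensions and parameter values are made up for the sketch.

```python
# Illustrative sketch: a switching linear dynamical system whose discrete
# brain-state transitions are modulated by a sensory input. Dimensions,
# parameters, and the softmax parameterization are assumptions for this demo.
import numpy as np

rng = np.random.default_rng(0)
K, D, T = 3, 2, 200            # discrete states, latent dimension, time steps

A = [np.eye(D) * 0.95 + 0.05 * rng.standard_normal((D, D)) for _ in range(K)]
b = [0.1 * rng.standard_normal(D) for _ in range(K)]
R = 0.5 * rng.standard_normal((K, K))       # baseline transition log-potentials
w = rng.standard_normal(K)                  # sensitivity of each state to the input
u = np.sin(np.linspace(0, 6 * np.pi, T))    # scalar "sensory" input

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

z = np.zeros(T, dtype=int)                  # discrete brain states
x = np.zeros((T, D))                        # continuous latent trajectory
for t in range(1, T):
    p = softmax(R[z[t - 1]] + w * u[t])     # input modulates state transitions
    z[t] = rng.choice(K, p=p)
    x[t] = A[z[t]] @ x[t - 1] + b[z[t]] + 0.05 * rng.standard_normal(D)
```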

Tree-structured Recurrent SLDS

With Josue Nassar, Monica Bugallo, and Il Memming Park. ICLR 2019

We develop an extension of the rSLDS to capture hierarchical, multi-scale structure in dynamics via a tree-structured stick-breaking model. We recursively partition the latent space to obtain a piecewise linear approximation of nonlinear dynamics. A hierarchical prior smooths dynamics estimates, and inference is performed via an augmented Gibbs sampling algorithm.
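
A toy sketch of tree-structured stick breaking over a depth-two binary tree (four leaf states): each internal node splits the continuous latent space with a logistic hyperplane, and a leaf's probability is the product of the branch probabilities along its path. The hyperplane parameters below are arbitrary, not fitted values.

```python
# Tree-structured stick breaking over a depth-2 binary tree (4 leaves).
# Each internal node has a logistic hyperplane; a leaf's probability is the
# product of branch probabilities along its path. Parameters are arbitrary.
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# (weight, bias) for the root and its two children; x is a 2D latent state
nodes = {
    "root":  (np.array([1.0, 0.0]), 0.0),
    "left":  (np.array([0.0, 1.0]), 0.5),
    "right": (np.array([0.0, 1.0]), -0.5),
}

def leaf_probs(x):
    """Probability of each of the 4 leaves given the continuous latent state x."""
    p_root = sigmoid(nodes["root"][0] @ x + nodes["root"][1])
    p_left = sigmoid(nodes["left"][0] @ x + nodes["left"][1])
    p_right = sigmoid(nodes["right"][0] @ x + nodes["right"][1])
    return np.array([
        p_root * p_left,               # leaf LL
        p_root * (1 - p_left),         # leaf LR
        (1 - p_root) * p_right,        # leaf RL
        (1 - p_root) * (1 - p_right),  # leaf RR
    ])

x = np.array([0.3, -0.7])
print(leaf_probs(x), leaf_probs(x).sum())  # leaf probabilities sum to 1
```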

Point process latent variable models of larval zebrafish behavior

With Anuj Sharma, Robert Johnson, and Florian Engert. NeurIPS 2018

We develop deep state space models with point process observation models to capture structure in larval zebrafish behavior. The models combine discrete and continuous latent variables. We marginalize the discrete states with message passing and perform inference with bidirectional LSTM recognition networks.
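
A minimal sketch of the observation side of such a model: a latent linear dynamical state drives a time-varying event rate, and event counts per time bin are Poisson. The dimensions and parameters are illustrative, and the sketch omits the discrete states and the LSTM recognition network.

```python
# Latent state space model with point process (Poisson) observations.
# Parameters and dimensions are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
T, D, dt = 500, 2, 0.01

A = np.array([[0.99, -0.05], [0.05, 0.99]])   # latent dynamics
C = np.array([1.0, 0.5])                      # readout weights
b = 1.0                                       # baseline log-rate

x = np.zeros((T, D))
counts = np.zeros(T, dtype=int)
for t in range(1, T):
    x[t] = A @ x[t - 1] + 0.1 * rng.standard_normal(D)
    rate = np.exp(C @ x[t] + b)               # events per unit time
    counts[t] = rng.poisson(rate * dt)        # point process, discretized in bins
```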

Variational Sequential Monte Carlo

With Christian Naesseth, Rajesh Ranganath, and David Blei. AISTATS 2018

We view SMC as a variational family indexed by the parameters of its proposal distribution and show how this generalizes the importance weighted autoencoder. As the number of particles goes to infinity, the variational approximation approaches the true posterior.
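
A toy version of the resulting objective, assuming a one-dimensional linear Gaussian state space model and a bootstrap proposal (in the method it is the proposal parameters that get optimized; here they are fixed): the log of the particle filter's normalizing-constant estimate is, in expectation, a lower bound on the log marginal likelihood, and the bound tightens as the number of particles grows.

```python
# Bootstrap particle filter returning the log evidence estimate log Z-hat.
# E[log Z-hat] lower-bounds log p(y) and tightens with more particles.
# Model and parameters are illustrative assumptions for this sketch.
import numpy as np

rng = np.random.default_rng(2)

def smc_log_evidence(y, n_particles, a=0.9, q=1.0, r=0.5):
    """log Z-hat for x_t = a x_{t-1} + N(0, q), y_t = x_t + N(0, r)."""
    x = rng.normal(0.0, 1.0, n_particles)
    log_z = 0.0
    for t in range(len(y)):
        x = a * x + np.sqrt(q) * rng.standard_normal(n_particles)     # propose
        log_w = -0.5 * ((y[t] - x) ** 2 / r + np.log(2 * np.pi * r))  # weight
        log_z += np.logaddexp.reduce(log_w) - np.log(n_particles)
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        x = x[rng.choice(n_particles, n_particles, p=w)]              # resample
    return log_z

y = rng.standard_normal(50)
# More particles typically give a larger (tighter) bound in expectation.
print(smc_log_evidence(y, 10), smc_log_evidence(y, 1000))
```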

Rejection Sampling Variational Inference

With Christian Naesseth, Fran Ruiz, and David Blei. AISTATS 2017
Best Paper Award

We derive reparameterization gradients through rejection samplers, enabling automatic variational inference in models with gamma, beta, and Dirichlet latent variables.
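
The core trick, sketched for the gamma distribution: the Marsaglia-Tsang rejection sampler proposes z = h(eps, alpha) with eps ~ N(0, 1), and h is differentiable in alpha, so a pathwise gradient can flow through the accepted samples. The snippet below skips the accept/reject step and the correction term it introduces, which the full method accounts for; acceptance is high enough that the estimate is close anyway.

```python
# Pathwise gradient through the Marsaglia-Tsang gamma proposal
# z = h(eps, alpha) = (alpha - 1/3) (1 + eps / sqrt(9 alpha - 3))^3.
# The accept/reject correction is omitted in this sketch.
import numpy as np

rng = np.random.default_rng(3)

def h(eps, alpha):
    return (alpha - 1.0 / 3.0) * (1.0 + eps / np.sqrt(9.0 * alpha - 3.0)) ** 3

def dh_dalpha(eps, alpha):
    c = 1.0 + eps / np.sqrt(9.0 * alpha - 3.0)
    return c ** 3 - (alpha - 1.0 / 3.0) * 3.0 * c ** 2 * (
        4.5 * eps / (9.0 * alpha - 3.0) ** 1.5
    )

alpha, n = 4.0, 100_000
eps = rng.standard_normal(n)
z = h(eps, alpha)

# Pathwise estimate of d/dalpha E[z]; since E[z] = alpha for Gamma(alpha, 1),
# this should be close to 1.
print(dh_dalpha(eps, alpha).mean())
```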

Recurrent Switching Linear Dynamical Systems

With Matt Johnson, Andy Miller, Ryan Adams, David Blei, and Liam Paninski. AISTATS 2017

We develop Bayesian learning and inference algorithms for models with co-evolving discrete and continuous latent states, in which the discrete state transition probabilities depend on the continuous latent state.
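
A minimal generative sketch of the recurrence (parameters are illustrative, not fitted): the discrete state's transition probabilities depend on the previous continuous state through a softmax of a linear function, so the system switches dynamics in different regions of the latent space.

```python
# Generative sketch of a recurrent switching linear dynamical system.
# Parameters are illustrative assumptions for this demo.
import numpy as np

rng = np.random.default_rng(4)
K, D, T = 2, 2, 300

A = [np.array([[0.99, -0.1], [0.1, 0.99]]),   # state 0: slow rotation
     np.array([[0.8, 0.0], [0.0, 0.8]])]      # state 1: decay toward the origin
Rz = rng.standard_normal((K, K))              # baseline transition potentials
Rx = rng.standard_normal((K, D))              # dependence on the continuous state

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

z = np.zeros(T, dtype=int)
x = np.zeros((T, D))
x[0] = [1.0, 0.0]
for t in range(1, T):
    p = softmax(Rz[z[t - 1]] + Rx @ x[t - 1])  # recurrence: transitions depend on x
    z[t] = rng.choice(K, p=p)
    x[t] = A[z[t]] @ x[t - 1] + 0.02 * rng.standard_normal(D)
```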

Dependent Multinomial Models Made Easy

With Matt Johnson and Ryan Adams. NIPS 2015

We use a stick-breaking construction and Pólya-gamma augmentation to derive block Gibbs samplers for linear Gaussian models with multinomial observations.
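
A sketch of the stick-breaking map at the heart of the construction: K-1 real-valued variables psi, which can follow a linear Gaussian model, define a point in the K-simplex, and each coordinate contributes a logistic-binomial likelihood in psi_k, which is what Pólya-gamma augmentation makes conditionally conjugate. The values below are arbitrary.

```python
# Stick-breaking map from psi in R^(K-1) to multinomial probabilities.
# psi could be the output of a linear Gaussian model; values here are arbitrary.
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def stick_breaking(psi):
    """Map psi in R^(K-1) to a probability vector pi in the K-simplex."""
    K = len(psi) + 1
    pi = np.zeros(K)
    remaining = 1.0
    for k in range(K - 1):
        pi[k] = sigmoid(psi[k]) * remaining
        remaining *= 1.0 - sigmoid(psi[k])
    pi[K - 1] = remaining
    return pi

psi = np.array([0.5, -1.0, 0.3])      # e.g., a draw from a linear Gaussian model
pi = stick_breaking(psi)
print(pi, pi.sum())                   # valid multinomial probabilities, sum to 1

rng = np.random.default_rng(5)
counts = rng.multinomial(20, pi)      # multinomial observation with 20 trials
```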

Studying Synaptic Plasticity with Time-Varying GLMs

With Chris Stock and Ryan Adams. NIPS 2014

We propose a time-varying generalized linear model whose weights evolve according to synaptic plasticity rules, and we perform Bayesian inference with particle MCMC.
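
An illustrative forward simulation (not the paper's plasticity rules or its particle MCMC inference): a postsynaptic neuron's spike probability depends on a presynaptic spike train through a synaptic weight that drifts over time according to a toy Hebbian-like update.

```python
# Time-varying GLM sketch: a Bernoulli GLM whose synaptic weight evolves over
# time via a toy Hebbian-like rule plus noise. All parameters are assumptions.
import numpy as np

rng = np.random.default_rng(6)
T = 1000
b = -3.0                                  # baseline log-odds of spiking
w = np.zeros(T)
w[0] = 1.0                                # time-varying synaptic weight
pre = rng.random(T) < 0.1                 # presynaptic spikes in 10% of bins
post = np.zeros(T, dtype=bool)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

for t in range(1, T):
    p_spike = sigmoid(b + w[t - 1] * pre[t])
    post[t] = rng.random() < p_spike
    # Toy plasticity: potentiate on coincident pre/post spikes, with slow decay.
    dw = 0.05 * (pre[t] & post[t]) - 0.001 * w[t - 1]
    w[t] = w[t - 1] + dw + 0.01 * rng.standard_normal()
```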