JGI Seed Corn Funding Project Blog 2021: Conor Houghton

Bayesian methods in Neuroscience – Conor Houghton

For the last century, science has relied on a statistical framework based on hypothesis testing and frequentist inference. Despite its convenience in simple contexts, this approach has proved intricate, obtuse and sometimes misleading when applied to more difficult problems, particularly problems with the sort of large, complex and untidy datasets that are vital for applications like climate modelling, finance, bioinformatics, epidemiology and neuroscience.

Bayesian inference addresses these problems: the Bayesian approach is easy to interpret and returns science to its traditional reliance on evidence and description rather than a false notion of significance and truth. With a rigorous handling of uncertainty, Bayesian inference can dramatically improve statistical efficiency, allowing us to squeeze more insight out of finite, hard-won data, which in turn reduces the use of animals and biological tissue and reduces costs for scientific projects.
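
As a small illustration of what this means in practice, here is a minimal sketch, using a hypothetical experiment of seven successes in ten trials and SciPy's conjugate beta-binomial update; the numbers and the flat prior are illustrative assumptions, not data from any of our projects. The point is that the analysis reports a full posterior distribution, with its uncertainty, rather than a single significance value.

```python
from scipy import stats

# Hypothetical example: 7 successes in 10 trials, and a flat Beta(1, 1) prior
# on the unknown success probability. The beta prior is conjugate to the
# binomial likelihood, so the posterior is another beta distribution.
successes, trials = 7, 10
posterior = stats.beta(1 + successes, 1 + trials - successes)

# The result is a full distribution over the unknown quantity, so uncertainty
# is reported directly rather than summarised by a p-value.
print(posterior.mean())            # posterior mean, 8 / 12, roughly 0.67
print(posterior.interval(0.95))    # central 95% credible interval
```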

With support from the Jean Golding Institute we ran a workshop on Bayesian Modelling. The workshop had several elements: a tutorial for people unfamiliar with the approach, short talks by people in the University who use these methods, a few talks by external speakers, and a data study group. In retrospect we tried to do too much, but the workshop was very helpful: the short talks brought together the local community around Bayesian Modelling, and the two external speakers, Hong Ge and Mike Peardon, were excellent, providing real and unexpected insight into the current state and potential future of Bayesian Modelling.

We next hope to host a workshop on Hybrid, or Hamiltonian, Monte Carlo (HMC). HMC has quickly become a very useful tool in data science, allowing us to perform Bayesian inference for a host of real-world problems that would not have been tractable a few years ago. Perhaps surprisingly, HMC has its origins in high-energy particle physics: it was invented to perform the high-dimensional integrals involved in Quantum Chromodynamics, the calculations required to predict the results of collider experiments at CERN.
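
For readers unfamiliar with the method, the sketch below shows roughly how a single HMC transition works, assuming a standard leapfrog integrator and a simple Gaussian target; the function names and parameters are illustrative rather than taken from any particular library.

```python
import numpy as np

def hmc_step(log_prob, grad_log_prob, q0, step_size=0.1, n_steps=20, rng=None):
    """One HMC transition targeting the density proportional to exp(log_prob(q))."""
    rng = rng or np.random.default_rng()
    q = np.array(q0, dtype=float)
    p = rng.standard_normal(q.shape)              # draw a fresh momentum
    p0 = p.copy()

    # Leapfrog integration of the Hamiltonian dynamics
    p = p + 0.5 * step_size * grad_log_prob(q)
    for _ in range(n_steps - 1):
        q = q + step_size * p
        p = p + step_size * grad_log_prob(q)
    q = q + step_size * p
    p = p + 0.5 * step_size * grad_log_prob(q)

    # Metropolis correction for the discretisation error of the integrator
    h0 = -log_prob(q0) + 0.5 * np.dot(p0, p0)
    h1 = -log_prob(q) + 0.5 * np.dot(p, p)
    return q if np.log(rng.uniform()) < h0 - h1 else np.array(q0, dtype=float)

# Example: drawing samples from a two-dimensional standard normal
log_prob = lambda q: -0.5 * np.dot(q, q)
grad_log_prob = lambda q: -q
q, samples = np.zeros(2), []
for _ in range(1000):
    q = hmc_step(log_prob, grad_log_prob, q)
    samples.append(q)
```

The leapfrog integrator is time-reversible and volume-preserving, which is what makes the simple accept/reject correction at the end valid and lets the method take long, informed steps through high-dimensional spaces.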

We believe that there is still a lot these two communities – particle physics and applied data science – can learn from each other in exploring and developing the power and scope of HMC.