Learning Bayesian Statistics

Alexandre Andorra

A podcast on Bayesian inference - the methods, the projects and the people who make it possible!

  • 1 hour 24 minutes
    #148 Adaptive Trials, Bayesian Thinking, and Learning from Data, with Scott Berry

    • Support & get perks!

    • Proudly sponsored by PyMC Labs. Get in touch and tell them you come from LBS!

    Intro to Bayes and Advanced Regression courses (first 2 lessons free)

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Chapters:

    13:16 Understanding Adaptive and Platform Trials

    25:25 Real-World Applications and Innovations in Trials

    34:11 Challenges in Implementing Bayesian Adaptive Trials

    42:09 The Birth of a Simulation Tool

    44:10 The Importance of Simulated Data

    48:36 Lessons from High-Stakes Trials

    52:53 Navigating Adaptive Trial Designs

    56:55 Communicating Complexity to Stakeholders

    01:02:29 The Future of Clinical Trials

    01:10:24 Skills for the Next Generation of Statisticians

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Giuliano Cruz, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Aubrey Clayton, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Joshua Meehl, Javier Sabio, Kristian Higgins, Matt Rosinski, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke, Robert Flannery, Rasmus Hindström, Stefan, Corey Abshire, Mike Loncaric, Ronald Legere, Sergio Dolia, Michael Cao, Yiğit Aşık, Suyog Chandramouli, Guillaume Berthon, Avenicio Baca, Spencer Boucher, Krzysztof Lechowski, Danimal, Jácint Juhász, Sander and Philippe.

    Links from the show:

    30 December 2025, 10:20 am
  • 21 minutes 59 seconds
    BITESIZE | Making Variational Inference Reliable: From ADVI to DADVI

    Today’s clip is from episode 147 of the podcast, with Martin Ingram.

    Alex and Martin discuss the intricacies of variational inference, particularly focusing on the ADVI method and its challenges. They explore the evolution of approximate inference methods, the significance of mean field variational inference, and the innovative linear response technique for covariance estimation.

    The discussion also delves into the trade-offs between stochastic and deterministic optimization techniques, providing insights into their implications for Bayesian statistics.

    Get the full discussion here.


    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Transcript

    This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.

    17 December 2025, 11:00 am
  • 1 hour 9 minutes
    #147 Fast Approximate Inference without Convergence Worries, with Martin Ingram

    Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!


    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    • DADVI is a new approach to variational inference that aims to improve speed and accuracy.
    • DADVI allows for faster Bayesian inference without sacrificing model flexibility.
    • Linear response can help recover covariance estimates from mean estimates.
    • DADVI performs well in mixed models and hierarchical structures.
    • Normalizing flows present an interesting avenue for enhancing variational inference.
    • DADVI can handle large datasets effectively, improving predictive performance.
    • Future enhancements for DADVI may include GPU support and linear response integration.
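    The core DADVI idea from the takeaways above can be sketched in a few lines: fix a set of base draws up front so the mean-field ELBO becomes a deterministic function of the variational parameters, which an off-the-shelf optimizer can then handle. This is a minimal toy sketch (a 1-D Gaussian target, not the actual DADVI package); the target, draw count, and tolerances are illustrative assumptions.

    ```python
    # Minimal sketch of deterministic ADVI (the DADVI idea): fixing the base
    # draws z_1..z_M turns the reparameterized ELBO into a deterministic
    # objective, so a standard optimizer (scipy's BFGS) works -- no
    # stochastic-gradient learning-rate tuning. Toy 1-D target: N(2, 0.5^2).
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    z = rng.standard_normal(30)          # fixed base draws (the "D" in DADVI)

    def log_target(theta):
        # hypothetical unnormalized log posterior, chosen for illustration
        return norm.logpdf(theta, loc=2.0, scale=0.5)

    def neg_elbo(params):
        mu, log_sigma = params
        theta = mu + np.exp(log_sigma) * z   # reparameterized draws
        # sample-average ELBO: E_q[log p(theta)] + Gaussian entropy of q
        entropy = 0.5 * np.log(2 * np.pi * np.e) + log_sigma
        return -(log_target(theta).mean() + entropy)

    res = minimize(neg_elbo, x0=np.array([0.0, 0.0]))
    mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
    ```

    Because the target here is itself Gaussian, the mean-field fit should land near the true mean 2.0 and scale 0.5, up to Monte Carlo error from the 30 fixed draws.
    
    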

    Chapters:

    13:17 Understanding DADVI: A New Approach

    21:54 Mean Field Variational Inference Explained

    26:38 Linear Response and Covariance Estimation

    31:21 Deterministic vs Stochastic Optimization in DADVI

    35:00 Understanding DADVI and Its Optimization Landscape

    37:59 Theoretical Insights and Practical Applications of DADVI

    42:12 Comparative Performance of DADVI in Real Applications

    45:03 Challenges and Effectiveness of DADVI in Various Models

    48:51 Exploring Future Directions for Variational Inference

    53:04 Final Thoughts and Advice for Practitioners

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Aubrey Clayton, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël...

    12 December 2025, 11:00 am
  • 19 minutes 12 seconds
    BITESIZE | Why Bayesian Stats Matter When the Physics Gets Extreme

    Today’s clip is from episode 146 of the podcast, with Ethan Smith.

    Alex and Ethan discuss the application of Bayesian inference in high energy density physics, particularly in analyzing complex data sets. They highlight the advantages of Bayesian techniques, such as incorporating prior knowledge and managing uncertainties.

    They also share insights from an ongoing experimental project focused on measuring the equation of state of plasma at extreme pressures. Finally, Alex and Ethan advocate for best practices in managing large codebases and ensuring model reliability.

    Get the full discussion here.


    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Transcript

    This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.

    5 December 2025, 6:00 am
  • 1 hour 35 minutes
    #146 Lasers, Planets, and Bayesian Inference, with Ethan Smith

    Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!


    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    • Ethan's research involves using lasers to compress matter to extreme conditions to study astrophysical phenomena.
    • Bayesian inference is a key tool in analyzing complex data from high energy density experiments.
    • The future of high energy density physics lies in developing new diagnostic technologies and increasing experimental scale.
    • High energy density physics can provide insights into planetary science and astrophysics.
    • Emerging technologies in diagnostics are set to revolutionize the field.
    • Ethan's dream project involves exploring pycnonuclear fusion.

    Chapters:

    14:31 Understanding High Energy Density Physics and Plasma Spectroscopy

    21:24 Challenges in Data Analysis and Experimentation

    36:11 The Role of Bayesian Inference in High Energy Density Physics

    47:17 Transitioning to Advanced Sampling Techniques

    51:35 Best Practices in Model Development

    55:30 Evaluating Model Performance

    01:02:10 The Role of High Energy Density Physics

    01:11:15 Innovations in Diagnostic Technologies

    01:22:51 Future Directions in Experimental Physics

    01:26:08 Advice for Aspiring Scientists

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Aubrey Clayton, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady,

    27 November 2025, 11:00 am
  • 19 minutes 34 seconds
    BITESIZE | How to Thrive in an AI-Driven Workplace?

    Today’s clip is from episode 145 of the podcast, with Jordan Thibodeau.

    Alexandre Andorra and Jordan Thibodeau discuss the transformative impact of AI on productivity, career opportunities in the tech industry, and the intricacies of the job interview process.

    They emphasize the importance of expertise, networking, and the evolving landscape of tech companies, while also providing actionable advice for individuals looking to enhance their careers in AI and related fields.

    Get the full discussion here.


    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Transcript

    This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.

    20 November 2025, 11:00 am
  • 1 hour 52 minutes
    #145 Career Advice in the Age of AI, with Jordan Thibodeau

    Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!


    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Thank you to my Patrons for making this episode possible!

    Yusuke Saito, Avi Bryant, Giuliano Cruz, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Aubrey Clayton, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Joshua Meehl, Javier Sabio, Kristian Higgins, Matt Rosinski, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke, Robert Flannery, Rasmus Hindström, Stefan, Corey Abshire, Mike Loncaric, David McCormick, Ronald Legere, Sergio Dolia, Michael Cao, Yiğit Aşık, Suyog Chandramouli and Guillaume Berthon.

    Takeaways:

    • AI is reshaping the workplace, but we're still in early stages.
    • Networking is crucial for job applications in top firms.
    • AI tools can augment work but are not replacements for skilled labor.
    • Understanding the tech landscape requires continuous learning.
    • Timing and cultural readiness are key for tech innovations.
    • Expertise can be gained without formal education.
    • Bayesian statistics is a valuable skill for tech professionals.
    • Personal branding matters in the job market: you just need to know 1% more than the person you're talking to.
    • Sharing knowledge can elevate your status within a company.
    • Embracing chaos in tech can create new opportunities.
    • Investing in people leads...
    12 November 2025, 11:00 am
  • 19 minutes
    BITESIZE | Why is Bayesian Deep Learning so Powerful?

    Today’s clip is from episode 144 of the podcast, with Maurizio Filippone.

    In this conversation, Alex and Maurizio delve into the intricacies of Gaussian processes and their deep learning counterparts. They explain the foundational concepts of Gaussian processes, the transition to deep Gaussian processes, and the advantages they offer in modeling complex data.

    The discussion also touches on practical applications, model selection, and the evolving landscape of machine learning, particularly in relation to transfer learning and the integration of deep learning techniques with Gaussian processes.

    Get the full discussion here.


    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Transcript

    This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.

    5 November 2025, 11:00 am
  • 1 hour 28 minutes
    #144 Why is Bayesian Deep Learning so Powerful, with Maurizio Filippone

    Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    • Why GPs still matter: Gaussian Processes remain a go-to for function estimation, active learning, and experimental design – especially when calibrated uncertainty is non-negotiable.
    • Scaling GP inference: Variational methods with inducing points (as in GPflow) make GPs practical on larger datasets without throwing away principled Bayes.
    • MCMC in practice: Clever parameterizations and gradient-based samplers tighten mixing and efficiency; use MCMC when you need gold-standard posteriors.
    • Bayesian deep learning, pragmatically: Stochastic-gradient training and approximate posteriors bring Bayesian ideas to neural networks at scale.
    • Uncertainty that ships: Monte Carlo dropout and related tricks provide fast, usable uncertainty – even if they’re approximations.
    • Model complexity ≠ model quality: Understanding capacity, priors, and inductive bias is key to getting trustworthy predictions.
    • Deep Gaussian Processes: Layered GPs offer flexibility for complex functions, with clear trade-offs in interpretability and compute.
    • Generative models through a Bayesian lens: GANs and friends benefit from explicit priors and uncertainty – useful for safety and downstream decisions.
    • Tooling that matters: Frameworks like GPflow lower the friction from idea to implementation, encouraging reproducible, well-tested modeling.
    • Where we’re headed: The future of ML is uncertainty-aware by default – integrating UQ tightly into optimization, design, and deployment.
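    The "uncertainty that ships" point above (Monte Carlo dropout) is simple enough to sketch in plain NumPy: keep dropout active at prediction time and treat the spread across stochastic forward passes as an approximate predictive uncertainty. This is a toy illustration with random, untrained weights, not a faithful Bayesian treatment.

    ```python
    # Minimal sketch of Monte Carlo dropout: run many stochastic forward
    # passes with dropout left ON, then read off the mean and spread of the
    # outputs as an approximate predictive distribution. Weights here are
    # random placeholders, not a trained network.
    import numpy as np

    rng = np.random.default_rng(1)
    W1, b1 = rng.normal(size=(1, 32)), np.zeros(32)
    W2, b2 = rng.normal(size=(32, 1)), np.zeros(1)

    def forward(x, p_drop=0.5):
        h = np.maximum(x @ W1 + b1, 0.0)        # ReLU hidden layer
        mask = rng.random(h.shape) > p_drop     # dropout stays ON at test time
        h = h * mask / (1.0 - p_drop)           # inverted-dropout rescaling
        return h @ W2 + b2

    x = np.array([[0.3]])
    draws = np.array([forward(x) for _ in range(200)])  # 200 stochastic passes
    pred_mean, pred_std = draws.mean(), draws.std()     # point estimate + spread
    ```

    As the episode notes, this is an approximation, but it delivers usable uncertainty at essentially the cost of repeated forward passes.
    
    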

    Chapters:

    08:44 Function Estimation and Bayesian Deep Learning

    10:41 Understanding Deep Gaussian Processes

    25:17 Choosing Between Deep GPs and Neural Networks

    32:01 Interpretability and Practical Tools for GPs

    43:52 Variational Methods in Gaussian Processes

    54:44 Deep Neural Networks and Bayesian Inference

    01:06:13 The Future of Bayesian Deep Learning

    01:12:28 Advice for Aspiring Researchers

    30 October 2025, 11:00 am
  • 23 minutes 14 seconds
    BITESIZE | Are Bayesian Models the Missing Ingredient in Nutrition Research?

    Today’s clip is from episode 143 of the podcast, with Christoph Bamberg.

    Christoph shares his journey into Bayesian statistics and computational modeling, the challenges faced in academia, and the technical tools used in research.

    Alex and Christoph delve into a specific study on appetite regulation and cognitive performance, exploring the implications of framing in psychological research and the importance of careful communication in health-related contexts.

    Get the full discussion here.


    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Transcript

    This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.

    23 October 2025, 11:00 am
  • 1 hour 12 minutes
    #143 Transforming Nutrition Science with Bayesian Methods, with Christoph Bamberg

    Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!


    Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

    Visit our Patreon page to unlock exclusive Bayesian swag ;)

    Takeaways:

    • Bayesian mindset in psychology: Why priors, model checking, and full uncertainty reporting make findings more honest and useful.
    • Intermittent fasting & cognition: A Bayesian meta-analysis suggests effects are context- and age-dependent – and often small but meaningful.
    • Framing matters: The way we frame dietary advice (focus, flexibility, timing) can shape adherence and perceived cognitive benefits.
    • From cravings to choices: Appetite, craving, stress, and mood interact to influence eating and cognitive performance throughout the day.
    • Define before you measure: Clear definitions (and DAGs to encode assumptions) reduce ambiguity and guide better study design.
    • DAGs for causal thinking: Directed acyclic graphs help separate hypotheses from data pipelines and make causal claims auditable.
    • Small effects, big implications: Well-estimated “small” effects can scale to public-health relevance when decisions repeat daily.
    • Teaching by modeling: Helping students write models (not just run them) builds statistical thinking and scientific literacy.
    • Bridging lab and life: Balancing careful experiments with real-world measurement is key to actionable health-psychology insights.
    • Trust through transparency: Openly communicating assumptions, uncertainty, and limitations strengthens scientific credibility.
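    The "DAGs for causal thinking" takeaway above can be made concrete with a tiny example: write the assumed causal structure down as a directed acyclic graph before touching the data, then check acyclicity and read off what each model should condition on. The variables here are hypothetical stand-ins for a fasting study, not Christoph's actual model.

    ```python
    # Minimal sketch of encoding causal assumptions as a DAG with networkx:
    # hypothetical variables for a fasting study. The graph is the auditable
    # statement of assumptions; models then condition on each node's parents.
    import networkx as nx

    dag = nx.DiGraph([
        ("fasting", "craving"),
        ("fasting", "cognition"),
        ("craving", "cognition"),
        ("stress", "craving"),
        ("stress", "cognition"),
    ])
    assert nx.is_directed_acyclic_graph(dag)            # sanity-check the assumptions
    parents_of_cognition = sorted(dag.predecessors("cognition"))
    ```

    Writing the graph first, as the episode argues, separates the causal hypothesis from the data pipeline and makes the claim auditable.
    
    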

    Chapters:

    10:35 The Struggles of Bayesian Statistics in Psychology

    22:30 Exploring Appetite and Cognitive Performance

    29:45 Research Methodology and Causal Inference

    36:36 Understanding Cravings and Definitions

    39:02 Intermittent Fasting and Cognitive Performance

    42:57 Practical Recommendations for Intermittent Fasting

    49:40 Balancing Experimental Psychology and Statistical Modeling

    55:00 Pressing Questions in Health Psychology

    01:04:50 Future Directions in Research

    Thank you to my Patrons for...

    15 October 2025, 11:00 am