Paul Bürkner
@paulbuerkner.com
Full Professor of Computational Statistics at TU Dortmund University

Scientist | Statistician | Bayesian | Author of brms | Member of the Stan and BayesFlow development teams

Website: https://paulbuerkner.com

Opinions are my own
the logging is done in rstan, so a fix, if needed, will have to be made there, I assume.
June 1, 2025 at 6:21 PM
there is not, unfortunately. I didn't have time to look into it any further.
May 23, 2025 at 6:24 PM
Reposted by Paul Bürkner
I defended my PhD last week ✨

Huge thanks to:
• My supervisors @paulbuerkner.com @stefanradev.bsky.social @avehtari.bsky.social 👥
• The committee @ststaab.bsky.social @mniepert.bsky.social 📝
• The institutions @ellis.eu @unistuttgart.bsky.social @aalto.fi 🏫
• My wonderful collaborators 🧡

#PhDone 🎓
March 27, 2025 at 6:31 PM
can you post a reprex on GitHub?
March 21, 2025 at 7:29 PM
Reposted by Paul Bürkner
What advice do folks have for organising projects that will be deployed to production? How do you organise your directories? What do you do if you're deploying multiple "things" (e.g. an app and an API) from the same project?
February 27, 2025 at 2:15 PM
Reposted by Paul Bürkner
Amortized inference for finite mixture models ✨

The amortized approximator from BayesFlow closely matches the results of expensive-but-trustworthy HMC with Stan.

Check out the preprint and code by @kucharssim.bsky.social and @paulbuerkner.com👇
Finite mixture models are useful when data comes from multiple latent processes.

BayesFlow allows:
• Approximating the joint posterior of model parameters and mixture indicators
• Inferences for independent and dependent mixtures
• Amortization for fast and accurate estimation

📄 Preprint
💻 Code
February 11, 2025 at 8:53 AM
Reposted by Paul Bürkner
If you know simulation-based calibration checking (SBC), you will enjoy our new paper "Posterior SBC: Simulation-Based Calibration Checking Conditional on Data" with Teemu Säilynoja, @marvinschmitt.com and @paulbuerkner.com
arxiv.org/abs/2502.03279 1/7
February 6, 2025 at 10:11 AM
Reposted by Paul Bürkner
A study with 5M+ data points explores the link between cognitive parameters and socioeconomic outcomes: The stability of processing speed was the strongest predictor.

BayesFlow facilitated efficient inference for complex decision-making models, scaling Bayesian workflows to big data.

🔗Paper
February 3, 2025 at 12:21 PM
cool idea! I will think about how to achieve something like this. can you open an issue on GitHub so I don't forget about it?
January 29, 2025 at 8:26 AM
Reposted by Paul Bürkner
Join us this Thursday for a talk on efficient mixture and multilevel models with neural networks by @paulbuerkner.com at the new @approxbayesseminar.bsky.social!
A reminder of our talk this Thursday (30th Jan) at 11am GMT. Paul Bürkner (TU Dortmund University) will talk about "Amortized Mixture and Multilevel Models". Sign up at listserv.csv.warwick... to receive the link.
January 28, 2025 at 5:06 AM
Reposted by Paul Bürkner
Paul Bürkner (@paulbuerkner.com) will talk about amortized Bayesian multilevel models in the next Approximate Bayes Seminar on January 30 ⭐️

Sign up to the seminar’s mailing list below to get the meeting link 👇
Paul Bürkner (TU Dortmund University) will give our next talk. It will be about "Amortized Mixture and Multilevel Models" and is scheduled for Thursday, the 30th of January, at 11am. To receive the link to join, sign up at listserv.csv.warwick...
January 14, 2025 at 12:43 PM
Reposted by Paul Bürkner
More than 60 German universities and research outfits are announcing that they will end their activities on Twitter.

Including my alma mater, the University of Münster.

HT @thereallorenzmeyer.bsky.social nachrichten.idw-online.de/2025/01/10/h...
Universities and research institutions leave platform X - Together for diversity, freedom, and science
nachrichten.idw-online.de
January 10, 2025 at 12:02 PM
Reposted by Paul Bürkner
what are your best tips to fit shifted lognormal models (in #brms / Stan)? I'm using:
- checking the long tails (a few long RTs make the tail estimation unwieldy)
- low initial values for ndt
- careful prior checks
- pathfinder estimation of initial values
still, with increasing data, the chains get stuck
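One way to combine these tactics in brms could look like the following untested sketch (the data frame dat with reaction times rt and predictor cond is hypothetical, the prior values are purely illustrative, and the ndt class and parameter name follow brms's shifted_lognormal parameterization):

library(brms)

fit <- brm(
  rt ~ cond,
  data   = dat,
  family = shifted_lognormal(),
  prior  = c(
    prior(normal(-1, 1), class = Intercept),  # meanlog, on the log scale
    prior(normal(0, 0.5), class = sigma),
    # informative prior keeping the shift well below min(rt);
    # brms's default bounds 0 < ndt < min(rt) remain in place
    prior(normal(0.1, 0.05), class = ndt)
  ),
  # start each chain with a small shift so early iterations do not
  # push ndt against min(rt)
  init   = function() list(ndt = 0.01),
  chains = 4, cores = 4
)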
January 10, 2025 at 10:43 AM
I think this should be documented in the brms_families vignette. perhaps you can double-check whether the information you are looking for is indeed there.
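For reference, that vignette can be opened directly from R:

# shows the parameterizations of the response distributions supported by brms
vignette("brms_families", package = "brms")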
January 9, 2025 at 5:54 PM
happy to work with you on that if we find the time :-)
January 3, 2025 at 1:00 PM
indeed, I saw it at StanCon, but I am not sure anymore how production-ready the method was.
January 2, 2025 at 5:26 PM
something like this, yes. but ensuring the positive definiteness of arbitrarily constrained correlation matrices is not trivial. so there may need to be some restrictions on which correlation patterns are allowed.
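to see why, consider this small R illustration (made-up numbers): each entry is a valid correlation on its own, yet the matrix as a whole is impossible.

# r12 = r13 = 0.9 forces variables 2 and 3 to be strongly positively
# related, contradicting r23 = -0.9
R <- matrix(c(
   1.0,  0.9,  0.9,
   0.9,  1.0, -0.9,
   0.9, -0.9,  1.0
), nrow = 3, byrow = TRUE)

eigen(R)$values  # 1.9, 1.9, -0.8: the negative eigenvalue means R is not
                 # positive definite, so no distribution can have it as
                 # its correlation matrix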
January 2, 2025 at 7:53 AM
I already thought about this. a complete SEM syntax in brms would support selective error correlations by generalizing set_rescor().
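for context, set_rescor() is currently all-or-nothing; a sketch of today's behavior (y1, y2, x, and dat are hypothetical):

library(brms)

# residual correlations in multivariate models are estimated either
# between all gaussian/student responses or not at all
fit <- brm(
  bf(y1 ~ x) + bf(y2 ~ x) + set_rescor(TRUE),
  data = dat
)

a generalized version would let users request only a subset of the pairwise residual correlations, which is exactly where the positive-definiteness question above comes in.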
December 23, 2024 at 4:47 PM
Reposted by Paul Bürkner
OK, here is a very rough draft of a tutorial for #Bayesian #SEM using #brms for #rstats. It needs work and polish, has a lot of questions in it, and I need to add a references section. But I think a lot of folk will find this useful, so.... jebyrnes.github.io/bayesian_sem... (use issues for comments!)
Full Luxury Bayesian Structural Equation Modeling with brms
jebyrnes.github.io
December 21, 2024 at 7:49 PM
Reposted by Paul Bürkner
Ternary plots to represent data in a simplex, yay or nay? #stats #statistics #neuroscience
December 17, 2024 at 11:09 PM
Reposted by Paul Bürkner
Writing is thinking.

It’s not a part of the process that can be skipped; it’s the entire point.
This is what's so baffling about so many suggestions for AI in the humanities classroom: they mistake the product for the point. Writing outlines and essays is important not because you need to make outlines and essays but because that's how you learn to think with/through complex ideas.
I'm sure many have said this before but I'm reading a student-facing document about how students might use AI in the classroom (if allowed) and one of the recs is: use AI to make an outline of your reading! But ISN'T MAKING THE OUTLINE how one actually learns?
December 12, 2024 at 3:08 PM
Reposted by Paul Bürkner
1️⃣ An agent-based model simulates a dynamic population of professional speed climbers.
2️⃣ BayesFlow handles amortized parameter estimation in the simulation-based inference (SBI) setting.

📣 Shoutout to @masonyoungblood.bsky.social & @sampassmore.bsky.social

📄 Preprint: osf.io/preprints/ps...
💻 Code: github.com/masonyoungbl...
December 10, 2024 at 1:34 AM
Reposted by Paul Bürkner
Neural superstatistics is a framework for probabilistic models with time-varying parameters:

⋅ Joint estimation of stationary and time-varying parameters
⋅ Amortized parameter inference and model comparison
⋅ Multi-horizon predictions and leave-future-out CV

📄 Paper 1
📄 Paper 2
💻 BayesFlow Code
December 6, 2024 at 12:21 PM