
- Carnegie
- radevs@rpi.edu
- https://orcid.org/0000-0002-6702-9559
Research
My research follows two streams. The first focuses on developing new statistical methods through generative AI. The second focuses on building and applying computational models of complex processes (with a special focus on cognition) to gain insights from data (and sometimes Big Data). These two streams converge at the BayesFlow
framework for Bayesian inference with generative AI (GitHub)(project page), of which I am the core developer and maintainer.
Modern Bayesian inference allows us to estimate, validate, and draw substantive conclusions from probabilistic models. Typical problems in Bayesian analysis include:
- Estimate the posterior distribution of hidden parameters from noisy data;
- Compare competing models in terms of their complexity and predictive performance;
- Emulate the observable behavior of a system or predict future behavior under uncertainty.
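The first of these tasks has a closed-form answer in the simplest conjugate case. The sketch below is purely illustrative (it is not part of BayesFlow): it updates a Beta prior with Binomial data and reads off the posterior mean.

```python
# Illustrative Beta-Binomial model: Beta(a, b) prior over an
# unknown success probability theta, k successes in n trials.
def beta_binomial_posterior(a, b, k, n):
    """Return the parameters of the conjugate Beta posterior."""
    return a + k, b + (n - k)

# Noisy data: 7 successes in 10 trials under a flat Beta(1, 1) prior.
post_a, post_b = beta_binomial_posterior(1, 1, 7, 10)
posterior_mean = post_a / (post_a + post_b)  # 8 / 12
```

For realistic models no such closed form exists, which is where the computational bottlenecks discussed next come in.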
However, despite their theoretical appeal and utility, Bayesian workflows face severe computational bottlenecks: analyzing even a single data set can take days of computation, making model validation and prediction practically infeasible.
BayesFlow
addresses these challenges by training neural networks on synthetic data to solve a sim2real problem. Researchers can then re-use and share these networks for rapid inference or other downstream tasks.
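The amortization idea can be sketched in plain NumPy (a toy stand-in, not the BayesFlow API): pay the simulation and training cost once, then reuse the fitted estimator for near-instant inference on any new data set from the same model.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1. Simulate training pairs (theta, x) from the prior and likelihood.
n_sims, n_obs = 5000, 20
theta = rng.normal(0.0, 1.0, size=n_sims)                    # prior draws
x = theta[:, None] + rng.normal(0.0, 0.5, (n_sims, n_obs))   # synthetic data

# 2. "Train" an amortized estimator: regress theta on a summary statistic.
#    (A neural network would replace this linear map in practice.)
summaries = x.mean(axis=1)
design = np.column_stack([np.ones(n_sims), summaries])
coefs, *_ = np.linalg.lstsq(design, theta, rcond=None)

# 3. Reuse: an instant parameter estimate for any new data set.
def estimate(new_x):
    return coefs[0] + coefs[1] * new_x.mean()

new_data = 0.8 + rng.normal(0.0, 0.5, n_obs)  # generated with theta = 0.8
theta_hat = estimate(new_data)
```

The simulation and fitting in steps 1 and 2 happen once; step 3 costs only a mean and a multiply, which is what makes downstream validation and prediction affordable.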
If you are a student interested in joining the BayesOps Lab, we are always excited to hear from motivated individuals who want to work on:
- Cutting-edge Bayesian modeling and probabilistic inference
- Novel neural network architectures and generative AI
- Open and reproducible research software for computational science
- Cognitive modeling and behavioral data science
The lab is currently supported by the National Science Foundation (NSF).
Teaching
I teach courses revolving around computational modeling, probabilistic inference, and statistics.
- Thursday 2:00 - 3:00 PM
COGS 6960 / PSYC 4960: Bayesian Data Analysis
Description: Bayesian statistics provides a principled framework for inference and uncertainty quantification. The course builds a strong foundation in the basic principles and methods of Bayesian statistics and probabilistic modeling, introduces students to the vast landscape of computational methods for Bayesian inference, and enables them to apply these methods to challenging real-world problems. It includes multiple hands-on sessions and applied problems using the popular R programming language.
Recognition
- Composing the score for stable and simulation-efficient amortized inference at scale. (2025). Invited talk at the STAMPS Workshop on Neural Simulation-Based Inference. Carnegie Mellon University, Pittsburgh, PA, USA.
- From model building to mechanistic insight with amortized Bayesian inference. (2025). Invited talk at the Society for Decision Making under Deep Uncertainty. Virtual event.
- Introduction to BayesFlow. (2025). Invited talk at the Methods Week. Karolinska Institute, Stockholm, Sweden.
- Choosing simulation over prediction and explanation in psychology. (2024). Invited talk at the Academy for Theoretical Psychology and Psychological Philosophy. Virtual event.
- Jointly amortized neural approximation of complex Bayesian models. (2023). Invited talk at the Bayesian Computation Reading Group. Flatiron Institute. New York, NY, USA.
- More expressive amortized Bayesian inference via joint learning and self-consistency? (2023). Invited talk at the One World Approximate Bayesian Computation (ABC) Seminar. Virtual event.
- Deep learning meets cognitive modeling: A new paradigm? (2023). Invited talk at the Cognitive Science Series. Heidelberg University, Germany.
- I simulate, therefore I understand? Simulacrum and explanation. Invited talk at the Conference on Interdisciplinary Research in Philosophy and Psychology (AG Philosophie und Psychologie). Cologne, Germany.
Publications
Recent Preprints
- Arruda, J., Pandey, V., Sherry, C., Barroso, M., Intes, X., Hasenauer, J., & Radev, S. T. (2025). Compositional amortized inference for large-scale hierarchical Bayesian models. arXiv preprint arXiv:2505.14429. (arXiv)
- Bürkner, P. C., Schmitt, M., & Radev, S. T. (2025). Simulations in statistical workflows. arXiv preprint arXiv:2503.24011. (arXiv)
- Elsemüller, L., Pratz, V., von Krause, M., Voss, A., Bürkner, P. C., & Radev, S. T. (2025). Does unsupervised domain adaptation improve the robustness of amortized Bayesian inference? A systematic evaluation. arXiv preprint arXiv:2502.04949. (arXiv)
- von Krause, M., & Radev, S. T. (2025). A big data analysis of the associations between cognitive parameters and socioeconomic outcomes. OSF preprint. (OSF)
Recent ML Conference Proceedings
- Elsemüller, L., Olischläger, H., Schmitt, M., Bürkner, P. C., Köthe, U., & Radev, S. T. (2024). Sensitivity-aware amortized Bayesian inference. ICLR. (arXiv)(OpenReview)
- Schmitt, M., Pratz, V., Köthe, U., Bürkner, P. C., & Radev, S. T. (2024). Consistency models for scalable and fast simulation-based inference. NeurIPS. (arXiv)(OpenReview)
- Schmitt, M., Li, C., Vehtari, A., Acerbi, L., Bürkner, P. C., & Radev, S. T. (2024). Amortized Bayesian Workflow. Bayesian Decision-making and Uncertainty Workshop, NeurIPS. (arXiv)
- Schmitt, M., Habermann, D., Bürkner, P. C., Köthe, U., & Radev, S. T. (2024). Leveraging self-consistency for data-efficient amortized Bayesian inference. Proceedings of the 41st International Conference on Machine Learning (ICML), 43723-43741. (arXiv)(OpenReview)
- Schmitt, M., Habermann, D., Bürkner, P. C., Köthe, U., & Radev, S. T. (2023). Leveraging self-consistency for data-efficient amortized Bayesian inference. UniReps Workshop, NeurIPS, New Orleans. (arXiv)
Recent Journal Papers
- Bockting, F., Radev, S. T., & Bürkner, P. C. (2025). Expert-elicitation method for non-parametric joint priors using normalizing flows. Statistics and Computing, 35(5), 132.
- Schumacher, L., Schnuerch, M., Voss, A., & Radev, S. T. (2025). Validation and comparison of non-stationary cognitive models: A diffusion model application. Computational Brain & Behavior, 8(2), 191-210.
- Prillinger, K., de Lara, G. A., Klöbl, M., Lanzenberger, R., Plener, P. L., Poustka, L., ... & Radev, S. T. (2024). Multisession tDCS combined with intrastimulation training improves emotion recognition in adolescents with autism spectrum disorder. Neurotherapeutics, e00460.
- Elsemüller, L., Schnuerch, M., Bürkner, P. C., & Radev, S. T. (2024). A deep learning method for comparing Bayesian hierarchical models. Psychological Methods.
- Bockting, F., Radev, S. T., & Bürkner, P. C. (2024). Simulation-based prior knowledge elicitation for parametric Bayesian models. Scientific Reports, 14(1), 17330.