I am a research scientist at Bioptimus, where I work on multimodal and multiscale foundation models for biological data. Previously, I was a PhD student in computer science at ETH Zürich and the Max Planck Institute for Intelligent Systems, advised by Gunnar Rätsch and Bernhard Schölkopf and supported by the CLS fellowship, and a student researcher on the applied science team at Google Research.

Before that, I completed my MSc in computer science at EPFL as a research scholar with Patrick Thiran and Matthias Grossglauser, spending the final year as an intern at RIKEN AIP in Tokyo with Emtiyaz Khan.

I am interested in deep learning and probabilistic methods, and in their applications to science.

Contact: mail AT aleximmer.com

Publications

Forecasting Whole-Brain Neuronal Activity from Volumetric Video
A Immer, JM Lueckmann, ABY Chen, PH Li, MD Petkova, NA Iyer, A Dev, G Ihrke, W Park, A Petruncio, A Weigel, W Korff, F Engert, JW Lichtman, MB Ahrens, V Jain, M Januszewski
arXiv, 2025
ZAPBench: A Benchmark for Whole-Brain Activity Prediction in Zebrafish
JM Lueckmann, A Immer, ABY Chen, PH Li, MD Petkova, NA Iyer, LW Hesselink, A Dev, G Ihrke, W Park, A Petruncio, A Weigel, W Korff, F Engert, JW Lichtman, M Januszewski, V Jain
ICLR, 2025
Influence Functions for Scalable Data Attribution in Diffusion Models
B Mlodozeniec, R Eschenhagen, J Bae, A Immer, D Krueger, R Turner
ICLR, 2025
Shaving weights with Occam's razor: Bayesian sparsification for neural networks using the marginal likelihood
R Dhahri, A Immer, B Charpentier, S Günnemann, V Fortuin
NeurIPS, 2024
Advances in Bayesian Model Selection and Uncertainty Estimation for Deep Learning
A Immer
ETH Zürich (Doctoral Thesis), 2024
Probabilistic pathway-based multimodal factor analysis
A Immer, SG Stark, F Jacob, X Bonilla, T Thomas, A Kahles, S Goetze, ES Milani, B Wollscheid, G Rätsch, KV Lehmann
ISMB, 2024
Towards Training Without Depth Limits: Batch Normalization Without Gradient Explosion
A Meterez, A Joudaki, F Orabona, A Immer, G Rätsch, H Daneshmand
ICLR, 2024
Hodge-Aware Contrastive Learning
A Möllers, A Immer, V Fortuin, E Isufi
ICASSP, 2024
Position: Bayesian Deep Learning is Needed in the Age of Large-Scale AI
T Papamarkou, M Skoularidou, K Palla, L Aitchison, J Arbel, D Dunson, M Filippone, V Fortuin, P Hennig, JM Hernández-Lobato, A Hubin, A Immer, T Karaletsos, ME Khan, A Kristiadi, Y Li, S Mandt, C Nemeth, MA Osborne, TGJ Rudner, D Rügamer, YW Teh, M Welling, AG Wilson, R Zhang
ICML, 2024
Effective Bayesian Heteroscedastic Regression with Deep Neural Networks
A Immer*, E Palumbo*, A Marx, JE Vogt
NeurIPS, 2023
Kronecker-Factored Approximate Curvature for Modern Neural Network Architectures
R Eschenhagen, A Immer, RE Turner, F Schneider, P Hennig
NeurIPS, 2023
Learning Layer-wise Equivariances Automatically using Gradients
T van der Ouderaa, A Immer, M van der Wilk
NeurIPS, 2023
Stochastic Marginal Likelihood Gradients using Neural Tangent Kernels
A Immer, T van der Ouderaa, M van der Wilk, G Rätsch, B Schölkopf
ICML, 2023
On the Identifiability and Estimation of Causal Location-Scale Noise Models
A Immer, C Schultheiss, JE Vogt, B Schölkopf, P Bühlmann, A Marx
ICML, 2023
Invariance Learning in Deep Neural Networks with Differentiable Laplace Approximations
A Immer*, T van der Ouderaa*, G Rätsch, V Fortuin, M van der Wilk
NeurIPS, 2022
Probing as Quantifying Inductive Bias
A Immer*, LT Hennigen*, V Fortuin, R Cotterell
ACL, 2022
Laplace Redux – Effortless Bayesian Deep Learning
E Daxberger*, A Kristiadi*, A Immer*, R Eschenhagen*, M Bauer, P Hennig
NeurIPS, 2021
Scalable Marginal Likelihood Estimation for Model Selection in Deep Learning
A Immer, M Bauer, V Fortuin, G Rätsch, ME Khan
ICML, 2021
Improving predictions of Bayesian neural networks via local linearization
A Immer*, M Korzepa, M Bauer*
AISTATS, 2021
Continual Deep Learning by Functional Regularisation of Memorable Past
P Pan, S Swaroop, A Immer, R Eschenhagen, R Turner, ME Khan
NeurIPS (oral), 2020
Sub-Matrix Factorization for Real-Time Vote Prediction
A Immer, V Kristof, M Grossglauser, P Thiran
KDD (oral), 2020
Disentangling the Gauss-Newton Method and Approximate Inference for Neural Networks
A Immer
EPFL (MSc Thesis), 2020
Approximate Inference Turns Deep Networks into Gaussian Processes
ME Khan, A Immer, E Abedi, M Korzepa
NeurIPS, 2019
Efficient Learning of Smooth Probability Functions from Bernoulli Tests with Guarantees
P Rolland, A Kavis, A Immer, A Singla, V Cevher
ICML, 2019
Variational Inference with Numerical Derivatives: Variance Reduction through Coupling
A Immer, GP Dehaene
Preprint, 2019