Publications

Recent publications and preprints in reverse chronological order. For the latest updates, see my Google Scholar profile.

2025

  1. Accelerating neural network training: An analysis of the AlgoPerf competition
    Priya Kasimbeg*, Frank Schneider*, Runa Eschenhagen, and 11 more authors
    In International Conference on Learning Representations (ICLR), 2025
    We analyze the results of the inaugural AlgoPerf competition

2024

  1. Efficient Weight-Space Laplace-Gaussian Filtering and Smoothing for Sequential Deep Learning
    Joanna Sliwa, Frank Schneider, Nathanael Bosch, and 2 more authors
    2024
    We propose a Laplace-Gaussian filtering and smoothing framework for sequential deep learning
  2. How can we quantify, explain, and apply the uncertainty of complex soil maps predicted with neural networks?
    Kerstin Rau, Katharina Eggensperger, Frank Schneider, and 2 more authors
    Science of The Total Environment, 2024
    We use a last-layer Laplace approximation to quantify uncertainty in soil maps predicted with neural networks

2023

  1. Kronecker-Factored Approximate Curvature for Modern Neural Network Architectures
    Runa Eschenhagen, Alexander Immer, Richard Turner, and 2 more authors
    In Neural Information Processing Systems (NeurIPS), 2023
    We extend Kronecker-Factored Approximate Curvature to generic modern neural network architectures
  2. Accelerating Generalized Linear Models by Trading off Computation for Uncertainty
    Lukas Tatzel, Jonathan Wenger, Frank Schneider, and 1 more author
    2023
    We propose a new method to accelerate training of Generalized Linear Models by trading off computation for uncertainty
  3. Benchmarking Neural Network Training Algorithms
    George E. Dahl, Frank Schneider, Zachary Nado, and 22 more authors
    2023
    We motivate, present, and justify our new AlgoPerf Training Algorithms benchmark

2022

  1. Late-Phase Second-Order Training
    Lukas Tatzel, Philipp Hennig, and Frank Schneider
    In Has it Trained Yet? NeurIPS 2022 Workshop, 2022
    We show that performing a few costly but precise second-order steps can outperform first-order alternatives in wall-clock runtime
  2. Understanding Deep Learning Optimization via Benchmarking and Debugging
    Frank Schneider
    University of Tübingen, 2022
    Ph.D. Thesis

2021

  1. Cockpit: A Practical Debugging Tool for the Training of Deep Neural Networks
    Frank Schneider*, Felix Dangel*, and Philipp Hennig
    In Neural Information Processing Systems (NeurIPS), 2021
    We introduce a visual and statistical debugger specifically designed for deep learning, helping to understand the dynamics of neural network training
  2. Descending through a Crowded Valley - Benchmarking Deep Learning Optimizers
    Robin M. Schmidt*, Frank Schneider*, and Philipp Hennig
    In International Conference on Machine Learning (ICML), 2021
    We empirically compare fifteen popular deep learning optimizers

2019

  1. DeepOBS: A Deep Learning Optimizer Benchmark Suite
    Frank Schneider, Lukas Balles, and Philipp Hennig
    In International Conference on Learning Representations (ICLR), 2019
    We provide a software package that drastically simplifies, automates, and improves the evaluation of deep learning optimizers

2018

  1. Inverse generating function approach for the preconditioning of Toeplitz-block systems
    Frank Schneider and Maxim Pisarenco
    Numerical Linear Algebra with Applications, 2018
    We propose a new preconditioner for Toeplitz-block matrices based on the inverse generating function
  2. Methods and Apparatus for Calculating Electromagnetic Scattering Properties of a Structure and for Reconstruction of Approximate Structures
    Maxim Pisarenco, Frank Schneider, Maria Van, and 2 more authors
    2018
    US Patent App. 15/839,299
    We propose two new preconditioners for multi-level Toeplitz matrices

2016

  1. Approximations of Inverses of BTTB Matrices
    Frank Schneider
    Technische Universiteit Eindhoven (TU/e), 2016
    Master’s Thesis
    We suggest several techniques to approximate the inverse of a BTTB matrix with the goal of designing preconditioners for linear systems of this form