AlgoPerf Workshop

2025 | Meta HQ, Menlo Park

Information

  • Location: MPK 11, Hacker Way, Menlo Park, USA
  • Date: February 2025
  • Talk Title: The Results of the Inaugural AlgoPerf Competition
  • Organizers: M. Shi, P. Kasimbeg, F. Schneider, P. Hennig, G. Lakshminarayanan, N. King, G. Dahl, D. Kanter, A. Mikes
  • Website: https://algoperf-workshop.github.io/
  • Note: Co-organized the AlgoPerf Workshop at Meta HQ in Menlo Park.

My Talk: The Results of the Inaugural AlgoPerf Competition

The goal of the AlgoPerf: Training Algorithms competition is to evaluate practical speed-ups in neural network training achieved solely by improving the underlying training algorithms. In the external tuning ruleset, submissions must provide workload-agnostic hyperparameter search spaces, while in the self-tuning ruleset they must be completely hyperparameter-free. In both rulesets, submissions are compared on time-to-result across multiple deep learning workloads, training on fixed hardware. This talk presents the inaugural AlgoPerf competition's results, which drew 18 diverse submissions from 10 teams. Our investigation reveals several key findings: (1) The winning submission in the external tuning ruleset, using Distributed Shampoo, demonstrates the effectiveness of non-diagonal preconditioning over popular methods like Adam, even when compared on wall-clock runtime. (2) The winning submission in the self-tuning ruleset, based on the Schedule-Free AdamW algorithm, demonstrates a new level of effectiveness for completely hyperparameter-free training algorithms. (3) The top-scoring submissions were surprisingly robust to workload changes. We also discuss the engineering challenges encountered in ensuring a fair comparison between different training algorithms. These results highlight both the significant progress so far and the considerable room for further improvements.
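For intuition on finding (1), here is a minimal NumPy sketch of the kind of non-diagonal preconditioning Shampoo performs. Where Adam rescales each coordinate with a diagonal second-moment estimate, Shampoo keeps one full statistic per tensor dimension. This is only an illustration of the basic update; it omits momentum, grafting, blocking, and all the distributed machinery of the actual winning submission, and the function names and defaults are my own.

```python
import numpy as np

def matrix_power(mat, power, eps=1e-6):
    """Compute mat^power for a symmetric PSD matrix via eigendecomposition."""
    # Regularize so the eigendecomposition and fractional power are well defined.
    w, v = np.linalg.eigh(mat + eps * np.eye(mat.shape[0]))
    return (v * np.maximum(w, eps) ** power) @ v.T

def shampoo_step(param, grad, L, R, lr=1e-3):
    """One simplified Shampoo update for a 2-D parameter of shape (m, n)."""
    # Accumulate full (non-diagonal) statistics for each tensor dimension.
    L = L + grad @ grad.T   # left statistic, shape (m, m)
    R = R + grad.T @ grad   # right statistic, shape (n, n)
    # Precondition the gradient with inverse fourth roots of the statistics.
    precond_grad = matrix_power(L, -0.25) @ grad @ matrix_power(R, -0.25)
    return param - lr * precond_grad, L, R
```

For finding (2), a similarly simplified sketch of the Schedule-Free idea: the gradient is evaluated at an interpolation between a fast iterate and a running average, and the online average replaces a hand-tuned learning-rate decay schedule. Again, this is a sketch under stated assumptions (no warmup or bias correction, illustrative defaults, a hypothetical `grad_fn` callback), not the submission's actual code.

```python
def schedule_free_adamw_step(x, z, v, grad_fn, t, lr=2.5e-3,
                             beta1=0.9, beta2=0.999, eps=1e-8, wd=0.0):
    """One simplified Schedule-Free AdamW step.

    x : averaged iterate (used for evaluation)
    z : fast iterate updated by the base optimizer
    v : Adam-style second-moment accumulator
    t : 1-based step counter
    """
    # The gradient is taken at an interpolation of z and x,
    # which plays the role of classical momentum.
    y = (1.0 - beta1) * z + beta1 * x
    g = grad_fn(y)
    # AdamW-style update on the fast iterate z (decay applied at y).
    v = beta2 * v + (1.0 - beta2) * g * g
    z = z - lr * (g / (np.sqrt(v) + eps) + wd * y)
    # Schedule-free online average: no decay schedule to tune.
    c = 1.0 / (t + 1)
    x = (1.0 - c) * x + c * z
    return x, z, v
```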
