Has it Trained Yet?

A Workshop for Algorithmic Efficiency in Practical Neural Network Training

Friday, December 2, 2022, at NeurIPS 2022 in New Orleans (Theater B). Contact: hityworkshop@gmail.com
NeurIPS Workshop Website

 

We all think we know how to train neural nets, but we seem to have different ideas. Let’s discuss which methods truly speed up training!

 

News

  • December 08, 2022:
📤 The results of the poll are now publicly available.
  • December 06, 2022:
📝 A write-up of our workshop is available thanks to Nicholas Teague.
  • December 05, 2022:
📹 The video recordings of our workshop are now available on the NeurIPS workshop website for all registered attendees.
  • November 25, 2022:
      📊 Our poll is now open! Let us know how you train your neural networks.
  • September 20, 2022:
      🚨 The submission deadline has been extended to September 30, 2022, 07:00am UTC.
  • August 24, 2022:
      📢 Published the Call for Papers.
  • July 06, 2022:
      ✅ The workshop was accepted at NeurIPS 2022 for an in-person workshop!
  • June 2, 2022:
      🖥️ The workshop website is live!

 

Workshop Description

Training contemporary neural networks is a lengthy and often costly process, both in human designer time and compute resources. Although the field has invented numerous approaches, neural network training still usually involves an inconvenient amount of “babysitting” to get the model to train properly. This not only requires enormous compute resources but also makes deep learning less accessible to outsiders and newcomers. This workshop will be centered around the question “How can we train neural networks faster?” by focusing on the effects algorithms (not hardware or software developments) have on the training time of neural networks. These algorithmic improvements can come in the form of novel methods, e.g. new optimizers or more efficient data selection strategies, or through empirical experience, e.g. best practices for quickly identifying well-working hyperparameter settings or informative metrics to monitor during training.

We all think we know how to train deep neural networks, but we all seem to have different ideas. Ask any deep learning practitioner about the best practices of neural network training, and you will often hear a collection of arcane recipes. Frustratingly, these hacks vary wildly between companies and teams. This workshop offers a platform to discuss these ideas, agree on what is actually known, and identify what is just noise. In this sense, this will not be an “optimization workshop” in the mathematical sense (of which there have been several in the past, of course).

To this end, the workshop’s goal is to connect two communities: researchers who develop new algorithms for faster neural network training, such as new optimization methods or deep learning architectures, and practitioners who, through their work on real-world problems, increasingly rely on “tricks of the trade”. The workshop aims to close the gap between research and applications by identifying the most relevant issues that currently hinder faster neural network training in practice.

 

Topics

Among the topics addressed by the workshop are:

  • What “best practices” for faster neural network training are used in practice, and can we learn from them to build better algorithms?
  • What are painful lessons learned while training deep learning models?
  • What are the most needed algorithmic improvements for neural network training?
  • How can we ensure that research on training methods for deep learning has practical relevance?

 

Important Dates

  • Submission Deadline:
      September 30, 2022, 07:00am UTC (updated!)
  • Accept/Reject Notification Date:
      October 20, 2022, 07:00am UTC (updated!)
  • Workshop Date:
      December 2, 2022

 

Workshop Poll

For the workshop, we conducted a survey to gather insights into current training practices within the community. The poll was distributed through the workshop website, Twitter, email, and at the in-person NeurIPS event.

The survey’s results can be found in the PDF in this repository.