
Efficient Parallel Hyperparameter Optimization with OmniOpt on HPC (Training)

by Norman Koch (ScaDS.AI), Dr Peter Winkler (ScaDS.AI)

Europe/Berlin
1020 (APB)

Living Lab, TU Dresden
Description

The optimization of hyperparameters is an important task when applying neural networks as well as other machine-learning methods and classical simulations. Examples of hyperparameters of a neural network are the type of network to be used, the number of its layers, the number of neurons per layer, the number and size of filters, the learning rate, the batch size, the type of activation functions, and many more. Finding an appropriate set of hyperparameters is crucial for the accuracy and performance of an application and is usually a very time-consuming and tedious task if done manually.
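To make the idea concrete, here is a minimal sketch of hyperparameter optimization via random search over a small search space. The objective function, search space, and parameter names are illustrative assumptions for this example; they are not part of OmniOpt, which automates this process at a much larger scale on the HPC system.

```python
import random

# Toy objective standing in for a model's validation loss. In a real
# application this function would train and evaluate a network with the
# given hyperparameters and return its validation loss.
def objective(learning_rate, batch_size, num_layers):
    return ((learning_rate - 0.01) ** 2
            + abs(batch_size - 64) / 640
            + abs(num_layers - 3) / 10)

# Illustrative search space: each hyperparameter has a set of candidate values.
search_space = {
    "learning_rate": [0.0001, 0.001, 0.01, 0.1],
    "batch_size": [16, 32, 64, 128],
    "num_layers": [1, 2, 3, 4, 5],
}

random.seed(0)
best_params, best_loss = None, float("inf")
for _ in range(50):  # 50 random trials
    # Sample one candidate value per hyperparameter.
    params = {name: random.choice(values) for name, values in search_space.items()}
    loss = objective(**params)
    if loss < best_loss:
        best_params, best_loss = params, loss

print("best hyperparameters:", best_params, "loss:", best_loss)
```

Each trial is independent, which is what makes this kind of search easy to parallelize across the many nodes of an HPC cluster.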

 

In this training the hyperparameter optimization tool OmniOpt is introduced. OmniOpt is a tailor-made solution for the high-performance computing (HPC) cluster Taurus of TU Dresden which allows users to optimize the hyperparameters of a wide range of problems. The HPC system Taurus, with its vast resources of GPUs, CPUs and storage, ensures that even large problems can be handled within a moderate computation time. Moreover, a variety of tools is available for the automatic analysis and graphical representation of the optimization results.

 

The aim of this training is to enable the participants to use OmniOpt on their own. It will contain

  1. a short general introduction to hyperparameter optimization,
  2. a brief introduction to the use of the HPC system Taurus,
  3. an extended introduction to OmniOpt using a hands-on example, and
  4. an evaluation of the optimization results with the OmniOpt toolkit.

The training is suitable for researchers as well as for students with basic knowledge of Linux and a command-line based programming language, e.g. Python or C. Researchers who bring their own code to be optimized are highly welcome; this is, however, not a prerequisite.

Organized by

Trainings ScaDS.AI
