Hyperparameter Tuning With Determined

Hyperparameter tuning is a common machine learning workflow that involves appropriately configuring the data, model architecture, and learning algorithm to yield an effective model. It is especially challenging in deep learning, where the number of hyperparameters to consider can be large. Determined supports hyperparameter search as a first-class workflow that is tightly integrated with Determined’s job scheduler, which allows for efficient execution of state-of-the-art early-stopping-based approaches as well as seamless parallelization of these methods.
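As an illustration, a search is defined in a Determined experiment configuration file: the `hyperparameters` section declares the search space and the `searcher` section selects the algorithm. The sketch below is illustrative, not a complete configuration; field names follow Determined's experiment config schema, but the specific values (metric name, trial counts, search ranges) are placeholder assumptions.

```yaml
# Illustrative sketch of a Determined experiment config for an
# adaptive (ASHA-based) hyperparameter search.
hyperparameters:
  learning_rate:
    type: log        # sample on a log scale
    base: 10
    minval: -4.0     # 1e-4
    maxval: -1.0     # 1e-1
  dropout:
    type: double
    minval: 0.1
    maxval: 0.5
searcher:
  name: adaptive_asha      # early-stopping-based search
  metric: validation_loss  # assumed metric name reported by the trial
  smaller_is_better: true
  max_trials: 16
```

Switching methods is a matter of changing the `searcher.name` field (for example, to `single`, `grid`, or `random`), leaving the rest of the experiment untouched.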

Other Supported Methods

Determined also supports other common hyperparameter search algorithms:

  1. Single is appropriate for manual hyperparameter tuning, as it trains a single hyperparameter configuration.

  2. Grid exhaustively evaluates every possible hyperparameter configuration and returns the best.

  3. Random evaluates a set of hyperparameter configurations chosen at random and returns the best.

  4. Population-based training (PBT) begins like random search but periodically replaces low-performing hyperparameter configurations with perturbed copies of high-performing points in the hyperparameter space.
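To make the difference between the grid and random methods concrete, the sketch below implements both over a toy discrete search space. This is a minimal standalone illustration, not Determined's implementation; the function names and the `objective` callback are assumptions for the example, and Determined would run the trials in parallel under its scheduler rather than in a loop.

```python
import itertools
import random


def grid_search(space, objective):
    """Exhaustively evaluate every combination of values (the Grid method).

    `space` maps each hyperparameter name to a list of candidate values;
    `objective` maps a configuration dict to a score (lower is better).
    Returns the (configuration, score) pair with the lowest score.
    """
    keys = list(space)
    best = None
    for values in itertools.product(*(space[k] for k in keys)):
        cfg = dict(zip(keys, values))
        score = objective(cfg)
        if best is None or score < best[1]:
            best = (cfg, score)
    return best


def random_search(space, objective, n_trials, seed=0):
    """Evaluate `n_trials` configurations sampled uniformly at random."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        cfg = {k: rng.choice(values) for k, values in space.items()}
        score = objective(cfg)
        if best is None or score < best[1]:
            best = (cfg, score)
    return best
```

Because grid search is exhaustive, its result is always at least as good as random search over the same space; random search trades that guarantee for a budget (`n_trials`) that does not grow with the size of the space.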