Search Spaces

Hyperparameter optimization begins by deciding what may vary: the search space names each tunable hyperparameter and the set or distribution of values it can take.
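As a sketch, a search space can be written as a plain mapping from hyperparameter names to their allowed values. The names and the `"type"` vocabulary below are illustrative assumptions, not the schema of any particular library.

```python
# A hypothetical search space: each hyperparameter maps to a description
# of the values it may take. Continuous parameters get a range (and a
# sampling scale), discrete parameters get an explicit list of choices.
search_space = {
    "learning_rate": {"type": "log_uniform", "low": 1e-5, "high": 1e-1},
    "batch_size": {"type": "choice", "values": [16, 32, 64, 128]},
    "num_layers": {"type": "int_uniform", "low": 1, "high": 6},
}
```

Note the learning rate is marked log-uniform: when a parameter spans several orders of magnitude, sampling on a log scale spreads trials across all of them.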
Grid Search

Grid search is one of the simplest methods for hyperparameter optimization: it discretizes each hyperparameter into a list of values and evaluates every combination in the resulting Cartesian product. Its cost grows exponentially with the number of hyperparameters.
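A minimal sketch of the idea, using a toy `validation_loss` function (an assumption standing in for a real training-and-evaluation run):

```python
import itertools

def validation_loss(lr, batch_size):
    # Toy stand-in for training a model and measuring validation loss.
    return (lr - 0.01) ** 2 + abs(batch_size - 64) / 1000

# The grid: a discrete list of values per hyperparameter.
grid = {
    "lr": [0.001, 0.01, 0.1],
    "batch_size": [32, 64, 128],
}

# Evaluate every combination in the Cartesian product of the axes
# (3 x 3 = 9 runs here) and keep the best configuration.
keys = list(grid)
best = min(
    (dict(zip(keys, combo)) for combo in itertools.product(*grid.values())),
    key=lambda cfg: validation_loss(**cfg),
)
# best == {"lr": 0.01, "batch_size": 64}
```

With ten hyperparameters at three values each, the same loop would need 3^10 = 59,049 runs, which is why grid search is usually limited to low-dimensional spaces.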
Random Search

Random search is a hyperparameter optimization method that samples configurations independently at random from the search space. On the same budget it often outperforms grid search, because it does not spend evaluations exhaustively combining values of hyperparameters that turn out not to matter.
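The loop below sketches random search against the same kind of toy objective (again an assumption in place of a real training run). Each trial samples every hyperparameter from its own distribution, with the learning rate drawn log-uniformly:

```python
import random

def validation_loss(lr, dropout):
    # Toy stand-in for a real training run.
    return (lr - 0.01) ** 2 + (dropout - 0.2) ** 2

random.seed(0)
best_cfg, best_loss = None, float("inf")
for _ in range(20):
    # Sample each hyperparameter independently from its distribution.
    cfg = {
        "lr": 10 ** random.uniform(-4, -1),  # log-uniform over [1e-4, 1e-1]
        "dropout": random.uniform(0.0, 0.5),
    }
    loss = validation_loss(**cfg)
    if loss < best_loss:
        best_cfg, best_loss = cfg, loss
```

Because each trial is independent, the budget (here 20 trials) can be set freely and trials can run in parallel, unlike a grid whose size is fixed by its axes.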
Bayesian Optimization

Bayesian optimization is a hyperparameter optimization method for expensive black-box objective functions. It fits a probabilistic surrogate model, commonly a Gaussian process, to the configurations evaluated so far, then uses an acquisition function to choose the next configuration, trading off exploring uncertain regions against exploiting promising ones. It is useful when each training run costs enough that random search wastes too much compute.
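A minimal self-contained sketch, assuming a 1-D toy objective, a Gaussian-process surrogate with an RBF kernel, and a lower-confidence-bound acquisition (mean minus two standard deviations); real implementations use more robust numerics and acquisition functions such as expected improvement:

```python
import numpy as np

def objective(x):
    # Expensive black-box stand-in; assumed for illustration.
    return (x - 0.6) ** 2

def rbf(a, b, length=0.15):
    # Squared-exponential kernel between two 1-D point sets.
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def gp_posterior(x_obs, y_obs, x_query, noise=1e-6):
    # Gaussian-process posterior mean and standard deviation at x_query.
    K = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf(x_obs, x_query)
    mu = Ks.T @ np.linalg.solve(K, y_obs)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mu, np.sqrt(np.clip(var, 1e-12, None))

# Start from a few observations, then repeatedly refit the surrogate and
# evaluate wherever the lower confidence bound mu - 2*sigma is smallest:
# low mu favors exploitation, high sigma favors exploration.
x_obs = np.array([0.1, 0.5, 0.9])
y_obs = objective(x_obs)
candidates = np.linspace(0.0, 1.0, 201)
for _ in range(10):
    mu, sigma = gp_posterior(x_obs, y_obs, candidates)
    x_next = candidates[np.argmin(mu - 2.0 * sigma)]
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, objective(x_next))

best_x = x_obs[np.argmin(y_obs)]
```

The surrogate is what makes each step cheap: the expensive objective is called once per iteration, while the acquisition is optimized over the surrogate alone.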
Population-Based Training

Population-based training, or PBT, is a hyperparameter optimization method that trains a population of models in parallel. Periodically, poorly performing members copy the weights and hyperparameters of stronger members (exploit) and then perturb the copied hyperparameters (explore), so hyperparameters are tuned over the course of a single training run rather than fixed per trial.
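A toy sketch of the exploit/explore loop. "Training" is simulated by a made-up scoring rule, and copying a member's score stands in for copying its model weights; all names and dynamics here are illustrative assumptions.

```python
import random

random.seed(1)

def step_score(score, lr):
    # Toy training dynamics: the closer lr is to 0.01, the faster
    # the member's score improves during one training interval.
    return score + max(0.0, 1.0 - abs(lr - 0.01) * 50)

population = [{"lr": random.uniform(0.0, 0.1), "score": 0.0} for _ in range(8)]

for generation in range(10):
    # Train every member for one interval.
    for member in population:
        member["score"] = step_score(member["score"], member["lr"])
    population.sort(key=lambda m: m["score"], reverse=True)
    # Exploit: the bottom half copies the top half (score stands in for
    # weights here). Explore: the copied hyperparameter is perturbed.
    for loser, winner in zip(population[4:], population[:4]):
        loser["lr"] = winner["lr"] * random.choice([0.8, 1.2])
        loser["score"] = winner["score"]

best = max(population, key=lambda m: m["score"])
```

Because weights are copied along with hyperparameters, no member restarts from scratch; the population converges toward a hyperparameter schedule rather than a single fixed value.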
Neural Architecture Search

Neural architecture search, or NAS, is the process of automatically searching for model architectures. A search strategy, such as random search, evolutionary algorithms, reinforcement learning, or gradient-based relaxation, proposes candidates from a defined architecture space, and each candidate is scored by training it or by a cheaper proxy.
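The simplest strategy, random search over an encoded architecture space, can be sketched as follows. The layer vocabulary and the `proxy_score` rule are both hypothetical stand-ins for training and evaluating each candidate network.

```python
import random

random.seed(0)

# An architecture is encoded as a fixed-length sequence of layer operations.
LAYER_CHOICES = ["conv3x3", "conv5x5", "maxpool"]

def sample_architecture(depth=4):
    return tuple(random.choice(LAYER_CHOICES) for _ in range(depth))

def proxy_score(arch):
    # Made-up scoring rule in place of real training: reward 3x3
    # convolutions early in the network and pooling late.
    return sum(
        (op == "conv3x3") * (len(arch) - i) + (op == "maxpool") * i
        for i, op in enumerate(arch)
    )

# Sample candidates and keep the best-scoring architecture.
best = max((sample_architecture() for _ in range(50)), key=proxy_score)
```

More sophisticated strategies differ mainly in how `sample_architecture` is replaced: evolution mutates good encodings, reinforcement learning trains a controller to emit them, and gradient-based methods relax the discrete choices into continuous weights.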
Automated Machine Learning

Automated machine learning, or AutoML, refers to systems that automate parts of the model development process, such as data preprocessing, feature engineering, model selection, and hyperparameter tuning.