Machine learning best practices: Autotune models to avoid local minimum breakdowns


This is the fifth post in my series of machine learning best practices. 

Hyperparameters are the algorithm options one "turns and tunes" when building a learning model. Unlike model parameters, hyperparameters cannot be learned by the training algorithm itself, so they must be assigned before the model is trained. A lot of manual effort in machine learning is spent finding the optimal set of hyperparameters for a model. How can you find a suitable hyperparameter set more efficiently than by trial and error?

There are several ways to solve this problem by automatically tuning, or autotuning, the parameters, including:

  1. Grid search: Grid search is simply an exhaustive search through a manually specified subset of the hyperparameter space. It must be guided by some performance metric, typically measured by cross-validation on the training set or by evaluation on a held-out validation set. Start from evenly spaced points, compute the objective function at each point, and select the smallest value as the solution. This is not very practical when the parameter space is huge.
  2. Bayesian optimization: Bayesian optimization is a methodology for the global optimization of noisy black-box functions. Applied to hyperparameter optimization, it consists of building a statistical model of the function mapping hyperparameter values to the objective evaluated on a validation set, and using that model to choose which hyperparameters to try next.
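To make the grid search idea concrete, here is a minimal sketch using scikit-learn's `GridSearchCV`. The SVM classifier, the iris data, and the particular grid values are illustrative choices, not part of the original post; the point is simply that every combination in the grid is evaluated by cross-validation and the best one is kept.

```python
# Exhaustive grid search over a small hyperparameter grid,
# guided by 5-fold cross-validation accuracy.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Evenly spaced (in log scale) candidate values for each hyperparameter.
param_grid = {"C": [0.1, 1.0, 10.0], "gamma": [0.01, 0.1, 1.0]}

# GridSearchCV trains one model per grid point and scores it by CV.
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)   # the grid point with the best CV score
print(search.best_score_)
```

With 3 values per hyperparameter this is only 9 fits, but the cost grows multiplicatively with each hyperparameter added, which is exactly why exhaustive search stops being realistic for large spaces.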

For more ways to autotune the hyperparameters, read this autotune paper by my colleagues at SAS, or check out this blog post about hyperparameter tuning.

My next post will be about managing the temporal effect. If there are other tips you want me to cover, or if you have tips of your own to share, leave a comment here. You can read the whole series by clicking on the image below.


About the Author

Wayne Thompson

Manager Data Science Technologies

Wayne Thompson, Chief Data Scientist at SAS, is a globally renowned presenter, teacher, practitioner and innovator in the fields of data mining and machine learning. He has worked alongside the world's biggest and most challenging organizations to help them harness analytics to build high-performing organizations. Over the course of his 24-year tenure at SAS, Wayne has been credited with bringing to market landmark SAS analytics technologies, including SAS Text Miner, SAS Credit Scoring for Enterprise Miner, SAS Model Manager, SAS Rapid Predictive Modeler, SAS Visual Statistics and more. His current focus initiatives include easy-to-use self-service data mining tools along with deep learning and cognitive computing tool kits.
