Ten tips before you run an optimization


Optimization is a primary tool of computational statistics. SAS/IML software provides a suite of nonlinear optimizers that makes it easy to find an optimum for a user-defined objective function. You can perform unconstrained optimization, or define linear or nonlinear constraints for constrained optimization.

Over the years I have seen many questions about optimization (which is also called nonlinear programming (NLP)) posted to discussion forums such as the SAS/IML Support Community. A typical question describes a program that produces an error, will not converge, or runs exceedingly slowly. Such problems can be extremely frustrating, and some programmers express their frustration by punctuating their questions with multiple exclamation points:

  • "Why is the Newton-Raphson (NLPNRA) routine broken?!!!"
  • "Why is SAS/IML taking forever to solve this problem????!!!!!"

To paraphrase Shakespeare, in most cases the fault lies not in our optimization routines, but in ourselves. The following checklist describes 10 common SAS/IML issues that lead to errors or lack of convergence in nonlinear optimization. If you check this list before attempting to optimize a function, then you increase the chances that the SAS/IML optimization routines will produce an optimal solution quickly and efficiently.

  1. Thoroughly test and debug the objective function before you optimize. This is the most common reason that people post questions to a discussion forum. The programmer reports that the optimization routine is giving an error message, but the real problem is that there is a logical error in the objective function. When the optimizer calls the objective function, BLAM!—the program stops with an error. The solution is to test the objective function before you call the optimizer. Make sure that the objective function produces correct results for a range of possible input values.
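
As an illustration (in Python with NumPy as a stand-in for SAS/IML; the data and function are hypothetical), here is the kind of sanity check you can run on an objective function before handing it to an optimizer:

```python
import numpy as np

# Hypothetical example: negative log-likelihood of N(mu, sigma) for a small sample.
data = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

def neg_loglik(params):
    mu, sigma = params
    n = len(data)
    return 0.5 * n * np.log(2 * np.pi * sigma**2) + np.sum((data - mu)**2) / (2 * sigma**2)

# Sanity checks BEFORE calling any optimizer: evaluate at a few points where you
# know what to expect, and make sure nothing is NaN, infinite, or obviously wrong.
f_at_truth = neg_loglik([3.0, np.sqrt(2.0)])   # sample mean and std of the data
f_far_away = neg_loglik([100.0, 1.0])          # a bad guess should score much worse
assert np.isfinite(f_at_truth)
assert f_far_away > f_at_truth
```

If these checks fail, the bug is in the objective function, not in the optimizer.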

  2. Vectorize the objective function, when possible. The objective function will be called many times, so make sure it is as efficient as possible. In a matrix-vector language such as SAS/IML, you should vectorize, which means replacing loops with vector and matrix computations.
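
A sketch of what vectorization means (Python/NumPy here, but the same idea applies to SAS/IML matrix expressions):

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(size=100_000)

def sse_loop(mu):
    # scalar loop: one subtraction, square, and addition per observation
    total = 0.0
    for x in data:
        total += (x - mu) ** 2
    return total

def sse_vectorized(mu):
    # a single vector expression replaces the entire loop
    return np.sum((data - mu) ** 2)

# Same answer, but the vectorized version is dramatically faster in a
# matrix-vector language because the computation runs in compiled code.
assert np.isclose(sse_loop(0.5), sse_vectorized(0.5))
```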

  3. Provide a good initial guess. Unfortunately, many programmers don't put much thought into the initial guess. They use a vector of zeros or ones or assign some arbitrary values. The initial guess is important. Newton's method converges quadratically to a solution if the initial guess is close to the solution. A good guess might converge in a few iterations, whereas a bad guess might not converge at all or might require dozens of iterations. You can use graphical techniques to find an initial guess. If you are running a constrained optimization, ensure that the initial guess satisfies the constraints.
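
For example, moment-based estimates often make excellent starting values for MLE. A sketch in Python with `scipy.optimize` (a stand-in for the SAS/IML NLP routines; the data are simulated):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
data = rng.normal(loc=10.0, scale=2.0, size=500)

def neg_loglik(params):
    mu, sigma = params
    if sigma <= 0:                 # guard the domain (see tip 9)
        return 1e10
    return 0.5*len(data)*np.log(2*np.pi*sigma**2) + np.sum((data - mu)**2)/(2*sigma**2)

# A moment-based guess (sample mean and standard deviation) starts the optimizer
# at or near the MLE for this simple model; for harder models the moments are
# still usually close enough for Newton-type methods to converge in a few steps.
x0 = [np.mean(data), np.std(data)]
res = minimize(neg_loglik, x0, method="BFGS")
assert res.success
assert abs(res.x[0] - np.mean(data)) < 1e-2
```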

  4. Specify derivatives correctly. Some optimizers (such as Newton's method) use derivative information to find an optimum of a smooth objective function. If your optimization is running slowly or not converging, you might want to specify the analytical derivatives. I've written a previous article about how to specify derivatives and ensure that they are correct. If you do not write a derivative function, SAS/IML will use finite differences. Finite differences require evaluating the objective function many times, which can be slow. For objective functions that are extremely flat near the optimum, finite differences can be inaccurate or lead to convergence issues.
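
The following Python sketch (using `scipy.optimize` as an analogy; the quadratic objective is made up) shows both halves of this tip: verify the analytic derivative against finite differences, then supply it to the optimizer to reduce the number of function evaluations:

```python
import numpy as np
from scipy.optimize import minimize, check_grad

def f(x):
    return (x[0] - 1)**2 + 4*(x[1] + 2)**2

def grad(x):
    # analytic gradient of f
    return np.array([2*(x[0] - 1), 8*(x[1] + 2)])

# Always verify an analytic derivative against finite differences before using it:
err = check_grad(f, grad, np.array([0.3, -0.7]))
assert err < 1e-5

res_fd  = minimize(f, [0.0, 0.0])              # gradient via finite differences
res_jac = minimize(f, [0.0, 0.0], jac=grad)    # analytic gradient supplied
assert np.allclose(res_jac.x, [1.0, -2.0], atol=1e-4)
# The analytic gradient requires fewer objective-function evaluations:
assert res_jac.nfev <= res_fd.nfev
```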

  5. Use global variables to specify constants. In the SAS/IML language, the objective function for the NLP routines accepts a single vector of parameters. The optimizer tries to adjust the values of those parameters in order to optimize the objective function, subject to any constraints. But sometimes the objective function requires additional (constant) parameters. For example, in maximum likelihood estimation, the objective function needs access to the data. When you define the objective function, use the GLOBAL statement to specify the data and other unchanging parameters.
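
The closest Python analogy to the GLOBAL statement is the `args=` option of `scipy.optimize.minimize`, which passes constants through to the objective function on every call (the data here are hypothetical):

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, data):
    # 'data' arrives unchanged on every call -- the SciPy analog of a GLOBAL
    # variable in a SAS/IML module. Only 'params' is adjusted by the optimizer.
    mu, log_sigma = params
    sigma = np.exp(log_sigma)      # log parameterization keeps sigma > 0
    return 0.5*len(data)*np.log(2*np.pi*sigma**2) + np.sum((data - mu)**2)/(2*sigma**2)

data = np.array([4.1, 5.0, 5.9, 4.8, 5.2])
res = minimize(neg_loglik, x0=[0.0, 0.0], args=(data,))
assert res.success
assert abs(res.x[0] - np.mean(data)) < 1e-3   # the MLE of mu is the sample mean
```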

  6. Check your data and your model. Sometimes there is nothing wrong with the objective function; it is the data that is causing the problem. In maximum likelihood estimation (MLE), the data is part of the optimization. Degenerate data, constant data, collinearities, and singular matrices can arise when you attempt to fit an inappropriate model to data. Lack of convergence could mean that the model does not fit the data. Furthermore, not every MLE problem has a solution! Use a combination of research and visualization to make sure that a solution exists and is not degenerate.
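
A classic degenerate case, sketched in Python (NumPy only; the data are contrived): if every Bernoulli trial in a sample succeeds, the log-likelihood increases monotonically on (0, 1), so the MLE has no interior solution and an optimizer will march toward the boundary and appear to fail:

```python
import numpy as np

# Degenerate data: every Bernoulli trial is a success.
data = np.ones(20)

def loglik(p):
    return np.sum(data * np.log(p) + (1 - data) * np.log(1 - p))

# The log-likelihood 20*log(p) is strictly increasing on (0, 1):
ps = np.linspace(0.1, 0.99, 50)
vals = [loglik(p) for p in ps]
assert all(b > a for a, b in zip(vals, vals[1:]))   # no interior optimum exists
```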

  7. During the debugging phase, print the iteration history. When you use the NLP routines in SAS/IML, you create a vector of options that controls the optimizer algorithm. The second element of this vector specifies the amount of output that you want to create. When I am debugging a problem, I usually specify a value of 4, which means that the optimizer prints the iteration history and other details. By examining the output, you can often discover whether the initial guess is converging to an optimal value, is bouncing around, or is diverging. If the optimization is not converging, examining the iteration history is often the first step to understanding why.
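
In SciPy the analogous debugging device is a `callback` function, which is invoked once per iteration (the quadratic objective below is hypothetical):

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    return (x[0] - 3)**2 + (x[1] + 1)**2

history = []
def log_iteration(xk):
    # invoked once per iteration: record (or print) the current point
    history.append((len(history) + 1, xk.copy(), f(xk)))

res = minimize(f, [10.0, 10.0], callback=log_iteration)
for it, xk, fk in history:
    print(f"iter {it}: x = {xk}, f(x) = {fk:.6f}")
assert res.success
assert history[-1][2] <= history[0][2]    # the objective should be decreasing
```

Scanning this kind of trace quickly reveals whether the iterates are converging, bouncing around, or diverging.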

  8. Define constraints correctly. In many statistical problems, parameters are constrained. For example, in maximum likelihood estimation, scale parameters must be positive. The simplest linear-constraint matrix has two rows and a column for each parameter. The first row represents the lower bound for each parameter and the second row represents the upper bound. Put the value 0 in the first row for parameters that must be nonnegative, or use a small number such as 1E-6 if you need to bound the parameter away from zero. The SAS/IML documentation shows how to add rows that specify more general linear constraints. For nonlinear constraints, you can use the NLC= option to specify a module that defines the feasible region.
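
The two-row bound matrix can be mimicked in Python with `scipy.optimize.Bounds` (an analogy only; the data are hypothetical):

```python
import numpy as np
from scipy.optimize import minimize, Bounds

# Mimic the SAS/IML two-row constraint matrix: row 1 holds lower bounds, row 2
# holds upper bounds, one column per parameter (infinities mean "unbounded").
con = np.array([[-np.inf, 1e-6],     # mu is free; sigma is bounded away from 0
                [ np.inf, np.inf]])

data = np.array([2.0, 2.5, 3.0, 3.5, 4.0])

def neg_loglik(params):
    mu, sigma = params
    return 0.5*len(data)*np.log(2*np.pi*sigma**2) + np.sum((data - mu)**2)/(2*sigma**2)

res = minimize(neg_loglik, x0=[0.0, 1.0], method="L-BFGS-B",
               bounds=Bounds(lb=con[0], ub=con[1]))
assert res.success
assert res.x[1] >= 1e-6            # the scale estimate respects its lower bound
```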

  9. Trap out-of-domain conditions in the objective function. When you use the NLC= option to define nonlinear constraints, the final solution should lie in the feasible region. However, the optimizer is entitled to evaluate the objective function in the infeasible region. Consequently, if the objective function contains logs, square roots, or other functions that are undefined in some regions of parameter space, you should trap and handle bad parameter values within the objective function. If you are maximizing the objective function, return a large negative value (such as -1E6) when the input parameters are outside the domain of the function.
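
A minimal sketch of the trap, in Python (the sufficient statistics are made up; note that this example minimizes, so the penalty is a large positive value, whereas a maximization returns a large negative value such as -1E6):

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params):
    sigma = params[0]
    # Trap out-of-domain inputs: log(sigma) is undefined for sigma <= 0.
    # Return a large penalty value instead of letting the evaluation fail.
    if sigma <= 0:
        return 1e6
    n, ss = 10, 12.5               # hypothetical sample size and sum of squares
    return n*np.log(sigma) + ss/(2*sigma**2)

res = minimize(neg_loglik, x0=[3.0], method="Nelder-Mead")
assert res.success
assert abs(res.x[0] - np.sqrt(12.5/10)) < 1e-2   # analytic MLE: sqrt(SS/n)
```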

  10. Do not attempt 10,000 optimizations until you can successfully complete one optimization. Sometimes optimizations are part of a simulation study. Make sure that the optimization is robust and completes successfully for one simulated sample. Then try five samples. Then 100. Time the performance of your simulation on 100 samples and use that time to estimate the run time for 10,000 samples. If you encounter a simulated sample for which the optimization does not converge, save that sample so that you can carefully analyze it and discover why the optimization failed. Remember that you can check the return code from the NLP routines to determine if the optimization converged.
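
The simulation-study pattern looks roughly like this (Python/SciPy sketch with simulated normal data; `res.success` plays the role of the NLP return code):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)

def neg_loglik(params, data):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    return 0.5*len(data)*np.log(2*np.pi*sigma**2) + np.sum((data - mu)**2)/(2*sigma**2)

estimates, failed_samples = [], []
for i in range(100):
    sample = rng.normal(loc=0.0, scale=1.0, size=50)
    x0 = [np.mean(sample), np.log(np.std(sample))]     # moment-based start
    res = minimize(neg_loglik, x0, args=(sample,))
    if res.success:                    # analogous to checking the NLP return code
        estimates.append(res.x)
    else:
        failed_samples.append(sample)  # save the sample for post-mortem analysis

assert len(estimates) + len(failed_samples) == 100
print(f"{len(estimates)} converged, {len(failed_samples)} failed")
```

Time this loop at 100 samples before extrapolating to 10,000, and analyze any saved failures individually.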

Optimization can be challenging, but you can increase the chance of success if you follow the tips and strategies in this checklist. These tips help you to avoid common pitfalls. By following these suggestions, your SAS/IML optimization will be more efficient, more accurate, and less frustrating.


About Author

Rick Wicklin

Distinguished Researcher in Computational Statistics

Rick Wicklin, PhD, is a distinguished researcher in computational statistics at SAS and is a principal developer of SAS/IML software. His areas of expertise include computational statistics, simulation, statistical graphics, and modern methods in statistical data analysis. Rick is author of the books Statistical Programming with SAS/IML Software and Simulating Data with SAS.
