This chapter addresses the global optimization of nonconvex NLP and MINLP problems. Convexification transformations are first introduced that allow a nonconvex NLP to be transformed into a convex NLP. This is illustrated with geometric programming problems, which involve posynomials and can be convexified with exponential transformations. We next consider the more general solution approach that relies on convex envelopes, which yield rigorous lower bounds on the global optimum and are used in conjunction with a spatial branch-and-bound method. Bilinear NLP problems are addressed as a specific case for which the McCormick convex envelopes are derived. The application of the spatial branch-and-bound search, coupled with McCormick envelopes, is illustrated with a small example. The software packages BARON, ANTIGONE, and SCIP are briefly described.
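For reference, the two ideas summarized above can be stated compactly, using generic symbols rather than the chapter's own notation. Under the exponential transformation $x_j = e^{y_j}$, a posynomial term $c \prod_j x_j^{a_j}$ with $c > 0$ becomes $c\,\exp\!\big(\sum_j a_j y_j\big)$, which is convex in $y$. For a bilinear term $w = xy$ with bounds $x^L \le x \le x^U$ and $y^L \le y \le y^U$, the McCormick relaxation consists of the four linear inequalities
$$
\begin{aligned}
w &\ge x^{L} y + x\, y^{L} - x^{L} y^{L}, &\qquad w &\ge x^{U} y + x\, y^{U} - x^{U} y^{U},\\
w &\le x^{U} y + x\, y^{L} - x^{U} y^{L}, &\qquad w &\le x^{L} y + x\, y^{U} - x^{L} y^{U},
\end{aligned}
$$
where the first pair forms the convex envelope and the second pair the concave envelope of $xy$ over the box.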