
Hill-Climbing Algorithm with a Stick for Unconstrained Optimization Problems

  • Yunqing Huang (a1) and Kai Jiang (a1)

Inspired by the way a blind person climbs a hill, sweeping a stick in a circle to feel for higher ground, we propose a heuristic direct search method for unconstrained optimization problems. Instead of searching a neighbourhood of the current point, as in traditional hill-climbing, or along specified search directions, as in standard direct search methods, the new algorithm searches on a sphere whose radius is determined by the motion of the stick. A significant feature of the proposed algorithm is that it has only one parameter, the search radius, which makes it convenient to implement in practice. The method can shrink the search space to a closed ball, or seek the final optimal point, by adjusting the search radius. Furthermore, the algorithm possesses a multi-resolution property: different search radii distinguish local from global optimum points. It can therefore be used on its own or integrated flexibly with other optimization methods as a mathematical optimization technique. A series of numerical tests, including high-dimensional problems, demonstrates its performance.
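The stick metaphor described above can be sketched in code: probe points at distance r (the "stick length") around the current point, step to an improving probe, and shorten the stick when the whole circle fails. This is a hypothetical minimal sketch, not the authors' implementation; the probe count, shrink factor, and stopping tolerance are illustrative assumptions, since the paper's only true parameter is the search radius.

```python
import numpy as np

def stick_hill_climb(f, x0, r=1.0, n_probes=64, r_min=1e-6,
                     shrink=0.5, max_iter=10_000, rng=None):
    """Minimize f by probing points on a sphere of radius r around x.

    Hypothetical sketch of the stick-inspired search: sample random
    directions at distance r, move to the best improving probe, and
    halve r when no probe improves, until r falls below r_min.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        if r < r_min:
            break
        # Draw random unit directions and scale them to the stick length r.
        d = rng.normal(size=(n_probes, x.size))
        d /= np.linalg.norm(d, axis=1, keepdims=True)
        cand = x + r * d
        vals = np.array([f(c) for c in cand])
        k = vals.argmin()
        if vals[k] < fx:      # higher ground found: step there
            x, fx = cand[k], vals[k]
        else:                 # no improvement on the circle: shorten the stick
            r *= shrink
    return x, fx
```

For example, `stick_hill_climb(lambda z: np.sum((z - 1.0)**2), np.zeros(2))` walks from the origin toward the minimizer at (1, 1) without any gradient information, which mirrors the derivative-free character of the method.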

Corresponding author
*Corresponding author. Email: (Y. Q. Huang), (K. Jiang)

Advances in Applied Mathematics and Mechanics
  • ISSN: 2070-0733
  • EISSN: 2075-1354
  • URL: /core/journals/advances-in-applied-mathematics-and-mechanics