A Dai-Liao hybrid conjugate gradient method for unconstrained optimization
Keywords: Unconstrained optimization, Dai-Liao parameter, Sufficient descent condition, Global convergence, Conjugate gradient algorithm
Abstract: One of today's best-performing CG methods is the Dai-Liao (DL) method, which depends on a non-negative parameter and a conjugacy condition for its computation. Although numerous optimal selections of the parameter have been suggested, the best choice remains a subject of investigation. The pure conjugacy condition assumes an exact line search in numerical experiments and convergence analysis. In practice, however, numerical experiments employ an inexact line search to find the step size. To avoid this drawback, Dai and Liao replaced the earlier conjugacy condition with an extended conjugacy condition. This paper therefore suggests a new hybrid CG method that combines the strengths of the Liu-Storey (LS) and Conjugate Descent (CD) CG methods while retaining an optimal choice of the Dai-Liao parameter. The theoretical analysis shows that the search direction of the new CG scheme is descent and satisfies the sufficient descent condition when the iterates jam under the strong Wolfe line search. The algorithm is shown to converge globally under standard assumptions. Numerical experiments demonstrate that the proposed method is more robust and promising than some known methods, using the Dolan and Moré performance profile on 250 unconstrained problems. The tested CG algorithms were further assessed on sparse signal reconstruction and image restoration in compressive sensing, file restoration, image and video coding, and other applications. The results show that these CG schemes are comparable and can be applied in different fields such as temperature, fire, seismic, and humidity sensors in forests, using wireless sensor network techniques.
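As a rough illustrative sketch (not the authors' exact hybrid of the LS and CD betas), a Dai-Liao-type CG method updates the search direction as d_{k+1} = -g_{k+1} + beta_k d_k with the DL parameter beta_k = (g_{k+1}^T y_k - t g_{k+1}^T s_k) / (d_k^T y_k), t >= 0, where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. The function names, the Armijo backtracking line search (the paper uses strong Wolfe), and the steepest-descent restart safeguard below are all illustrative assumptions:

```python
# Generic Dai-Liao-type conjugate gradient sketch (illustrative only;
# not the paper's hybrid LS/CD scheme or its optimal parameter choice).

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-8, max_iter=500):
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]          # initial steepest-descent direction
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 < tol:
            break
        # Backtracking (Armijo) line search; the paper uses strong Wolfe.
        alpha, fx = 1.0, f(x)
        while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + 1e-4 * alpha * dot(g, d):
            alpha *= 0.5
            if alpha < 1e-14:
                break
        x_new = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x_new)
        s = [xn - xi for xn, xi in zip(x_new, x)]   # s_k = x_{k+1} - x_k
        y = [gn - gi for gn, gi in zip(g_new, g)]   # y_k = g_{k+1} - g_k
        dy = dot(d, y)
        # Dai-Liao parameter: beta = (g_{k+1}.y - t * g_{k+1}.s) / (d.y)
        beta = 0.0 if abs(dy) < 1e-16 else (dot(g_new, y) - t * dot(g_new, s)) / dy
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        # Safeguard: restart with steepest descent if d is not a descent direction.
        if dot(g_new, d) >= 0:
            d = [-gn for gn in g_new]
        x, g = x_new, g_new
    return x

# Example: a simple convex quadratic with minimizer (1, -2).
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
grad = lambda x: [2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)]
sol = dai_liao_cg(f, grad, [0.0, 0.0])
```

Note that the descent safeguard stands in for the theoretical sufficient descent property the paper establishes under the strong Wolfe conditions; with an exact Wolfe line search the restart is rarely triggered.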
Andrei, N. (2008b). A hybrid conjugate gradient algorithm for unconstrained optimization as convex combination of Hestenes-Stiefel and Dai-Yuan. Studies in Informatics and Control, 17, 55-70.
Andrei, N. (2008c). An unconstrained optimization test function collection. Advanced Modeling and Optimization, 10, 147-161.
Andrei, N. (2009). Hybrid conjugate gradient algorithm for unconstrained optimization. Journal of Optimization Theory and Applications, 141, 249-264.
Babaie-Kafaki, S., Ghanbari, R. & Mahdavi-Amiri, N. (2010). Two new conjugate gradient methods based on modified secant equations. Journal of Computational and Applied Mathematics , 234, 1374-1386.
Babaie-Kafaki, S. (2011). A modified BFGS algorithm based on a hybrid secant equation. Science China Mathematics, 54, (9), 2019-2036.
Babaie-Kafaki, S., Fatemi, M., & Mahdavi-Amiri, N. (2011). Two effective hybrid conjugate gradient algorithms based on modified BFGS updates. Science China Math., 58, 315-331.
Babaie-Kafaki, S. (2013). A hybrid conjugate gradient method based on a quadratic relaxation of the Dai-Yuan hybrid conjugate gradient parameter. Optimization, 62(7), 929-941.
Babaie-Kafaki, S., & Mahdavi-Amiri, N. (2013). Two hybrid conjugate gradient methods based on hybrid secant equation. Mathematical Modeling and Analysis,18(1):32-52.
Babaie-Kafaki, S., & Ghanbari, R. (2014a). The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices. European Journal of Operational Research, 234(3), 625-630.
Babaie-Kafaki, S., & Ghanbari, R. (2014b). A descent family of Dai-Liao conjugate gradient methods. Optimization Methods and Software, 29(3), 583-591.
Babaie-Kafaki, S., & Ghanbari, R. (2014c). Two hybrid nonlinear conjugate gradient methods based on a modified hybrid secant equation. Optimization, 63(7), 1027-1042.
Babaie-Kafaki, S. (2016). On optimality of two adaptive choices for the parameter of Dai- Liao method. Optimization Letters, 10, 1789-1797.
Babaie-Kafaki, S., & Ghanbari, R. (2015). Two optimal Dai-Liao conjugate gradient methods. Optimization, 64(11): 1-11.
Babaie-Kafaki, S., & Ghanbari, R. (2017). Two adaptive Dai-Liao nonlinear conjugate gradient methods. Iranian Journal of Science and Technology Transactions A: Science, 42, 1505-1509.
Dai, Y.H., & Yuan, Y. (1999). A nonlinear conjugate gradient method with a strong global convergence property. SIAM Journal on Optimization,10(1), 177-182.
Dai, Y.H. (2001). New properties of a nonlinear conjugate gradient method. Numerische Mathematik, 89(1), 83-98.
Dai, Y.H., & Liao, L.Z. (2001). New conjugacy conditions and related nonlinear conjugate gradient methods. Applied Mathematics and Optimization, 43(1), 87-101.
Dai, Y.H., & Yuan, Y. (2001). An efficient hybrid conjugate gradient method for unconstrained optimization. Annals of Operations Research, 103, 33-47.
Ding, Y., Lushi, E., & Li, Q. (2010). Investigation of quasi-Newton methods for unconstrained optimization. International Journal of Computer Applications, 29, 48-58.
Djordjevic, S.S. (2017). New hybrid conjugate gradient method as a convex combination of LS and CD methods. Filomat, 31(6), 1813-1825.
Djordjevic, S.S. (2018). New hybrid conjugate gradient method as a convex combination of HS and FR conjugate gradient methods. Journal of Applied Mathematics and Computation, 2(9), 366-378.
Djordjevic, S.S. (2019). New hybrid conjugate gradient method as a convex combination of LS and FR conjugate gradient methods. Acta Mathematica Scientia, 39B(1), 214-228.
Dolan, E., & Moré, J. (2002). Benchmarking optimization software with performance profiles. Mathematical Programming, 91(2), 201-213.
Esmaeili, H., Rostami, M., & Kimiaei, M. (2018). Extended Dai-Yuan conjugate gradient strategy for large-scale unconstrained optimization with applications to compressive sensing. Filomat, 32(6), 2173-2191.
Fletcher, R., & Reeves, C. (1964). Function minimization by conjugate gradients. The Computer Journal, 7, 149-154.
Fletcher, R. (1987). Practical Methods of Optimization, New York: John Wiley & Sons.
Gilbert, J.C., & Nocedal, J. (1992). Global convergence properties of conjugate gradient methods for optimization. SIAM Journal on Optimization, 2(1), 21-42.
Gould, N.I.M., Orban, D., & Toint, P.L. (2003). CUTEr: a constrained and unconstrained testing environment, revisited. ACM Transactions on Mathematical Software, 29(4), 373-394.
Guo, J., & Wan, Z. (2019). A modified spectral PRP conjugate gradient projection method for solving large-scale monotone equations and its application in compressed sensing. Mathematical Problems in Engineering, 2019, 1-17.
Hager, W.W., & Zhang, H. (2006). A survey of nonlinear conjugate gradient methods. Pacific Journal of Optimization, 2, 35-58.
Hestenes, M.R., & Stiefel, E.L. (1952). Methods of conjugate gradients for solving linear systems. Journal of Research of the National Bureau of Standards, 49, 409-436.
Ibrahim, A.H., Kumam, P., Abubakar, A. B., Jirakitpuwapat, W., & Abubakar, J. (2020). A hybrid conjugate gradient algorithm for constrained monotone equations with application in compressive sensing, Heliyon, 6, 1-17.
Liu, Y., & Storey, C. (1991). Efficient generalized conjugate gradient algorithms, part 1: Theory. Journal of Optimization Theory and Applications, 69, 129-137.
Liu, J., Du, S., & Chen, Y. (2020). A sufficient descent nonlinear conjugate gradient method for solving M-tensor equations. Journal of Computational and Applied Mathematics, 371, 112-129.
Nocedal, J., & Wright, S. (2006). Numerical Optimization. New York: Springer.
Polyak, B.T. (1967). A general method of solving extremal problems. Soviet Math. Doklady, 8, 593-597.
Powell, M.J.D. (1984). Nonconvex minimization calculations and the conjugate gradient method, Numerical Analysis (Dundee, 1983), D.F. Griffiths, ed., Lecture Notes in Mathematics, 1066, 122-141.
Rao, S.S. (2009). Engineering Optimization Theory and Practice, 4th ed, New Jersey: John Wiley & Sons, Inc.
Sabiu, J., & Waziri, M.Y. (2017). Effective modified hybrid conjugate gradient method for large-scale symmetric nonlinear equations. Applications and Applied Mathematics, 12, 1036-1056.
Sabiu, J., Waziri, M.Y., & Idris, A. (2017). A new hybrid Dai-Yuan and Hestenes-Stiefel conjugate gradient parameter for solving systems of equations. Mayfeb Journal of Mathematics, 1, 44-55.
Salihu, N., Odekunle, M., Waziri, M., & Halilu, A. (2020). A new hybrid conjugate gradient method based on secant equation for solving large scale unconstrained optimization problems. Iranian Journal of Optimization, 12(1): 33-44.
Salihu, N., Odekunle, M.R., Saleh, A.M., & Salihu, S. (2021). A Dai-Liao hybrid Hestenes-Stiefel and Fletcher-Reeves methods for unconstrained optimization. International Journal of Industrial Optimization, 2(1), 33-50.
Sun, W., & Yuan, Y.X. (2006). Optimization Theory and Methods: Nonlinear Programming, New York: Springer.
Waziri, M.Y., Ahmed, K., & Sabi'u, J. (2019). A Dai-Liao conjugate gradient method via modified secant equation for system of nonlinear equations. Arabian Journal of Mathematics, 9, 443-457.
Yao, S., Feng, Q., Li, L., & Xu, J. (2019). A class of globally convergent three-term Dai-Liao conjugate gradient methods. Applied Numerical Mathematics, 151, 354-366.
Yuan, Y. X. (1991). A modified BFGS algorithm for unconstrained optimization. IMA Journal of Numerical Analysis, 11, 325-332.
Zoutendijk, G. (1970). Nonlinear programming, computational methods. In J. Abadie (Ed.), Integer and Nonlinear Programming, North-Holland, Amsterdam, 37-86.