Conjugate Gradient Methods for Computing Weighted Analytic Center for Linear Matrix Inequalities Using Exact and Quadratic Interpolation Line Searches
American Journal of Applied Mathematics
Volume 8, Issue 1, February 2020, Pages: 1-10
Received: Oct. 18, 2019; Accepted: Nov. 27, 2019; Published: Jan. 4, 2020
Authors
Shafiu Jibrin, Department of Mathematics, Faculty of Science, Federal University, Dutse, Jigawa State, Nigeria
Ibrahim Abdullahi, Department of Mathematics, Faculty of Science, Federal University, Dutse, Jigawa State, Nigeria
Abstract
We study the problem of computing the weighted analytic center for linear matrix inequality constraints. In this paper, we apply conjugate gradient (CG) methods to find the weighted analytic center. CG methods have low memory requirements and strong local and global convergence properties. The methods considered are the classical methods of Hestenes and Stiefel (HS), Fletcher and Reeves (FR), and Polak and Ribiere (PR), and a relatively new method by Rivaie, Abashar, Mamat and Ismail (RAMI). We compare the performance of the methods on random test problems by recording the number of iterations and the time each method requires to find the weighted analytic center of each test problem. We use an exact line search based on Newton's method and an inexact line search based on quadratic interpolation. Our numerical results show that PR performs best, followed by HS, then RAMI, and then FR; however, PR and HS perform about the same with the exact line search. The results also indicate that both line searches work well, but the exact line search handles weights better than the inexact line search when one weight is much larger than the others. We also find that, with the quadratic interpolation line search, FR is more susceptible to the jamming phenomenon than either PR or HS.
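As an illustration of the update rules compared in the abstract, the sketch below runs nonlinear CG with the FR, PR (nonnegative PR+ variant) and HS choices of the parameter β. For simplicity it minimizes a strictly convex quadratic, where the exact line-search step has a closed form; the paper's actual objective is the weighted log-barrier of the LMI constraints, and all function and parameter names here are ours, not the authors'.

```python
import numpy as np

def cg_minimize(A, b, x0, beta_rule="PR", tol=1e-8, max_iter=200):
    """Nonlinear CG specialized to f(x) = 0.5 x^T A x - b^T x (A symmetric
    positive definite), so the minimizer solves A x = b."""
    x = x0.astype(float)
    g = A @ x - b                        # gradient of f at x
    d = -g                               # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ A @ d)   # exact line search for a quadratic
        x = x + alpha * d
        g_new = A @ x - b
        y = g_new - g
        if beta_rule == "FR":            # Fletcher-Reeves
            beta = (g_new @ g_new) / (g @ g)
        elif beta_rule == "PR":          # Polak-Ribiere, clipped at 0 (PR+)
            beta = max(0.0, (g_new @ y) / (g @ g))
        else:                            # Hestenes-Stiefel
            beta = (g_new @ y) / (d @ y)
        d = -g_new + beta * d            # conjugate direction update
        g = g_new
    return x
```

With an exact line search on a quadratic, all three rules coincide with linear CG and terminate in at most n steps in exact arithmetic; the differences the paper measures appear on non-quadratic objectives and with inexact line searches.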
Keywords
Linear Matrix Inequalities, Weighted Analytic Center, Semidefinite Programming, Conjugate Gradient Methods
To cite this article
Shafiu Jibrin, Ibrahim Abdullahi, Conjugate Gradient Methods for Computing Weighted Analytic Center for Linear Matrix Inequalities Using Exact and Quadratic Interpolation Line Searches, American Journal of Applied Mathematics. Vol. 8, No. 1, 2020, pp. 1-10. doi: 10.11648/j.ajam.20200801.11
Copyright
Copyright © 2020 Authors retain the copyright of this article.
This article is an open access article distributed under the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/) which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
References
[1]
F. Alizadeh, “Interior Point Methods in Semidefinite Programming with Applications to Combinatorial Optimization”, SIAM Journal on Optimization, vol. 5, no. 1, 1995, pp. 13-51.
[2]
L. Vandenberghe and S. Boyd, “Semidefinite Programming”, SIAM Review, vol. 38, 1996, pp. 49-95.
[3]
F. Alizadeh, J. A. Haeberly and M. Overton, “Primal-Dual Methods for Semidefinite Programming: Convergence Rates, Stability and Numerical Results”, SIAM Journal on Optimization, vol. 8, no. 3, 1998, pp. 746-768.
[4]
J. Renegar, “A polynomial-time algorithm, based on Newton’s method, for linear programming”, Math. Programming, vol. 40, 1988, pp. 59–93.
[5]
M. J. Todd, K. C. Toh and R. H. Tütüncü, “On the Nesterov-Todd direction in semidefinite programming”, SIAM J. Optim., vol. 8, 1998, pp. 769-796.
[6]
L. Vandenberghe, S. Boyd and S.-P. Wu, “Determinant Maximization with Linear Matrix Inequality Constraints”, SIAM Journal on Matrix Analysis, vol. 19, no. 2, 1998, pp. 499-533.
[7]
I. S. Pressman and S. Jibrin, “A Weighted Analytic Center for Linear Matrix Inequalities”, Journal of Inequalities in Pure and Applied Mathematics, Vol. 2, Issue 3, Article 29, 2002.
[8]
S. Jibrin and J. W. Swift, “The Boundary of the Weighted Analytic Center for Linear Matrix inequalities.” Journal of Inequalities in Pure and Applied Mathematics, Vol. 5, Issue 1, Article 14, 2004.
[9]
J. Machacek and S. Jibrin, “An Interior Point Method for Solving Semidefinite Programs Using Cutting Planes and Weighted Analytic Centers”, Journal of Applied Mathematics, Vol. 2012, Article ID 946893, 21 pages.
[10]
D. S. Atkinson and P. M. Vaidya, “A scaling technique for finding the weighted analytic center of a polytope”, Math. Prog., vol. 57, 1992, pp. 163–192.
[11]
S. Jibrin, “Computing Weighted Analytic Center for Linear Matrix Inequalities Using Infeasible Newton’s Method”, Journal of Mathematics, vol. 2015, Article ID 456392, 2015.
[12]
S. Jibrin, I. Abdullahi, “Search Directions in Infeasible Newton’s method for Computing Weighted Analytic Center for Linear Matrix Inequalities”, Applied and Computational Mathematics, vol. 8, no. 1, 2019, pp. 21-28.
[13]
S. Boyd and L. Vandenberghe, “Convex Optimization”, Cambridge University Press, New York, 2004.
[14]
W. W. Hager, H. Zhang, “A survey of nonlinear conjugate gradient methods”, Pacific Journal of Optimization, vol. 2, no. 1, 2006, pp. 35-58.
[15]
M. Rivaie, A. Abashar, M. Mamat and M. Ismail, “The convergence properties of a new type of conjugate gradient methods”, Applied Mathematical Sciences, vol. 9, no. 54, 2015, pp. 2671-2682.
[16]
I. Abdullahi, R. Ahmad, “Convergence Analysis of a New Conjugate Gradient Method for Unconstrained Optimization”, Applied Mathematical Sciences, vol. 9, no. 140, 2015, pp. 6969-6984.
[17]
I. Abdullahi, R. Ahmad, “Global Convergence Analysis of a Nonlinear Conjugate Gradient Method for Unconstrained Optimization Problems”, Indian Journal of Science and Technology, vol. 9, no. 41, DOI: 10.17485/ijst/2016/v9i41/90175, 2016.
[18]
I. Abdullahi, R. Ahmad, “Global convergence analysis of a new hybrid conjugate gradient method for unconstrained optimization problems”, Malaysian Journal of Fundamental and Applied Sciences, vol. 2, no. 13, 2017, pp. 40-48.
[19]
X. Du, J. Liu, “Global convergence of a spectral HS conjugate gradient method”, Procedia Engineering, vol. 15, 2011, pp. 1487-1492.
[20]
W. W. Hager, H. Zhang, “A new conjugate gradient method with guaranteed descent and an efficient line search”, SIAM Journal on Optimization, vol. 16, no. 1, 2005, pp. 170-192.
[21]
J. Liu, Y. Feng, “Global convergence of a modified LS nonlinear conjugate gradient method”, Procedia Engineering, vol. 15, 2011, pp. 4357-4361.
[22]
Z. Wei, G. Li, L. Qi, “New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems”, Applied Mathematics and Computation, vol. 179, no. 2, 2006, pp. 407-430.
[23]
Y. Zhang, H. Zheng, C. Zhang, “Global convergence of a modified PRP conjugate gradient method”, Procedia Engineering, vol. 31, 2012, pp. 986-995.
[24]
R. Fletcher, C. M. Reeves, “Function minimization by conjugate gradients”, The Computer Journal, vol. 7, no. 2, 1964, pp. 149-154.
[25]
B. T. Polyak, “The conjugate gradient method in extremal problems”, USSR Computational Mathematics and Mathematical Physics, vol. 9, no. 4, 1969, pp. 94-112.
[26]
M. R. Hestenes, E. Stiefel, “Methods of conjugate gradients for solving linear systems”, Journal of Research of the National Bureau of Standards, vol. 49, 1952, pp. 409-436.
[27]
S. Jibrin and I. S. Pressman, “Monte Carlo Algorithms for the Detection of Necessary Linear Matrix Inequalities”, International Journal of Nonlinear Sciences and Numerical Simulation, vol. 2, no. 2, 2001, pp. 139-154.
[28]
R. L. Burden and J. D. Faires, “Numerical Analysis”, 9th Ed., Brooks/Cole, Boston, 2011.
[29]
G. Zoutendijk, Nonlinear Programming, Computational Methods, in Integer and Nonlinear Programming, J. Abadie, ed., North-Holland, Amsterdam, 1970, pp. 37-86.
[30]
E. Polak and G. Ribière, “Note sur la convergence de méthodes de directions conjuguées”, Revue Française d'Informatique et de Recherche Opérationnelle, 3e Année, no. 16, 1969, pp. 35-43.
[31]
M. J. D. Powell, “Restart procedures for the conjugate gradient method”, Math. Prog., vol. 12, 1977, pp. 241-254.