American Journal of Applied Mathematics

| Peer-Reviewed |

Conjugate Gradient Methods for Computing Weighted Analytic Center for Linear Matrix Inequalities Using Exact and Quadratic Interpolation Line Searches

Received: 18 October 2019    Accepted: 27 November 2019    Published: 04 January 2020


Abstract

We study the problem of computing the weighted analytic center for linear matrix inequality constraints. In this paper, we apply conjugate gradient (CG) methods to find the weighted analytic center. CG methods have low memory requirements and strong local and global convergence properties. The methods considered are the classical methods of Hestenes and Stiefel (HS), Fletcher and Reeves (FR), and Polak and Ribière (PR), and a relatively new method by Rivaie, Abashar, Mamat and Ismail (RAMI). We compare the performance of the methods on random test problems by recording the number of iterations and the time each method requires to find the weighted analytic center of each test problem. We use an exact line search based on Newton's method and an inexact line search based on quadratic interpolation. Our numerical results show that PR is the best method, followed by HS, then RAMI, and then FR; however, PR and HS performed about the same with the exact line search. The results also indicate that both line searches work well, but the exact line search handles the weights better than the inexact line search when one weight is much larger than the others. We also find that, with the quadratic interpolation line search, FR is more susceptible to the jamming phenomenon than either PR or HS.
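The β-update formulas compared in the abstract, combined with a quadratic-interpolation line search, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function names are ours, the demo objective is a simple strictly convex quadratic rather than the log-det barrier of an LMI system, and the RAMI formula is omitted since its exact form is not given here. A real weighted-analytic-center solver would also cap the step length to keep the iterate strictly feasible.

```python
import numpy as np

def quad_interp_line_search(phi, t_hi=1.0, rounds=8):
    """Inexact line search: successive parabolic (quadratic) interpolation
    of phi(t) on [0, t_hi]. For an LMI barrier, t_hi would first be capped
    to preserve strict feasibility."""
    ts = [0.0, 0.5 * t_hi, t_hi]
    fs = [phi(t) for t in ts]
    for _ in range(rounds):
        (t0, t1, t2), (f0, f1, f2) = ts, fs
        denom = (t1 - t0) * (f1 - f2) - (t1 - t2) * (f1 - f0)
        if abs(denom) < 1e-14:
            break
        # vertex of the parabola through the three current points
        t_new = t1 - 0.5 * ((t1 - t0) ** 2 * (f1 - f2)
                            - (t1 - t2) ** 2 * (f1 - f0)) / denom
        if not (min(ts) < t_new < max(ts)) or \
                min(abs(t_new - t) for t in ts) < 1e-12:
            break
        f_new = phi(t_new)
        # keep the three points with the lowest values, re-sorted by t
        best = sorted(sorted(zip(ts + [t_new], fs + [f_new]),
                             key=lambda p: p[1])[:3])
        ts, fs = [p[0] for p in best], [p[1] for p in best]
    return min(zip(fs, ts))[1]          # step with the smallest phi seen

def cg_minimize(f, grad, x0, beta_rule="PR", tol=1e-8, max_iter=200):
    """Nonlinear CG with a selectable classical beta formula (FR, PR, HS)."""
    x = np.asarray(x0, float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = quad_interp_line_search(lambda s: f(x + s * d))
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        if beta_rule == "FR":
            beta = (g_new @ g_new) / (g @ g)
        elif beta_rule == "PR":
            beta = (g_new @ y) / (g @ g)
        else:                            # HS, with a small-denominator guard
            beta = (g_new @ y) / (d @ y) if abs(d @ y) > 1e-14 else 0.0
        beta = max(beta, 0.0)            # restart safeguard (non-negative beta)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Demo on a simple strictly convex quadratic (minimizer at [1, 1]):
D = np.diag([1.0, 10.0])
f = lambda x: 0.5 * (x - 1) @ D @ (x - 1)
grad = lambda x: D @ (x - 1)
x_star = cg_minimize(f, grad, np.zeros(2), beta_rule="PR")
```

Truncating β at zero, as above, is the common non-negative restart variant (PR+ in the PR case); on a quadratic objective the parabolic fit is exact, so the line search here behaves like an exact one.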

DOI 10.11648/j.ajam.20200801.11
Published in American Journal of Applied Mathematics (Volume 8, Issue 1, February 2020)
Page(s) 1-10
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2020. Published by Science Publishing Group

Keywords

Linear Matrix Inequalities, Weighted Analytic Center, Semidefinite Programming, Conjugate Gradient Methods

References
[1] F. Alizadeh, “Interior Point Methods in Semidefinite Programming with Applications to Combinatorial Optimization”, SIAM Journal on Optimization, vol. 5, no. 1, 1995, pp. 13-51.
[2] L. Vandenberghe and S. Boyd, “Semidefinite Programming”, SIAM Review, vol. 38, 1996, pp. 49-95.
[3] F. Alizadeh, J. A. Haeberly and M. Overton, “Primal-Dual Methods for Semidefinite Programming: Convergence Rates, Stability and Numerical Results”, SIAM Journal on Optimization, vol. 8, no. 3, 1998, pp. 746-768.
[4] J. Renegar, “A polynomial-time algorithm, based on Newton’s method, for linear programming”, Math. Programming, vol. 40, 1988, pp. 59–93.
[5] M. J. Todd, K. C. Toh and R. H. Tütüncü, “On the Nesterov-Todd direction in semidefinite programming”, SIAM J. Optim., vol. 8, 1998, pp. 769-796.
[6] L. Vandenberghe, S.-P. Boyd and S. Wu, “Determinant Maximization with Linear Matrix Inequality Constraints”, SIAM Journal on Matrix Analysis and Applications, vol. 19, no. 2, 1998, pp. 499-533.
[7] I. S. Pressman and S. Jibrin, “A Weighted Analytic Center for Linear Matrix Inequalities”, Journal of Inequalities in Pure and Applied Mathematics, Vol. 2, Issue 3, Article 29, 2002.
[8] S. Jibrin and J. W. Swift, “The Boundary of the Weighted Analytic Center for Linear Matrix inequalities.” Journal of Inequalities in Pure and Applied Mathematics, Vol. 5, Issue 1, Article 14, 2004.
[9] J. Machacek and S. Jibrin, “An Interior Point Method for Solving Semidefinite Programs Using Cutting Planes and Weighted Analytic Centers”, Journal of Applied Mathematics, Vol. 2012, Article ID 946893, 21 pages.
[10] D. S. Atkinson and P. M. Vaidya, “A scaling technique for finding the weighted analytic center of a polytope”, Math. Prog., vol. 57, 1992, pp. 163–192.
[11] S. Jibrin, “Computing Weighted Analytic Center for Linear Matrix Inequalities Using Infeasible Newton’s Method”, Journal of Mathematics, vol. 2015, Article ID 456392, 2015.
[12] S. Jibrin, I. Abdullahi, “Search Directions in Infeasible Newton’s method for Computing Weighted Analytic Center for Linear Matrix Inequalities”, Applied and Computational Mathematics, vol. 8, no. 1, 2019, pp. 21-28.
[13] S. Boyd and L. Vandenberghe, “Convex Optimization”, Cambridge University Press, New York, 2004.
[14] W. W. Hager, H. Zhang, “A survey of nonlinear conjugate gradient methods”, Pacific Journal of Optimization, vol. 2, no. 1, 2006, pp. 35-58.
[15] M. Rivaie, A. Abashar, M. Mamat and M. Ismail, “The convergence properties of a new type of conjugate gradient methods”, Applied Mathematical Sciences, vol. 9, no. 54, 2016, pp. 2671-2682.
[16] I. Abdullahi, R. Ahmad, “Convergence Analysis of a New Conjugate Gradient Method for Unconstrained Optimization”, Applied Mathematical Sciences, vol. 9, no. 140, 2015, pp. 6969-6984.
[17] I. Abdullahi, R. Ahmad, “Global Convergence Analysis of a Nonlinear Conjugate Gradient Method for Unconstrained Optimization Problems”, Indian Journal of Science and Technology, vol. 9, no. 41, DOI: 10.17485/ijst/2016/v9i41/90175, 2016.
[18] I. Abdullahi, R. Ahmad, “Global convergence analysis of a new hybrid conjugate gradient method for unconstrained optimization problems”, Malaysian Journal of Fundamental and Applied Sciences, vol. 2, no. 13, 2017, pp. 40-48.
[19] X. Du, J. Liu, “Global convergence of a spectral HS conjugate gradient method”, Procedia Engineering, vol. 15, 2011, pp. 1487-1492.
[20] W. W. Hager, H. Zhang, “A new conjugate gradient method with guaranteed descent and an efficient line search”, SIAM Journal on Optimization, vol. 16, no. 1, 2005, pp. 170-192.
[21] J. Liu, Y. Feng, “Global convergence of a modified LS nonlinear conjugate gradient method”, Procedia Engineering, vol. 15, 2011, pp. 4357-4361.
[22] Z. Wei, G. Li, L. Qi, “New nonlinear conjugate gradient formulas for large-scale unconstrained optimization problems”, Applied Mathematics and Computation, vol. 179, no. 2, 2006, pp. 407-430.
[23] Y. Zhang, H. Zheng, C. Zhang, “Global convergence of a modified PRP conjugate gradient method”, Procedia Engineering, vol. 31, 2012, pp. 986-995.
[24] R. Fletcher, C. M. Reeves, “Function minimization by conjugate gradients”, The Computer Journal, vol. 7, no. 2, 1964, pp. 149-154.
[25] B. T. Polyak, “The conjugate gradient method in extremal problems”, USSR Computational Mathematics and Mathematical Physics, vol. 9, no. 4, 1969, pp. 94-112.
[26] M. R. Hestenes, E. Stiefel, “Methods of conjugate gradients for solving linear systems”, Journal of Research of the National Bureau of Standards, vol. 49, 1952, pp. 409-436.
[27] S. Jibrin and I. S. Pressman, “Monte Carlo Algorithms for the Detection of Necessary Linear Matrix Inequalities”, International Journal of Nonlinear Sciences and Numerical Simulation, vol. 2, no. 2, 2001, pp. 139-154.
[28] R. L. Burden and J. D. Faires, “Numerical Analysis”, 9th Ed., Addison Wesley, 2011.
[29] G. Zoutendijk, “Nonlinear Programming, Computational Methods”, in Integer and Nonlinear Programming, J. Abadie, ed., North-Holland, Amsterdam, 1970, pp. 37-86.
[30] E. Polak and G. Ribière, “Note sur la convergence de méthodes de directions conjuguées”, Revue Française d’Informatique et de Recherche Opérationnelle, 3e année, no. 16, 1969, pp. 35-43.
[31] M. J. D. Powell, “Restart procedures for the conjugate gradient method”, Math. Prog., vol. 12, 1977, pp. 241-254.
Author Information
  • Shafiu Jibrin, Department of Mathematics, Faculty of Science, Federal University, Dutse, Jigawa State, Nigeria

  • Ibrahim Abdullahi, Department of Mathematics, Faculty of Science, Federal University, Dutse, Jigawa State, Nigeria

Cite This Article
  • APA Style

    Shafiu Jibrin, Ibrahim Abdullahi. (2020). Conjugate Gradient Methods for Computing Weighted Analytic Center for Linear Matrix Inequalities Using Exact and Quadratic Interpolation Line Searches. American Journal of Applied Mathematics, 8(1), 1-10. https://doi.org/10.11648/j.ajam.20200801.11



  • @article{10.11648/j.ajam.20200801.11,
      author = {Shafiu Jibrin and Ibrahim Abdullahi},
      title = {Conjugate Gradient Methods for Computing Weighted Analytic Center for Linear Matrix Inequalities Using Exact and Quadratic Interpolation Line Searches},
      journal = {American Journal of Applied Mathematics},
      volume = {8},
      number = {1},
      pages = {1-10},
      doi = {10.11648/j.ajam.20200801.11},
      url = {https://doi.org/10.11648/j.ajam.20200801.11},
      eprint = {https://download.sciencepg.com/pdf/10.11648.j.ajam.20200801.11},
      year = {2020}
    }
    


  • TY  - JOUR
    T1  - Conjugate Gradient Methods for Computing Weighted Analytic Center for Linear Matrix Inequalities Using Exact and Quadratic Interpolation Line Searches
    AU  - Shafiu Jibrin
    AU  - Ibrahim Abdullahi
    Y1  - 2020/01/04
    PY  - 2020
    N1  - https://doi.org/10.11648/j.ajam.20200801.11
    DO  - 10.11648/j.ajam.20200801.11
    T2  - American Journal of Applied Mathematics
    JF  - American Journal of Applied Mathematics
    JO  - American Journal of Applied Mathematics
    SP  - 1
    EP  - 10
    PB  - Science Publishing Group
    SN  - 2330-006X
    UR  - https://doi.org/10.11648/j.ajam.20200801.11
    VL  - 8
    IS  - 1
    ER  - 

