
Block coordinate proximal gradient methods with variable Bregman functions for nonsmooth separable optimization

  • Additional Information
    • Publication Information:
      Mathematical Programming, Springer Science and Business Media LLC, 2016.
    • Publication Date:
      2016
    • Abstract:
      In this paper, we propose a class of block coordinate proximal gradient (BCPG) methods for solving large-scale nonsmooth separable optimization problems. The proposed BCPG methods are based on Bregman functions, which may vary at each iteration. These methods include many well-known optimization methods, such as the quasi-Newton method, the block coordinate descent method, and the proximal point method. For the proposed methods, we establish their global convergence properties when the blocks are selected by the Gauss-Seidel rule. Further, under some additional appropriate assumptions, we show that the convergence rate of the proposed methods is R-linear. We also present numerical results for a new BCPG method with variable kernels for a convex problem with separable simplex constraints. (An illustrative block-update sketch follows this record.)
    • ISSN:
      1436-4646
      0025-5610
    • DOI:
      10.1007/s10107-015-0969-z
    • Rights:
      CLOSED
    • Accession Number:
      edsair.doi...........321842bd9b1f3b0719640af584e850b6
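As a rough illustration of the block coordinate proximal gradient template described in the abstract, the sketch below runs a cyclic (Gauss-Seidel) block update on an l1-regularized least-squares problem with a fixed Euclidean Bregman kernel, under which each block step reduces to ordinary soft-thresholding. The model, the block partition, and the names (bcpg_lasso, soft_threshold) are illustrative assumptions, not the paper's algorithm or its simplex-constrained test problem with variable kernels.

```python
# A minimal Gauss-Seidel BCPG sketch (illustrative; see lead-in above).
# Assumed model, not taken from the paper:
#   F(x) = 0.5 * ||A x - b||^2 + lam * ||x||_1,
# with the fixed Euclidean kernel h(x) = 0.5 * ||x||^2, so each block's
# Bregman proximal step is an ordinary proximal gradient (soft-threshold) step.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (closed form for the l1 block term)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bcpg_lasso(A, b, lam, blocks, n_sweeps=100):
    """Cyclic (Gauss-Seidel) block coordinate proximal gradient iteration."""
    n = A.shape[1]
    x = np.zeros(n)
    r = A @ x - b                                     # residual A x - b, updated per block
    for _ in range(n_sweeps):
        for idx in blocks:
            A_i = A[:, idx]
            grad_i = A_i.T @ r                        # block gradient of the smooth part
            L_i = np.linalg.norm(A_i, 2) ** 2 or 1.0  # block Lipschitz constant (squared spectral norm)
            x_old = x[idx].copy()
            # Euclidean-kernel Bregman proximal step = proximal gradient step on block i
            x[idx] = soft_threshold(x[idx] - grad_i / L_i, lam / L_i)
            r += A_i @ (x[idx] - x_old)               # keep the residual consistent
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 20))
    x_true = np.zeros(20)
    x_true[:3] = [2.0, -1.5, 1.0]
    b = A @ x_true + 0.01 * rng.standard_normal(40)
    blocks = [np.arange(i, min(i + 5, 20)) for i in range(0, 20, 5)]
    print(np.round(bcpg_lasso(A, b, lam=0.1, blocks=blocks), 2))
```

With a non-Euclidean, iteration-dependent kernel as in the paper's setting, the soft-thresholding line would be replaced by the corresponding Bregman proximal subproblem for the active block.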