Conical averagedness and convergence analysis of fixed point algorithms
- Authors: Bartz, Sedi; Dao, Minh; Phan, Hung
- Date: 2022
- Type: Text; Journal article
- Relation: Journal of Global Optimization Vol. 82, no. 2 (2022), p. 351-373
- Description: We study a conical extension of averaged nonexpansive operators and the role it plays in the convergence analysis of fixed point algorithms. Various properties of conically averaged operators are systematically investigated, in particular their stability under relaxations, convex combinations, and compositions. We derive conical averagedness properties of resolvents of generalized monotone operators, which are then used to analyze the convergence of the proximal point algorithm, the forward–backward algorithm, and the adaptive Douglas–Rachford algorithm. Our study unifies, improves, and casts new light on recent studies of these topics. © 2021, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
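For orientation, the central notion can be sketched in standard notation (a common formulation of conical averagedness; the paper's exact parameterization may differ):

```latex
% An operator T on a Hilbert space H is conically averaged with
% parameter \theta > 0 if it can be written as
T = (1 - \theta)\,\mathrm{Id} + \theta N,
\qquad N \text{ nonexpansive},\ \theta > 0.
% For \theta \in (0,1) this recovers the classical notion of a
% \theta-averaged operator; allowing \theta \geq 1 is the conical extension.
```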
An adaptive splitting algorithm for the sum of two generalized monotone operators and one cocoercive operator
- Authors: Dao, Minh; Phan, Hung
- Date: 2021
- Type: Text; Journal article
- Relation: Fixed Point Theory and Algorithms for Sciences and Engineering Vol. 2021, no. 1 (2021), p.
- Description: Splitting algorithms for finding a zero of a sum of operators often involve multiple steps, referred to as forward or backward steps. Forward steps use the operators explicitly, while backward steps involve them implicitly via their resolvents. In this paper, we study an adaptive splitting algorithm for finding a zero of the sum of three operators. We assume that two of the operators are generalized monotone and that their resolvents are computable, while the third operator is cocoercive but its resolvent is missing or costly to compute. Our splitting algorithm adapts its parameters to the generalized monotonicity of the operators and, at the same time, combines appropriate forward and backward steps to guarantee convergence to a solution of the problem. © 2021, The Author(s).
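The forward/backward terminology in the abstract can be illustrated with the classical two-operator forward–backward iteration (an illustrative sketch only, not the paper's adaptive three-operator scheme; the toy problem below is an assumption chosen for concreteness):

```python
# Sketch of forward and backward steps on a toy problem:
# minimize f(x) + g(x) with f(x) = lam * |x| (resolvent/prox is known)
# and g(x) = 0.5 * (x - 2)**2 (smooth, with cocoercive gradient).

def prox_abs(x, t):
    """Backward step: resolvent of t * subdifferential of |.| (soft-threshold)."""
    return max(abs(x) - t, 0.0) * (1.0 if x > 0 else -1.0)

def forward_backward(x0, lam=0.5, gamma=1.0, iters=50):
    x = x0
    for _ in range(iters):
        y = x - gamma * (x - 2.0)      # forward step: explicit gradient of g
        x = prox_abs(y, gamma * lam)   # backward step: resolvent of gamma * df
    return x
```

With `lam = 0.5` the minimizer of `0.5*|x| + 0.5*(x-2)**2` is `x = 1.5`, and the iteration reaches it quickly from any starting point.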
Computing the resolvent of the sum of operators with application to best approximation problems
- Authors: Dao, Minh; Phan, Hung
- Date: 2019
- Type: Text; Journal article
- Relation: Optimization Letters Vol. 14, no. 5 (2019), p. 1193-1205
- Full Text: false
- Description: We propose a flexible approach for computing the resolvent of the sum of weakly monotone operators in real Hilbert spaces. The approach relies on splitting methods for which strong convergence is guaranteed, and we also prove linear convergence under a Lipschitz continuity assumption. It is then applied to computing the proximity operator of the sum of weakly convex functions, and in particular to finding the best approximation to the intersection of convex sets.
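For context, the object being computed admits a standard characterization (textbook resolvent notation; a sketch, not the paper's algorithm):

```latex
% Resolvent of the sum A + B at a point x:
J_{A+B}(x) = (\mathrm{Id} + A + B)^{-1}(x),
% i.e., z = J_{A+B}(x) is characterized by the inclusion
x - z \in A z + B z.
% When A = \partial f and B = \partial g for convex f, g, this is the
% proximity operator of f + g, which the paper computes via splitting
% even when only J_A and J_B are individually available.
```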
On Slater’s condition and finite convergence of the Douglas–Rachford algorithm for solving convex feasibility problems in Euclidean spaces
- Authors: Bauschke, Heinz; Dao, Minh; Noll, Dominikus; Phan, Hung
- Date: 2016
- Type: Text; Journal article
- Relation: Journal of Global Optimization Vol. 65, no. 2 (2016), p. 329-349
- Full Text: false
- Description: The Douglas–Rachford algorithm is a classical and very successful method for solving optimization and feasibility problems. In this paper, we provide novel conditions sufficient for finite convergence in the context of convex feasibility problems. Our analysis builds upon, and considerably extends, pioneering work by Spingarn. Specifically, we obtain finite convergence in the presence of Slater’s condition in the affine-polyhedral and in a hyperplanar-epigraphical case. Various examples illustrate our results. Numerical experiments demonstrate the competitiveness of the Douglas–Rachford algorithm for solving linear equations with a positivity constraint when compared to the method of alternating projections and the method of reflection–projection.
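The setting in the abstract's numerical experiments can be sketched with the textbook Douglas–Rachford iteration for two-set convex feasibility (an illustrative sketch; the specific sets and starting point below are assumptions, not the paper's experiments):

```python
# Douglas-Rachford for: find x in A ∩ B, with
# A = nonnegative orthant (positivity constraint) and
# B = {x : <a, x> = b} (a linear equation), mirroring the abstract's
# "linear equations with a positivity constraint" experiments.

def proj_orthant(x):
    """Projection onto the nonnegative orthant."""
    return [max(xi, 0.0) for xi in x]

def proj_hyperplane(x, a, b):
    """Projection onto the hyperplane {x : <a, x> = b}."""
    scale = (sum(ai * xi for ai, xi in zip(a, x)) - b) / sum(ai * ai for ai in a)
    return [xi - scale * ai for ai, xi in zip(a, x)]

def douglas_rachford(x, a, b, iters=200):
    """Run the DR iteration; the 'shadow' point P_A(x_k) approaches A ∩ B."""
    for _ in range(iters):
        p = proj_orthant(x)                                 # backward-type step on A
        r = [2.0 * pi - xi for pi, xi in zip(p, x)]         # reflect through A
        q = proj_hyperplane(r, a, b)                        # project reflection onto B
        x = [xi + qi - pi for xi, qi, pi in zip(x, q, p)]   # DR update
    return proj_orthant(x)

sol = douglas_rachford([-3.0, 5.0], a=[1.0, 1.0], b=2.0)
# sol is a nonnegative point satisfying sol[0] + sol[1] = 2.
```

On this simple affine-polyhedral instance the governing sequence reaches a fixed point after only a few steps, consistent with the finite convergence phenomenon the paper studies.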