Title: A convergent relaxation of the Douglas–Rachford algorithm
Author: Nguyen, Hieu Thao (TU Delft, Team Raf Van de Plas)
Date: 2018
Abstract: This paper proposes an algorithm for solving structured optimization problems that covers both the backward–backward and the Douglas–Rachford algorithms as special cases, and analyzes its convergence. The set of fixed points of the corresponding operator is characterized in several cases. Convergence criteria for the algorithm are established in terms of general fixed point iterations. When applied to nonconvex feasibility problems, including potentially inconsistent ones, we prove local linear convergence results under mild assumptions on the regularity of the individual sets and of the collection of sets. In this special case, we refine known linear convergence criteria for the Douglas–Rachford (DR) algorithm. As a consequence, for feasibility problems in which one of the sets is affine, we establish criteria for linear and sublinear convergence of convex combinations of the alternating projection and DR methods. These results appear to be new. We also demonstrate the seemingly improved numerical performance of this algorithm compared to the RAAR algorithm for both consistent and inconsistent sparse feasibility problems.
Subject: Almost averagedness; Alternating projection method; Collection of sets; Douglas–Rachford method; Krasnoselski–Mann relaxation; Metric subregularity; Picard iteration; RAAR algorithm; Transversality
To reference this document use: http://resolver.tudelft.nl/uuid:21e0d8ca-9d31-4d54-b6bf-4fa61806ad03
DOI: https://doi.org/10.1007/s10589-018-9989-y
ISSN: 0926-6003
Source: Computational Optimization and Applications, 70 (3), 841-863
Part of collection: Institutional Repository
Document type: journal article
Rights: © 2018 Hieu Thao Nguyen
Files: Thao2018_Article_AConverg ... oug_1_.pdf (650.59 KB)
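The abstract mentions convex combinations of the alternating projection (AP) and Douglas–Rachford (DR) operators for two-set feasibility. The sketch below illustrates that construction on a toy convex problem, not the paper's exact relaxed operator or its analyzed setting: the example sets (an affine line and a ball), the relaxation parameter name `lam`, and the iteration count are all choices made here for illustration.

```python
import numpy as np

def proj_affine(x):
    # Projection onto the affine set A = {x : x[1] = 1} (example set).
    y = x.copy()
    y[1] = 1.0
    return y

def proj_ball(x, radius=2.0):
    # Projection onto the ball B of the given radius centered at the origin.
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

def dr_step(x):
    # Douglas-Rachford operator T_DR = (1/2)(R_A R_B + I),
    # where R_C = 2 P_C - I is the reflector across set C.
    rb = 2.0 * proj_ball(x) - x
    ra = 2.0 * proj_affine(rb) - rb
    return 0.5 * (ra + x)

def ap_step(x):
    # Alternating projection operator P_A P_B.
    return proj_affine(proj_ball(x))

def relaxed_step(x, lam=0.5):
    # Convex combination of the DR and AP operators; the abstract studies
    # such combinations when one of the sets is affine, as A is here.
    return lam * dr_step(x) + (1.0 - lam) * ap_step(x)

x = np.array([3.0, 3.0])
for _ in range(100):
    x = relaxed_step(x)
# For this convex, consistent example the iterates settle at a point of
# the intersection A ∩ B (here, a point with x[1] = 1 and norm <= 2).
```

Since both sets here are convex and intersect, the combined operator is averaged and its fixed points lie in the intersection; the nonconvex and inconsistent cases treated in the paper require the regularity assumptions discussed in the abstract.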