Matrix Iteratively Reweighted Least Squares (`MatrixIRLS`) for low-rank matrix completion.
This repository contains a MATLAB implementation of the algorithm `MatrixIRLS` described in the papers Escaping Saddle Points in Ill-Conditioned Matrix Completion with a Scalable Second Order Method (ICML 2020 Workshop: "Beyond First Order Methods in ML Systems") and A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples (ICML 2021).
`MatrixIRLS` is an algorithm that minimizes the sum of logarithms of the singular values of a matrix subject to an entry-wise data constraint, using Iteratively Reweighted Least Squares (IRLS) steps based on an optimal weight operator combined with a suitable smoothing strategy for the objective.
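To convey the flavor of such an IRLS scheme, here is a deliberately simplified, dense sketch in Python (rather than MATLAB). It uses a one-sided, column-space weight operator, closer in spirit to the `IRLS-col` baseline discussed below than to the optimal weight operator of `MatrixIRLS`; the function name, smoothing schedule, and parameter choices are ours, purely for illustration:

```python
import numpy as np

def toy_irls_completion(y, mask, n_iters=30, eps0=1.0, eps_decay=0.7, eps_min=1e-10):
    """Toy one-sided IRLS for matrix completion (illustrative only).

    y    : (m, n) array holding the observed values (arbitrary elsewhere)
    mask : (m, n) boolean array, True where an entry is observed
    """
    m, n = y.shape
    X = np.where(mask, y, 0.0)  # start from the zero-filled data
    eps = eps0
    for _ in range(n_iters):
        # One-sided weight operator from the current iterate:
        # W = (X X^T + eps^2 I)^(-1), a smoothed inverse column Gram matrix.
        W = np.linalg.inv(X @ X.T + eps**2 * np.eye(m))
        # The quadratic surrogate <W X, X> decouples over columns: for each
        # column, minimize x^T W x over the unobserved entries, with the
        # observed entries held fixed (the entry-wise data constraint).
        for j in range(n):
            obs = mask[:, j]
            un = ~obs
            if not un.any():
                continue
            # First-order condition: W_uu x_u + W_uo x_o = 0
            X[un, j] = -np.linalg.solve(W[np.ix_(un, un)],
                                        W[np.ix_(un, obs)] @ y[obs, j])
        eps = max(eps * eps_decay, eps_min)  # smoothing parameter decreases
    return X
```

The decreasing smoothing parameter `eps` plays the role of the smoothing strategy mentioned above: it keeps the weight operator well defined while the iterates approach low rank. This dense sketch scales poorly; the point of `MatrixIRLS` is precisely to avoid such dense computations.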
The implementation uses an adaptation of `bksvd` (Block Krylov Singular Value Decomposition) by Cameron Musco and Christopher Musco to compute singular values and vectors of matrices in a "low-rank + sparse" format.
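Block Krylov methods such as `bksvd` only ever touch the matrix through matrix–vector products, which is what makes the "low-rank + sparse" format attractive: a matrix of the form M = U Vᵀ + S, with tall factors U, V and sparse S, never has to be formed densely. A minimal Python illustration of this idea (the variable names are ours, not the repository's):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 300, 200, 5

# Low-rank part M_lr = U @ V.T, stored only through its factors.
U = rng.standard_normal((m, r))
V = rng.standard_normal((n, r))

# Sparse part S in coordinate (COO) form: nnz entries at (rows[i], cols[i]).
nnz = 500
rows = rng.integers(0, m, nnz)
cols = rng.integers(0, n, nnz)
vals = rng.standard_normal(nnz)

def matvec(x):
    """Apply M = U V^T + S to x without forming M densely.

    Cost: O((m + n) r + nnz) flops instead of O(m n).
    """
    out = U @ (V.T @ x)                   # low-rank part
    np.add.at(out, rows, vals * x[cols])  # sparse part (handles duplicates)
    return out
```

Feeding such a `matvec` to a block Krylov iteration yields leading singular values and vectors at a per-iteration cost proportional to the size of the factors plus the number of sparse entries.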
The repository also contains a collection of reference algorithms for low-rank matrix completion; see the list below. In the experimental section of the paper, `MatrixIRLS` is compared to these algorithms in terms of data efficiency (performance for few provided entries) and scalability. We provide the implementations of the reference algorithms for the user's convenience in the folder `algorithms/`. These implementations contain only minor modifications of the authors' original code (to allow for timing experiments). Please refer to the respective research papers and original implementations for a description of standard parameter choices.
If you refer to the method, the code or the articles, please cite them as:
```
@inproceedings{kuemmerlemayrinkverdun2021,
  title     = {A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples},
  author    = {K{\"u}mmerle, Christian and Mayrink Verdun, Claudio},
  booktitle = {International Conference on Machine Learning (ICML)},
  year      = {2021}
}

@inproceedings{kuemmerlemayrinkverdun2020,
  title     = {Escaping Saddle Points in Ill-Conditioned Matrix Completion with a Scalable Second Order Method},
  author    = {K{\"u}mmerle, Christian and Mayrink Verdun, Claudio},
  booktitle = {Workshop on Beyond First Order Methods in ML Systems at the $37^{th}$ International Conference on Machine Learning},
  year      = {2020}
}
```
- Open MATLAB and run `setup_MatrixIRLS`, or alternatively add the subfolders manually to your path.
The main implementation can be found in the folder `algorithms/MatrixIRLS`; the reference algorithms can be found in subfolders of `algorithms`. Some of the algorithms use methods for sparse access of factorized matrices, implemented in C and compiled as `.mex` files, as well as other auxiliary files contained in the `tools` folder.
- `demo_MatrixIRLS`: A minimal example of how to use the main implementation `MatrixIRLS.m`.
In the folder `examples/`, you find the following example scripts:
- `example_compare_MC_algos_badlycond`: Compares several algorithms for low-rank matrix completion on a badly conditioned matrix with condition number $\kappa := \frac{\sigma_1(\mathbf{X}_0)}{\sigma_r(\mathbf{X}_0)} = 10^6$. For this instance, it calculates the reconstructions of the algorithms and visualizes the error decay, both per iteration and with respect to algorithmic runtime.
- `example_compare_MC_algos_wellcond`: As above, but with a well-conditioned matrix with $\kappa = 2$.
- `example_compare_HMIRLS`: Compares `MatrixIRLS` with `HM-IRLS` of [KS18] for a medium-size problem. The algorithm of [KS18] is somewhat similar, but `MatrixIRLS` exhibits improved scalability by orders of magnitude.
- `example_compare_IRLS_various`: Compares the iteratively reweighted least squares algorithms `MatrixIRLS`, a similar fast implementation of IRLS with the suboptimal weight operator `'arithmetic'`, and the algorithms `IRLS-col` and `IRLS-row` based on [FRW11], using column and row reweighting, respectively.
- `example_compare_MatrixIRLS_IRucLq_sIRLS`: Compares `MatrixIRLS` with the IRLS variants `IRLS-p` and `sIRLS-p` of [MF12] and `IRucLq` and `tIRucLq` of [LXY13] (the latter solves an unconstrained formulation, minimizing a rank surrogate plus a data-fit term). Illustrates the advantage of `MatrixIRLS` in terms of speed and accuracy.
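If you want to set up comparable experiments outside the provided scripts, a rank-$r$ ground truth with a prescribed condition number $\kappa = \sigma_1/\sigma_r$ can be built directly from an SVD-style factorization. The Python sketch below is a generic construction of our own, not the repository's sampling code:

```python
import numpy as np

def illconditioned_lowrank(m, n, r, kappa, seed=0):
    """Rank-r matrix X0 with condition number sigma_1(X0)/sigma_r(X0) = kappa."""
    rng = np.random.default_rng(seed)
    # Random orthonormal singular-vector factors via QR.
    U, _ = np.linalg.qr(rng.standard_normal((m, r)))
    V, _ = np.linalg.qr(rng.standard_normal((n, r)))
    # Singular values interpolated geometrically between kappa and 1,
    # so that sigma_1/sigma_r equals kappa exactly.
    sigma = kappa ** np.linspace(1.0, 0.0, r)
    return U @ np.diag(sigma) @ V.T

# e.g., a 100 x 80 rank-5 matrix with condition number 10^6
X0 = illconditioned_lowrank(100, 80, 5, kappa=1e6)
s = np.linalg.svd(X0, compute_uv=False)
```

Sampling a few entries of such an `X0` at random then yields completion problems of the badly conditioned type used in the examples above.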
In the folder `experiments/`, you find the scripts reproducing the experiments of the ICML 2021 paper.
We gratefully acknowledge the authors of the following matrix completion algorithms. For re-use of the algorithms in the subfolders of `algorithms/`, with the exception of `MatrixIRLS`, please refer to the provided links and contact the authors if the respective terms of usage are unclear.
- `ASD` (Alternating Steepest Descent) and `ScaledASD` (Scaled Alternating Steepest Descent) by Jared Tanner and Ke Wei [TW16], available at Ke Wei's website.
- `NIHT` (Normalized Iterative Hard Thresholding [TW12]) and `CGIHT` (Conjugate Gradient Iterative Hard Thresholding [BTW15]) by Jeffrey Blanchard, Jared Tanner and Ke Wei, available at Ke Wei's website.
- `LMaFit` (Low-rank Matrix Fitting algorithm [WYZ12]) by Zaiwen Wen, Wotao Yin and Yin Zhang, available here.
- `LRGeomCG` (Low-rank matrix completion by Geometric CG [V13]) by Bart Vandereycken, available at Bart Vandereycken's website.
- `LRGeomCG Pursuit` (also called Riemannian Pursuit) [TTWVP14], [UV15], a Riemannian optimization method using adaptive updates of the rank of the fixed-rank manifold used in the optimization.
- `MatrixIRLS` (Matrix Iteratively Reweighted Least Squares). This paper/repository.
- `HM-IRLS` (Harmonic Mean Iteratively Reweighted Least Squares [KS18]) and `AM-IRLS` by Christian Kümmerle and Juliane Sigl, available here.
- `IRLS-p` and `sIRLS-p`: IRLS with a weight operator acting on the row space only, solving linear systems by gradient descent [MF12], by Karthik Mohan and Maryam Fazel, available at Maryam Fazel's website.
- `IRucLq` and `tIRucLq` ((truncated) Iterative Reweighted unconstrained Lq for low-rank matrix recovery [LXY13]) by Ming-Jun Lai, Yangyang Xu and Wotao Yin, available at Yangyang Xu's website.
- `IRLS-col` and `IRLS-row`: IRLS with weight operators that act on the column or row space, respectively, and thus very similar to the algorithms of [FRW11] and [MF12]. Main purpose: to illustrate the influence of the choice of weight operator.
- `R2RILS` (Rank 2r Iterative Least Squares [BNZ21]) by Jonathan Bauch and Boaz Nadler, available here and here; see also here.
- `R3MC` (Riemannian three-factor algorithm for low-rank matrix completion [MS14]) by Bamdev Mishra and Rodolphe Sepulchre, available at Bamdev Mishra's website. We also include `R3MC-rankupd`, a variant of `R3MC` which optimizes on fixed-rank manifolds with increasing rank (see also [MS14]).
- `RTRMC` (Riemannian trust-region method for low-rank matrix completion [BA15]) by Nicholas Boumal and Pierre-Antoine Absil, available here. The version provided in this repository uses Manopt 6.0.
- `ScaledGD` (Scaled Gradient Descent [TMC20]) by Tian Tong, Cong Ma and Yuejie Chi, available here.
- Christian Kümmerle ([email protected])
- Claudio Mayrink Verdun ([email protected])
If you have any problems or questions, please contact us, for example via e-mail.
- [KS18] Christian Kümmerle and Juliane Sigl, Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery. J. Mach. Learn. Res., 19(47):1–49, 2018.
- [FRW11] Massimo Fornasier, Holger Rauhut and Rachel Ward, Low-rank matrix recovery via iteratively reweighted least squares minimization. SIAM J. Optim., 21(4):1614–1640, 2011.
- [MF12] Karthik Mohan and Maryam Fazel, Iterative reweighted algorithms for matrix rank minimization, J. Mach. Learn. Res., 13 (1):3441–3473, 2012.
- [LXY13] Ming-Jun Lai, Yangyang Xu and Wotao Yin, Improved iteratively reweighted least squares for unconstrained smoothed $\ell_q$ minimization, SIAM J. Numer. Anal., 51(2):927-957, 2013.
- [TW16] Jared Tanner and Ke Wei, Low rank matrix completion by alternating steepest descent methods. Appl. Comput. Harmon. Anal., 40(2):417–429, 2016.
- [TW12] Jared Tanner and Ke Wei, Normalized Iterative Hard Thresholding for Matrix Completion, SIAM J. Sci. Comput., 35(5):S104–S125, 2012.
- [BTW15] Jeffrey D. Blanchard, Jared Tanner and Ke Wei, CGIHT: conjugate gradient iterative hard thresholding for compressed sensing and matrix completion, Inf. Inference, 4(4):289-327, 2015.
- [WYZ12] Zaiwen Wen, Wotao Yin and Yin Zhang, Solving a low-rank factorization model for matrix completion by a nonlinear successive over-relaxation algorithm, Math. Prog. Comp. 4(4):333–361, 2012.
- [V13] Bart Vandereycken, Low-Rank Matrix Completion by Riemannian Optimization, SIAM J. Optim., 23(2):1214-1236, 2013.
- [TTWVP14] Mingkui Tan, Ivor W. Tsang, Li Wang, Bart Vandereycken, Sinno Jialin Pan, Riemannian Pursuit for Big Matrix Recovery, Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):1539-1547, 2014.
- [UV15] André Uschmajew, Bart Vandereycken, Greedy rank updates combined with Riemannian descent methods for low-rank optimization, In 2015 International Conference on Sampling Theory and Applications (SampTA) (pp. 420-424). IEEE.
- [BNZ21] Jonathan Bauch, Boaz Nadler and Pini Zilber, Rank 2r iterative least squares: efficient recovery of ill-conditioned low rank matrices from few entries, SIAM J. Math. Data Sci., 3(1):439–465, 2021.
- [MS14] Bamdev Mishra and Rodolphe Sepulchre, R3MC: A Riemannian three-factor algorithm for low-rank matrix completion, In 53rd IEEE Conference on Decision and Control, 1137-1142. IEEE, 2014.
- [BA15] Nicholas Boumal and Pierre-Antoine Absil, Low-rank matrix completion via preconditioned optimization on the Grassmann manifold, Linear Algebra Appl., 475:200–239, 2015.
- [TMC20] Tian Tong, Cong Ma and Yuejie Chi, Accelerating Ill-Conditioned Low-Rank Matrix Estimation via Scaled Gradient Descent, arXiv preprint, arXiv:2005.08898, 2020.