Non-convex matrix sensing: Breaking the quadratic rank barrier in the sample complexity




arXiv:2408.13276v1 Announce Type: new
Abstract: In the problem of reconstructing a low-rank matrix from a number of linear measurements, two classes of algorithms have been widely studied in the literature: convex approaches based on nuclear norm minimization, and non-convex approaches that use factorized gradient descent. Under certain statistical model assumptions, it is known that nuclear norm minimization recovers the ground truth as soon as the number of samples scales linearly with the number of degrees of freedom of the ground-truth matrix. In contrast, while non-convex approaches are computationally less expensive, existing recovery guarantees assume that the number of samples scales at least quadratically with the rank $r$ of the ground-truth matrix. In this paper, we close this gap by showing that the non-convex approaches can be as efficient as nuclear norm minimization in terms of sample complexity. Specifically, we consider the problem of reconstructing a positive semidefinite matrix from a number of Gaussian measurements. We show that factorized gradient descent with spectral initialization converges to the ground truth at a linear rate as soon as the number of samples scales with $\Omega(rd\kappa^2)$, where $d$ is the dimension and $\kappa$ is the condition number of the ground-truth matrix. This improves the previous rank-dependence from quadratic to linear. Our proof relies on a probabilistic decoupling argument, where we show that the gradient descent iterates are only weakly dependent on the individual entries of the measurement matrices. We expect that our proof technique is of independent interest for other non-convex problems.
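The abstract refers to factorized gradient descent with spectral initialization for positive semidefinite matrix sensing from Gaussian measurements. The following is a minimal NumPy sketch of that setup, assuming measurements of the form $y_i = \langle A_i, M^\star \rangle$ with i.i.d. Gaussian measurement matrices $A_i$; the step size, sample size, and iteration count below are illustrative choices made for this sketch, not the constants from the paper's analysis.

```python
# Minimal sketch: factorized gradient descent with spectral initialization
# for PSD matrix sensing (illustrative parameters, not the paper's constants).
import numpy as np

rng = np.random.default_rng(0)
d, r, m = 50, 2, 2000            # dimension, rank, number of samples (assumed)

# Ground-truth PSD matrix M* = U* U*^T
U_star = rng.standard_normal((d, r))
M_star = U_star @ U_star.T

# Gaussian measurement matrices and linear measurements y_i = <A_i, M*>
A = rng.standard_normal((m, d, d))
y = np.einsum('kij,ij->k', A, M_star)

# Spectral initialization: top-r eigenpairs of the symmetrized
# back-projection (1/m) * sum_i y_i A_i
S = np.einsum('k,kij->ij', y, A) / m
S = (S + S.T) / 2
vals, vecs = np.linalg.eigh(S)
idx = np.argsort(vals)[::-1][:r]
U = vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

# Factorized gradient descent on f(U) = (1/4m) * sum_i (<A_i, UU^T> - y_i)^2
eta = 0.2 / np.linalg.norm(M_star, 2)   # heuristic step size
for _ in range(300):
    residuals = np.einsum('kij,ij->k', A, U @ U.T) - y
    A_sym = (A + A.transpose(0, 2, 1)) / 2
    grad = np.einsum('k,kij->ij', residuals, A_sym) @ U / m
    U -= eta * grad

print('relative error:', np.linalg.norm(U @ U.T - M_star) / np.linalg.norm(M_star))
```

With the (generous) sample size chosen here, the iterates typically converge to the ground truth up to numerical precision; the paper's contribution is showing that $\Omega(rd\kappa^2)$ samples already suffice for such convergence.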



