Starting from some positive semi-definite $M^{(0)}$, does the sequence
$$ M^{(r+1)} = A U^{(r)} V^{(r)\mathsf{H}} B $$
converge, where $A, B$ are fixed positive semi-definite matrices and $U^{(r)}, V^{(r)}$ are the matrices of left and right singular vectors in an SVD $M^{(r)} = U^{(r)} \Sigma^{(r)} V^{(r)\mathsf{H}}$?
Numerical experiments suggest that the sequence does converge, but I have not been able to prove it.
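For concreteness, here is a minimal sketch of the kind of experiment I mean (the random Hermitian PSD inputs, the dimension, and the iteration count are arbitrary illustrative choices):

```python
# Sketch of the iteration M^{(r+1)} = A U^{(r)} V^{(r)H} B for random PSD inputs.
# Only numpy.linalg.svd is used; the stopping behaviour is just eyeballed here.
import numpy as np

rng = np.random.default_rng(0)
n = 5

def random_psd(n):
    """Random Hermitian positive semi-definite matrix X X^H."""
    X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return X @ X.conj().T

A, B, M = random_psd(n), random_psd(n), random_psd(n)

for r in range(200):
    U, _, Vh = np.linalg.svd(M)           # M = U diag(s) Vh
    M_next = A @ (U @ Vh) @ B             # M^{(r+1)} = A U V^H B
    diff = np.linalg.norm(M_next - M)     # successive difference in Frobenius norm
    M = M_next

print("last successive difference:", diff)
```

In the runs I tried, the successive differences appear to decay, but of course this proves nothing; it also sidesteps the rank-deficient case, since random iterates are generically full-rank.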
My attempt: $\lVert M^{(r+1)} - M^{(r)} \rVert = \lVert A (U^{(r)} V^{(r)\mathsf{H}} - U^{(r-1)} V^{(r-1)\mathsf{H}}) B \rVert \le \lVert A \rVert \cdot \lVert U^{(r)} V^{(r)\mathsf{H}} - U^{(r-1)} V^{(r-1)\mathsf{H}} \rVert \cdot \lVert B \rVert$, so it suffices to consider the convergence of the middle term. However, when $M^{(r)}$ is rank-deficient, the columns of $U^{(r)}, V^{(r)}$ corresponding to zero singular values can be chosen as arbitrary orthonormal bases, so $U^{(r)} V^{(r)\mathsf{H}}$ is not uniquely determined and $\lVert U^{(r)} V^{(r)\mathsf{H}} - U^{(r-1)} V^{(r-1)\mathsf{H}} \rVert$ need not converge.
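To make the rank-deficient ambiguity concrete (a small sketch; the $2 \times 2$ example is arbitrary):

```python
# Two valid SVD factor pairs of the same rank-deficient M give different U V^H.
import numpy as np

M = np.diag([1.0, 0.0])                      # rank-deficient

U1, V1 = np.eye(2), np.eye(2)                # M = U1 diag(1,0) V1^H
U2, V2 = np.eye(2), np.diag([1.0, -1.0])     # also M = U2 diag(1,0) V2^H

print(np.allclose(U1 @ np.diag([1.0, 0.0]) @ V1.T, M))  # True
print(np.allclose(U2 @ np.diag([1.0, 0.0]) @ V2.T, M))  # True
print(np.linalg.norm(U1 @ V1.T - U2 @ V2.T))            # 2.0, not 0
```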
Edit: $U^{(r)} V^{(r)\mathsf{H}} = \arg \min_{X^\mathsf{H} X = I} \lVert X - M^{(r)} \rVert_F$, i.e. $U^{(r)} V^{(r)\mathsf{H}}$ is the orthogonal projection of $M^{(r)}$ onto the Stiefel manifold (here, the unitary group). I am still trying to figure out how this might help...
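For reference, the usual Procrustes-type argument behind that characterization (assuming $M^{(r)}$ is $n \times n$ with SVD $M^{(r)} = U^{(r)} \Sigma^{(r)} V^{(r)\mathsf{H}}$): for any $X$ with $X^\mathsf{H} X = I$,
$$ \lVert X - M^{(r)} \rVert_F^2 = n - 2\,\Re\operatorname{tr}\!\big(X^\mathsf{H} M^{(r)}\big) + \lVert M^{(r)} \rVert_F^2, \qquad \Re\operatorname{tr}\!\big(X^\mathsf{H} M^{(r)}\big) = \Re\operatorname{tr}\!\big(W \Sigma^{(r)}\big) \le \operatorname{tr} \Sigma^{(r)}, $$
where $W = V^{(r)\mathsf{H}} X^\mathsf{H} U^{(r)}$ is unitary (so $|w_{ii}| \le 1$), and the bound is attained at $W = I$, i.e. $X = U^{(r)} V^{(r)\mathsf{H}}$. The minimizer is unique only when $M^{(r)}$ is nonsingular, which is exactly the rank-deficiency issue above.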