Low rank tensor completion is a highly ill-posed inverse problem, particularly when the data model is not accurate, and some sort of regularization is required in order to solve it. In this article we focus on the calibration of the data model. For alternating optimization, we observe that existing rank adaption methods do not enable a continuous transition between manifolds of different ranks. We denote this flaw as \textit{instability (under truncation)}. As a consequence of this flaw, arbitrarily small changes in the singular values of an iterate can have arbitrarily large influence on the further reconstruction. We therefore introduce a singular value based regularization to the standard alternating least squares (ALS), which is motivated by averaging in micro-steps. We prove its \textit{stability} and derive a natural semi-implicit rank adaption strategy. We further prove that the standard ALS micro-steps are only stable on manifolds of fixed ranks, and only around points that have what we define as the \textit{internal tensor restricted isometry property (iTRIP)}. Finally, we provide numerical examples that show improvements of the reconstruction quality up to orders of magnitude in the new Stable ALS Approximation (SALSA) compared to standard ALS.