Low Tensor-Ring Rank Completion by Parallel Matrix Factorization.
- Abstract:
Tensor-ring (TR) decomposition has recently attracted considerable attention in solving the low-rank tensor completion (LRTC) problem. However, due to an unbalanced unfolding scheme used during the update of core tensors, conventional TR-based completion methods usually require a large TR rank to achieve optimal performance, which leads to high computational cost in practical applications. To overcome this drawback, in this article we propose a new method to exploit the low TR-rank structure. Specifically, we first introduce a balanced unfolding operation called tensor circular unfolding, by which the relationship between the TR rank and the ranks of the tensor unfoldings is theoretically established. Using this new unfolding operation, we further propose an algorithm that exploits the low TR-rank structure by performing parallel low-rank matrix factorizations on all circularly unfolded matrices. To tackle the problem of nonuniform missing patterns, we apply a row weighting trick to each circularly unfolded matrix, which significantly improves adaptability to various types of missing patterns. Extensive experiments demonstrate that the proposed algorithm achieves outstanding performance using a much smaller TR rank than conventional TR-based completion algorithms, while the computational cost is reduced substantially. [ABSTRACT FROM AUTHOR]
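To make the abstract's two main ingredients more concrete, the sketch below illustrates a tensor circular unfolding and a masked alternating-least-squares factorization of a single unfolded matrix. It is a minimal sketch written against the abstract only, not the authors' reference implementation: the function names `circular_unfold` and `masked_factorization`, the 0-based cyclic mode indexing, the regularization parameter `reg`, and the ALS update scheme are all assumptions, and the row weighting trick and the folding/averaging of the parallel factorizations back into a completed tensor are omitted.

```python
import numpy as np

def circular_unfold(tensor, k, d):
    """Illustrative tensor circular unfolding X_<k,d> (assumed definition).

    Merges d cyclically consecutive modes ending at mode k into the rows
    and the remaining modes into the columns of a matrix.
    """
    N = tensor.ndim
    row_modes = [(k - d + 1 + i) % N for i in range(d)]
    col_modes = [(k + 1 + j) % N for j in range(N - d)]
    permuted = np.transpose(tensor, row_modes + col_modes)
    n_rows = int(np.prod([tensor.shape[m] for m in row_modes]))
    return permuted.reshape(n_rows, -1)

def masked_factorization(M, mask, rank, n_iters=50, reg=1e-3):
    """Alternating least squares for one unfolded matrix (illustrative):
    minimize ||mask * (M - U @ V)||_F^2 over U (m x rank) and V (rank x n).
    """
    m, n = M.shape
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((rank, n))
    I = reg * np.eye(rank)
    for _ in range(n_iters):
        for i in range(m):                      # update each row of U
            cols = mask[i] > 0
            Vi = V[:, cols]
            U[i] = np.linalg.solve(Vi @ Vi.T + I, Vi @ M[i, cols])
        for j in range(n):                      # update each column of V
            rows = mask[:, j] > 0
            Uj = U[rows]
            V[:, j] = np.linalg.solve(Uj.T @ Uj + I, Uj.T @ M[rows, j])
    return U, V
```

In the full method, each circular unfolding would be factorized in this way in parallel and the results combined into a completed tensor; the sketch shows only the per-unfolding subproblem.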
Copyright of IEEE Transactions on Neural Networks & Learning Systems is the property of IEEE and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)