### Abstract

We study the problem of low-rank tensor factorization in the presence of missing data. We ask the following question: how many sampled entries do we need to efficiently and exactly reconstruct a tensor with a low-rank orthogonal decomposition? We propose a novel alternating-minimization-based method which iteratively refines estimates of the singular vectors. We show that under certain standard assumptions, our method can recover a three-mode n × n × n dimensional rank-r tensor exactly from O(n^{3/2}r^{5} log^{4} n) randomly sampled entries. In the process of proving this result, we solve two challenging sub-problems for tensors with missing data. First, in analyzing the initialization step, we prove a generalization of a celebrated result by Szemerédi et al. on the spectrum of random graphs. We show that this initialization step alone is sufficient to achieve a root mean squared error on the parameters bounded by C r^{2}n^{3/2}(log n)^{4}/|Ω| from |Ω| observed entries, for some constant C independent of n and r. Next, we prove global convergence of alternating minimization with this good initialization. Simulations suggest that the dependence of the sample size on the dimensionality n is indeed tight.
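The alternating-minimization idea described above can be illustrated with a minimal NumPy sketch. This is an illustrative toy, not the paper's algorithm: it uses a random initialization in place of the paper's spectral initialization, and the function name and interface are invented for this example. Holding two factor matrices fixed, each mode's factor is updated by solving small least-squares problems over the observed entries only.

```python
import numpy as np

def als_tensor_completion(T_obs, mask, r, n_iters=50, seed=0):
    """Illustrative alternating least squares for rank-r completion of a
    3-mode tensor from the entries where `mask` is True. Each mode's
    factor matrix is refit by least squares while the other two are
    held fixed (the paper instead starts from a spectral initialization)."""
    rng = np.random.default_rng(seed)
    n1, n2, n3 = T_obs.shape
    U = rng.standard_normal((n1, r))
    V = rng.standard_normal((n2, r))
    W = rng.standard_normal((n3, r))
    idx = np.argwhere(mask)      # observed index triples (i, j, k)
    vals = T_obs[mask]           # observed values, same (C) order as idx
    for _ in range(n_iters):
        for mode in range(3):
            # Row of the design matrix for observation (i, j, k) is the
            # elementwise product of the two fixed factors' rows.
            if mode == 0:
                A, rows, m = V[idx[:, 1]] * W[idx[:, 2]], idx[:, 0], n1
            elif mode == 1:
                A, rows, m = U[idx[:, 0]] * W[idx[:, 2]], idx[:, 1], n2
            else:
                A, rows, m = U[idx[:, 0]] * V[idx[:, 1]], idx[:, 2], n3
            F = np.zeros((m, r))
            for i in range(m):   # one small least-squares problem per slice
                sel = rows == i
                if sel.any():
                    F[i], *_ = np.linalg.lstsq(A[sel], vals[sel], rcond=None)
            if mode == 0:
                U = F
            elif mode == 1:
                V = F
            else:
                W = F
    # Reconstruct T_hat[i, j, k] = sum_s U[i, s] * V[j, s] * W[k, s].
    return np.einsum('is,js,ks->ijk', U, V, W)
```

On a small noiseless rank-1 tensor with most entries observed, this sketch typically recovers the tensor to high accuracy within a few dozen sweeps, consistent with the convergence behavior the paper proves for its (spectrally initialized) variant.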

| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1431-1439 |
| Number of pages | 9 |
| Journal | Advances in Neural Information Processing Systems |
| Volume | 2 |
| Issue number | January |
| State | Published - Jan 1 2014 |
| Event | 28th Annual Conference on Neural Information Processing Systems 2014, NIPS 2014, Montreal, Canada. Duration: Dec 8 2014 → Dec 13 2014 |


### ASJC Scopus subject areas

- Computer Networks and Communications
- Information Systems
- Signal Processing

### Cite this

Jain, P., & Oh, S. (2014). Provable tensor factorization with missing data. *Advances in Neural Information Processing Systems*, *2*(January), 1431-1439.

Research output: Contribution to journal › Conference article

TY - JOUR

T1 - Provable tensor factorization with missing data

AU - Jain, Prateek

AU - Oh, Sewoong

PY - 2014/1/1

Y1 - 2014/1/1

N2 - We study the problem of low-rank tensor factorization in the presence of missing data. We ask the following question: how many sampled entries do we need to efficiently and exactly reconstruct a tensor with a low-rank orthogonal decomposition? We propose a novel alternating-minimization-based method which iteratively refines estimates of the singular vectors. We show that under certain standard assumptions, our method can recover a three-mode n × n × n dimensional rank-r tensor exactly from O(n^{3/2}r^{5} log^{4} n) randomly sampled entries. In the process of proving this result, we solve two challenging sub-problems for tensors with missing data. First, in analyzing the initialization step, we prove a generalization of a celebrated result by Szemerédi et al. on the spectrum of random graphs. We show that this initialization step alone is sufficient to achieve a root mean squared error on the parameters bounded by C r^{2}n^{3/2}(log n)^{4}/|Ω| from |Ω| observed entries, for some constant C independent of n and r. Next, we prove global convergence of alternating minimization with this good initialization. Simulations suggest that the dependence of the sample size on the dimensionality n is indeed tight.

AB - We study the problem of low-rank tensor factorization in the presence of missing data. We ask the following question: how many sampled entries do we need to efficiently and exactly reconstruct a tensor with a low-rank orthogonal decomposition? We propose a novel alternating-minimization-based method which iteratively refines estimates of the singular vectors. We show that under certain standard assumptions, our method can recover a three-mode n × n × n dimensional rank-r tensor exactly from O(n^{3/2}r^{5} log^{4} n) randomly sampled entries. In the process of proving this result, we solve two challenging sub-problems for tensors with missing data. First, in analyzing the initialization step, we prove a generalization of a celebrated result by Szemerédi et al. on the spectrum of random graphs. We show that this initialization step alone is sufficient to achieve a root mean squared error on the parameters bounded by C r^{2}n^{3/2}(log n)^{4}/|Ω| from |Ω| observed entries, for some constant C independent of n and r. Next, we prove global convergence of alternating minimization with this good initialization. Simulations suggest that the dependence of the sample size on the dimensionality n is indeed tight.

UR - http://www.scopus.com/inward/record.url?scp=84937903661&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84937903661&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:84937903661

VL - 2

SP - 1431

EP - 1439

JO - Advances in Neural Information Processing Systems

JF - Advances in Neural Information Processing Systems

SN - 1049-5258

IS - January

ER -