TY - GEN
T1 - Efficient model selection for speech enhancement using a deflation method for Nonnegative Matrix Factorization
AU - Kim, Minje
AU - Smaragdis, Paris
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2014/2/5
Y1 - 2014/2/5
N2 - We present a deflation method for Nonnegative Matrix Factorization (NMF) that aims to discover latent components one by one in order of importance. To do so we perform a series of individual decompositions, each of which stands for a deflation step. In each deflation we obtain a dominant component and a nonnegative residual, and the residual is then used as the input to the next deflation if more components are to be extracted. With the help of the proposed additional inequality constraint on the residual during the optimization, the accumulated latent components at any given deflation step can approximate the input to some degree, whereas NMF with an inaccurate rank assumption often fails to do so. The proposed method is beneficial if we need efficiency in deciding the model complexity from unknown data. We derive multiplicative update rules similar to those of regular NMF to perform the optimization. Experiments on online speech enhancement show that the proposed deflation method has advantages over NMF: namely a scalable model structure, reusable parameters across decompositions, and resistance to permutation ambiguity.
KW - Blind source separation
KW - Speech enhancement
UR - http://www.scopus.com/inward/record.url?scp=84983119822&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84983119822&partnerID=8YFLogxK
U2 - 10.1109/GlobalSIP.2014.7032175
DO - 10.1109/GlobalSIP.2014.7032175
M3 - Conference contribution
AN - SCOPUS:84983119822
T3 - 2014 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2014
SP - 537
EP - 541
BT - 2014 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2014
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2014 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2014
Y2 - 3 December 2014 through 5 December 2014
ER -