TY - GEN
T1 - Efficient distributed learning with sparsity
AU - Wang, Jialei
AU - Kolar, Mladen
AU - Srebro, Nathan
AU - Zhang, Tong
N1 - Publisher Copyright:
Copyright © 2017 by the authors.
PY - 2017
Y1 - 2017
N2 - We propose a novel, efficient approach for distributed sparse learning with observations randomly partitioned across machines. In each round of the proposed method, worker machines compute the gradient of the loss on local data and the master machine solves a shifted regularized loss minimization problem. After a number of communication rounds that scales only logarithmically with the number of machines, and independent of other parameters of the problem, the proposed approach provably matches the estimation error bound of centralized methods.
AB - We propose a novel, efficient approach for distributed sparse learning with observations randomly partitioned across machines. In each round of the proposed method, worker machines compute the gradient of the loss on local data and the master machine solves a shifted regularized loss minimization problem. After a number of communication rounds that scales only logarithmically with the number of machines, and independent of other parameters of the problem, the proposed approach provably matches the estimation error bound of centralized methods.
UR - http://www.scopus.com/inward/record.url?scp=85048460526&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85048460526&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85048460526
T3 - 34th International Conference on Machine Learning, ICML 2017
SP - 5544
EP - 5563
BT - 34th International Conference on Machine Learning, ICML 2017
PB - International Machine Learning Society (IMLS)
T2 - 34th International Conference on Machine Learning, ICML 2017
Y2 - 6 August 2017 through 11 August 2017
ER -