TY - GEN
T1 - Learning with structured sparsity
AU - Huang, Junzhou
AU - Zhang, Tong
AU - Metaxas, Dimitris
PY - 2009
Y1 - 2009
AB - This paper investigates a new learning formulation called structured sparsity, which is a natural extension of the standard sparsity concept in statistical learning and compressive sensing. By allowing arbitrary structures on the feature set, this concept generalizes the group sparsity idea. A general theory is developed for learning with structured sparsity, based on the notion of coding complexity associated with the structure. Moreover, a structured greedy algorithm is proposed to efficiently solve the structured sparsity problem. Experiments demonstrate the advantage of structured sparsity over standard sparsity.
UR - http://www.scopus.com/inward/record.url?scp=70049112529&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=70049112529&partnerID=8YFLogxK
DO - 10.1145/1553374.1553429
M3 - Conference contribution
AN - SCOPUS:70049112529
SN - 9781605585161
T3 - ACM International Conference Proceeding Series
BT - Proceedings of the 26th Annual International Conference on Machine Learning, ICML'09
T2 - 26th Annual International Conference on Machine Learning, ICML'09
Y2 - 14 June 2009 through 18 June 2009
ER -