TY - JOUR
T1 - Structured learning with constrained conditional models
AU - Chang, Ming-Wei
AU - Ratinov, Lev
AU - Roth, Dan
N1 - Funding Information:
Acknowledgements This work was supported by NSF grant NSF SoD-HCER-0613885, DARPA funding under the Bootstrap Learning Program and by MIAS, a DHS-IDS Center for Multimodal Information Access and Synthesis at UIUC.
PY - 2012/9
Y1 - 2012/9
N2 - Making complex decisions in real-world problems often involves assigning values to sets of interdependent variables, where an expressive dependency structure among them can influence, or even dictate, what assignments are possible. Commonly used models typically ignore expressive dependencies, since the traditional way of incorporating non-local dependencies is inefficient and hence leads to expensive training and inference. The contribution of this paper is two-fold. First, this paper presents Constrained Conditional Models (CCMs), a framework that augments linear models with declarative constraints as a way to support decisions in an expressive output space while maintaining modularity and tractability of training. The paper develops, analyzes and compares novel algorithms for CCMs based on Hidden Markov Models and Structured Perceptron. The proposed CCM framework is also compared to task-tailored models, such as semi-CRFs. Second, we propose CoDL, a constraint-driven learning algorithm, which makes use of constraints to guide semi-supervised learning. We provide theoretical justification for CoDL along with empirical results which show the advantage of using declarative constraints in the context of semi-supervised training of probabilistic models.
AB - Making complex decisions in real-world problems often involves assigning values to sets of interdependent variables, where an expressive dependency structure among them can influence, or even dictate, what assignments are possible. Commonly used models typically ignore expressive dependencies, since the traditional way of incorporating non-local dependencies is inefficient and hence leads to expensive training and inference. The contribution of this paper is two-fold. First, this paper presents Constrained Conditional Models (CCMs), a framework that augments linear models with declarative constraints as a way to support decisions in an expressive output space while maintaining modularity and tractability of training. The paper develops, analyzes and compares novel algorithms for CCMs based on Hidden Markov Models and Structured Perceptron. The proposed CCM framework is also compared to task-tailored models, such as semi-CRFs. Second, we propose CoDL, a constraint-driven learning algorithm, which makes use of constraints to guide semi-supervised learning. We provide theoretical justification for CoDL along with empirical results which show the advantage of using declarative constraints in the context of semi-supervised training of probabilistic models.
KW - Information extraction
KW - Natural language processing
KW - Semi-supervised learning
UR - http://www.scopus.com/inward/record.url?scp=84865223146&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84865223146&partnerID=8YFLogxK
U2 - 10.1007/s10994-012-5296-5
DO - 10.1007/s10994-012-5296-5
M3 - Article
AN - SCOPUS:84865223146
SN - 0885-6125
VL - 88
SP - 399
EP - 431
JO - Machine Learning
JF - Machine Learning
IS - 3
ER -