Sparse linear isotonic models

Sheng Chen, Arindam Banerjee

Research output: Contribution to conference › Paper › peer-review

Abstract

In machine learning and data mining, linear models have been widely used to model the response as a parametric linear function of the predictors. To relax the stringent assumptions made by parametric linear models, additive models treat the response as a sum of unknown transformations applied to the predictors; in particular, additive isotonic models (AIMs) assume these unknown transformations to be monotone. In this paper, we introduce sparse linear isotonic models (SLIMs) for high-dimensional problems by hybridizing ideas from parametric sparse linear models and AIMs, which enjoy several appealing advantages over both. In the high-dimensional setting, a two-step algorithm is proposed for estimating the sparse parameters as well as the monotone functions over the predictors. Under mild statistical assumptions, we show that the algorithm accurately estimates the parameters. Promising preliminary experiments are presented to support the theoretical results.
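
To make the model class and the two-step idea concrete, below is a minimal illustrative sketch rather than the paper's actual estimator: it fits a monotone transformation of each predictor with scikit-learn's IsotonicRegression and then runs a Lasso over the transformed predictors to obtain sparse coefficients. The function name fit_slim_sketch, the per-predictor isotonic step, and the choice of the Lasso penalty are assumptions made purely for illustration.

    # Illustrative two-step fit: (1) a monotone transform per predictor via
    # isotonic regression, (2) a sparse linear fit (Lasso) over the transformed
    # predictors. This is an intuition-building sketch, not the authors' method.
    import numpy as np
    from sklearn.isotonic import IsotonicRegression
    from sklearn.linear_model import Lasso

    def fit_slim_sketch(X, y, alpha=0.1):
        n, p = X.shape
        Z = np.empty_like(X, dtype=float)
        transforms = []
        for j in range(p):
            # Step 1: estimate a monotone transformation of predictor j.
            iso = IsotonicRegression(increasing="auto", out_of_bounds="clip")
            Z[:, j] = iso.fit_transform(X[:, j], y)
            transforms.append(iso)
        # Step 2: sparse linear fit over the monotonically transformed predictors.
        lasso = Lasso(alpha=alpha).fit(Z, y)
        return transforms, lasso

    # Synthetic example: only the first two predictors matter, both acting
    # through monotone transformations.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    y = 2.0 * np.tanh(X[:, 0]) + X[:, 1] ** 3 + 0.1 * rng.normal(size=200)
    transforms, lasso = fit_slim_sketch(X, y)
    print(lasso.coef_)  # coefficients for the remaining predictors should be near zero

In the high-dimensional setting described in the abstract, the sparsity-inducing second step is what singles out the few relevant predictors while the monotone transformations absorb the nonlinearity.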

Original language: English (US)
Pages: 1270-1279
Number of pages: 10
State: Published - 2018
Externally published: Yes
Event: 21st International Conference on Artificial Intelligence and Statistics, AISTATS 2018 - Playa Blanca, Lanzarote, Canary Islands, Spain
Duration: Apr 9, 2018 - Apr 11, 2018

Conference

Conference: 21st International Conference on Artificial Intelligence and Statistics, AISTATS 2018
Country/Territory: Spain
City: Playa Blanca, Lanzarote, Canary Islands
Period: 4/9/18 - 4/11/18

ASJC Scopus subject areas

  • Statistics and Probability
  • Artificial Intelligence
