Abstract
Functional data are ubiquitous in many domains, such as healthcare, social media, manufacturing processes, and sensor networks. The goal of function-on-function regression is to build a mapping from functional predictors to functional responses. In this article, we propose a novel function-on-function regression model based on mode-sparsity regularization. The main idea is to represent the regression coefficient function between predictor and response as a double expansion of basis functions, and then use a mode-sparsity regularization to automatically filter out irrelevant basis functions for both predictors and responses. The proposed approach is further extended to a tensor version to accommodate multiple functional predictors. While allowing the dimensionality of the regression weight matrix or tensor to be relatively large, the mode-sparsity regularized model facilitates multi-way shrinkage of the basis functions along each mode. The proposed mode-sparsity regularization covers a wide spectrum of sparse models for function-on-function regression. The resulting optimization problem is challenging due to the non-smoothness of the mode-sparsity regularization. We develop an efficient algorithm that solves the problem in an iterative update fashion and converges to the global optimum. Furthermore, we analyze the generalization performance of the proposed method and derive an upper bound on the consistency between the recovered function and the underlying true function. The effectiveness of the proposed approach is verified on benchmark functional datasets in various domains.
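To make the setup concrete, the following is a minimal sketch, not the authors' implementation: after expanding predictors and responses in basis functions, the coefficient matrix is estimated by proximal gradient descent with group-style shrinkage applied to each mode (rows for the predictor basis, columns for the response basis). The data shapes, the sequential row/column shrinkage, and the names `prox_group_rows` and `fof_regression` are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def prox_group_rows(B, tau):
    """Row-wise group soft-thresholding: shrinks whole rows (one mode) toward zero."""
    norms = np.linalg.norm(B, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12))
    return B * scale

def fof_regression(Xc, Yc, lam=0.1, lr=1e-3, n_iter=500):
    """
    Proximal-gradient sketch for basis-expanded function-on-function regression.
    Xc: (n, p) basis-expansion coefficients of the functional predictors.
    Yc: (n, q) basis-expansion coefficients of the functional responses.
    Model: Yc ~ Xc @ B, with group shrinkage on the rows and columns of B so
    that irrelevant predictor/response basis functions are filtered out.
    """
    n, p = Xc.shape
    q = Yc.shape[1]
    B = np.zeros((p, q))
    for _ in range(n_iter):
        grad = Xc.T @ (Xc @ B - Yc) / n       # gradient of the squared loss
        B = B - lr * grad
        B = prox_group_rows(B, lr * lam)      # shrink predictor-mode groups
        B = prox_group_rows(B.T, lr * lam).T  # shrink response-mode groups
    return B

# Toy usage: 50 curves, 8 predictor basis functions, 6 response basis functions.
rng = np.random.default_rng(0)
Xc = rng.normal(size=(50, 8))
B_true = np.zeros((8, 6))
B_true[:3, :2] = rng.normal(size=(3, 2))   # only a few basis functions matter
Yc = Xc @ B_true + 0.01 * rng.normal(size=(50, 6))
B_hat = fof_regression(Xc, Yc, lam=0.05)
```

Applying the row and column shrinkage sequentially is only an approximation of a joint mode-sparsity proximal step; the paper's algorithm and its convergence guarantee are specified in the article itself.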
Original language | English (US)
---|---
Article number | 36
Journal | ACM Transactions on Knowledge Discovery from Data
Volume | 12
Issue number | 3
DOIs |
State | Published - Mar 2018
Externally published | Yes
Keywords
- Function-on-function regression
- Mode-sparsity regularization
ASJC Scopus subject areas
- General Computer Science