TY - GEN
T1 - Transferred correlation learning
T2 - 2010 6th IEEE World Congress on Computational Intelligence, WCCI 2010 - 2010 International Joint Conference on Neural Networks, IJCNN 2010
AU - Jiang, Lei
AU - Zhang, Jian
AU - Allen, Gabrielle
PY - 2010
Y1 - 2010
N2 - Transfer learning is a new learning paradigm in which, besides the training data for the target learning task, data related to the task (often drawn from a different distribution) are also employed to help train a better learner. For example, outdated data can serve as such related data. In this paper, we propose a new transfer learning framework for training neural network (NN) ensembles. The framework has two key features: 1) it uses the well-known negative correlation learning to train an ensemble of diverse neural networks from the related data, fully exploiting the knowledge in those data; and 2) a penalized incremental learning scheme is used to adapt the neural networks obtained from negative correlation learning to the training data for the target learning task. The adaptation is guided by reference neural networks that measure the relatedness between the training data and the related data. Experiments on benchmark data sets show that our framework can achieve classification accuracy competitive with existing ensemble transfer learning methods such as TrAdaBoost [1] and TrBagg [2]. We discuss some characteristics of our framework observed in the experiments and the scenarios under which the framework may have superior performance.
AB - Transfer learning is a new learning paradigm in which, besides the training data for the target learning task, data related to the task (often drawn from a different distribution) are also employed to help train a better learner. For example, outdated data can serve as such related data. In this paper, we propose a new transfer learning framework for training neural network (NN) ensembles. The framework has two key features: 1) it uses the well-known negative correlation learning to train an ensemble of diverse neural networks from the related data, fully exploiting the knowledge in those data; and 2) a penalized incremental learning scheme is used to adapt the neural networks obtained from negative correlation learning to the training data for the target learning task. The adaptation is guided by reference neural networks that measure the relatedness between the training data and the related data. Experiments on benchmark data sets show that our framework can achieve classification accuracy competitive with existing ensemble transfer learning methods such as TrAdaBoost [1] and TrBagg [2]. We discuss some characteristics of our framework observed in the experiments and the scenarios under which the framework may have superior performance.
UR - http://www.scopus.com/inward/record.url?scp=79959474660&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=79959474660&partnerID=8YFLogxK
U2 - 10.1109/IJCNN.2010.5596617
DO - 10.1109/IJCNN.2010.5596617
M3 - Conference contribution
AN - SCOPUS:79959474660
SN - 9781424469178
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2010 IEEE World Congress on Computational Intelligence, WCCI 2010 - 2010 International Joint Conference on Neural Networks, IJCNN 2010
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 18 July 2010 through 23 July 2010
ER -