TY - GEN
T1 - Isometric projection
AU - Cai, Deng
AU - He, Xiaofei
AU - Han, Jiawei
PY - 2007
Y1 - 2007
AB - Recently, the problem of dimensionality reduction has received considerable interest in many fields of information processing. We consider the case where the data are sampled from a low-dimensional manifold embedded in a high-dimensional Euclidean space. The most popular manifold learning algorithms include Locally Linear Embedding, ISOMAP, and Laplacian Eigenmap. However, these algorithms are nonlinear and provide embeddings only for the training samples. In this paper, we propose a novel linear dimensionality reduction algorithm, called Isometric Projection. Isometric Projection constructs a weighted data graph where the weights are discrete approximations of the geodesic distances on the data manifold. A linear subspace is then obtained by preserving these pairwise distances. In this way, Isometric Projection is defined everywhere. Compared to Principal Component Analysis (PCA), which is widely used in data processing, our algorithm is more capable of discovering the intrinsic geometrical structure of the data. Specifically, PCA is optimal only when the data space is linear, while our algorithm makes no such assumption and can therefore handle more complex data spaces. Experimental results on two real-life data sets illustrate the effectiveness of the proposed method.
UR - http://www.scopus.com/inward/record.url?scp=36348971721&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=36348971721&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:36348971721
SN - 1577353234
SN - 9781577353232
T3 - Proceedings of the National Conference on Artificial Intelligence
SP - 528
EP - 533
BT - AAAI-07/IAAI-07 Proceedings
T2 - AAAI-07/IAAI-07 Proceedings: 22nd AAAI Conference on Artificial Intelligence and the 19th Innovative Applications of Artificial Intelligence Conference
Y2 - 22 July 2007 through 26 July 2007
ER -