Laplacian embedded regression for scalable manifold regularization
- PMID: 24806762
- DOI: 10.1109/TNNLS.2012.2190420
Abstract
Semi-supervised learning (SSL), as a powerful tool for learning from a limited number of labeled data and a large amount of unlabeled data, has been attracting increasing attention in the machine learning community. In particular, the manifold regularization framework has laid solid theoretical foundations for a large family of SSL algorithms, such as the Laplacian support vector machine (LapSVM) and Laplacian regularized least squares (LapRLS). However, most of these algorithms are limited to small-scale problems due to the high computational cost of the matrix inversion operation involved in the optimization problem. In this paper, we propose a novel framework called Laplacian embedded regression by introducing an intermediate decision variable into the manifold regularization framework. By using the ε-insensitive loss, we obtain the Laplacian embedded support vector regression (LapESVR) algorithm, which inherits the sparse solution from SVR. We also derive Laplacian embedded RLS (LapERLS), corresponding to RLS under the proposed framework. Both LapESVR and LapERLS possess a simpler form of a transformed kernel, which is the summation of the original kernel and a graph kernel that captures the manifold structure. The benefits of the transformed kernel are two-fold: (1) we can deal with the original kernel matrix and the graph Laplacian matrix in the graph kernel separately, and (2) if the graph Laplacian matrix is sparse, we only need to perform the inverse operation for a sparse matrix, which is much more efficient than for a dense one. Inspired by kernel principal component analysis, we further propose to project the introduced decision variable onto a subspace spanned by a few eigenvectors of the graph Laplacian matrix in order to better reflect the data manifold and to accelerate the calculation of the graph kernel, allowing our methods to cope with large-scale SSL problems efficiently and effectively. Extensive experiments on both toy and real-world data sets show the effectiveness and scalability of the proposed framework.
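For illustration, the sketch below shows the transformed-kernel idea described in the abstract: the original kernel matrix and a graph kernel built from a sparse k-NN graph Laplacian are handled separately and then summed, with the graph kernel approximated through a few eigenvectors of the Laplacian. The specific choices here (an RBF base kernel, a normalized Laplacian, a graph kernel of the form graph_gamma * (I + mu * L)^(-1), and the helper name `transformed_kernel`) are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch (not the authors' reference implementation) of a transformed
# kernel K_tilde = K + G, where K is the original kernel on all labeled and
# unlabeled points and G is a graph kernel derived from a sparse k-NN graph
# Laplacian. The form G = graph_gamma * (I + mu * L)^{-1}, approximated via a
# few Laplacian eigenvectors, is an assumption made for illustration only.
from scipy.sparse.csgraph import laplacian
from scipy.sparse.linalg import eigsh
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.neighbors import kneighbors_graph


def transformed_kernel(X, n_neighbors=10, rbf_gamma=0.5, mu=1.0,
                       graph_gamma=1.0, n_eig=50):
    """Return K_tilde = K + G for the points in X (labeled + unlabeled)."""
    # Original kernel matrix, handled independently of the graph Laplacian.
    K = rbf_kernel(X, gamma=rbf_gamma)

    # Sparse symmetric k-NN adjacency and its normalized graph Laplacian.
    W = kneighbors_graph(X, n_neighbors=n_neighbors, mode='connectivity')
    W = 0.5 * (W + W.T)
    L = laplacian(W, normed=True)

    # Keep only a few of the smallest Laplacian eigenpairs, mirroring the idea
    # of projecting onto a subspace spanned by a few eigenvectors of L.
    # (which='SM' is the simple route; shift-invert is faster on large graphs.)
    vals, vecs = eigsh(L, k=n_eig, which='SM')

    # Low-rank stand-in for the graph kernel graph_gamma * (I + mu * L)^{-1}:
    # G ~ graph_gamma * U diag(1 / (1 + mu * lambda_i)) U^T.
    G = graph_gamma * (vecs * (1.0 / (1.0 + mu * vals))) @ vecs.T

    return K + G
```

The resulting K_tilde could then be handed to an off-the-shelf SVR or RLS solver restricted to the labeled rows and columns; because L is sparse and only a few eigenpairs are used, the graph-kernel part stays cheap even when the number of unlabeled points is large.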
Similar articles
- p-Laplacian Regularization for Scene Recognition. IEEE Trans Cybern. 2019 Aug;49(8):2927-2940. doi: 10.1109/TCYB.2018.2833843. Epub 2018 May 22. PMID: 29994326
- A distributed semi-supervised learning algorithm based on manifold regularization using wavelet neural network. Neural Netw. 2019 Oct;118:300-309. doi: 10.1016/j.neunet.2018.10.014. Epub 2018 Nov 14. PMID: 31330270
- Successive overrelaxation for Laplacian support vector machine. IEEE Trans Neural Netw Learn Syst. 2015 Apr;26(4):674-683. doi: 10.1109/TNNLS.2014.2320738. PMID: 25961091
- Enhanced manifold regularization for semi-supervised classification. J Opt Soc Am A Opt Image Sci Vis. 2016 Jun 1;33(6):1207-13. doi: 10.1364/JOSAA.33.001207. PMID: 27409451
- DDA-SKF: Predicting Drug-Disease Associations Using Similarity Kernel Fusion. Front Pharmacol. 2022 Jan 13;12:784171. doi: 10.3389/fphar.2021.784171. eCollection 2021. PMID: 35095495. Free PMC article. Review.
Cited by
- Scaling up graph-based semisupervised learning via prototype vector machines. IEEE Trans Neural Netw Learn Syst. 2015 Mar;26(3):444-57. doi: 10.1109/TNNLS.2014.2315526. PMID: 25720002. Free PMC article.
- Time-Series Laplacian Semi-Supervised Learning for Indoor Localization. Sensors (Basel). 2019 Sep 7;19(18):3867. doi: 10.3390/s19183867. PMID: 31500312. Free PMC article.
- _target localization in wireless sensor networks using online semi-supervised support vector regression. Sensors (Basel). 2015 May 27;15(6):12539-59. doi: 10.3390/s150612539. PMID: 26024420. Free PMC article.