Semi-Supervised Dimensionality Reduction by Linear Compression and Stretching

Faculty/Professorship: Smart Environments  
Author(s): Long, Zhiguo; Meng, Hua; Sioutis, Michael
Title of the Journal: IEEE Access: Practical Research, Open Solutions
ISSN: 2169-3536
Publisher Information: New York, NY : IEEE
Year of publication: 2020
Volume: 8
Pages: 27308-27317
Language(s): English
DOI: 10.1109/ACCESS.2020.2971562
Dimensionality reduction is a fundamental and important research topic in the field of machine learning. This paper focuses on a dimensionality reduction technique that exploits semi-supervised information in the form of pairwise constraints; specifically, these constraints specify whether two instances belong to the same class or not. We propose two dual linear methods to accomplish dimensionality reduction under this setting. These two methods overcome the difficulty of simultaneously maximizing between-class difference and minimizing within-class difference by transforming the original data into a new space in such a way that the bi-objective problem is (almost) equivalently reduced to a single-objective problem. Empirical evaluations on a broad range of public datasets show that the two proposed methods are superior to several existing methods for semi-supervised dimensionality reduction.
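To make the pairwise-constraint setting concrete, the sketch below shows a generic linear semi-supervised projection built from must-link (same-class) and cannot-link (different-class) pairs. It is an illustrative baseline in the spirit of constraint-based scatter methods, not the compression/stretching method proposed in the paper; the function name, the trade-off parameter `alpha`, and the eigendecomposition-based solver are all assumptions for illustration.

```python
import numpy as np

def pairwise_constraint_projection(X, must_link, cannot_link, d=1, alpha=1.0):
    """Find d linear directions that spread cannot-link pairs apart
    while pulling must-link pairs together.

    Generic illustrative sketch (NOT the paper's method): it maximizes
    the difference of the two pair-scatter matrices via an eigenproblem,
    rather than handling the two objectives separately.
    """
    n, p = X.shape
    S = np.zeros((p, p))
    # Reward spread between instances known to be in different classes.
    for i, j in cannot_link:
        diff = (X[i] - X[j])[:, None]
        S += diff @ diff.T
    # Penalize spread between instances known to be in the same class;
    # alpha (an assumed hyperparameter) balances the two objectives.
    for i, j in must_link:
        diff = (X[i] - X[j])[:, None]
        S -= alpha * (diff @ diff.T)
    # The top-d eigenvectors of the symmetric scatter difference give
    # the projection directions; eigh returns an orthonormal basis.
    vals, vecs = np.linalg.eigh(S)
    W = vecs[:, np.argsort(vals)[::-1][:d]]
    return X @ W, W
```

A single objective like this avoids tuning two competing terms directly, which is the kind of bi-objective difficulty the abstract refers to; the paper's contribution is a different, (almost) equivalent reduction to one objective via a transformation of the data.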
GND Keywords: Linear optimization; Task analysis; Cluster analysis; Eigenfunction; Eigenvalue distribution; Principal component analysis; Dimensionality reduction
Keywords: dimensionality reduction, principal component analysis, eigenvalues, eigenfunctions, task analysis, clustering algorithms, linear programming
DDC Classification: 004 Computer science  
RVK Classification: ST 300   
Type: Article
Release Date: 2 March 2022