Semi-Supervised Dimensionality Reduction by Linear Compression and Stretching




Faculty/Professorship: Smart Environments  
Author(s): Long, Zhiguo; Meng, Hua; Sioutis, Michael
Publisher Information: Bamberg : Otto-Friedrich-Universität
Year of publication: 2022
Pages: 27308-27317
Source/Other editions: IEEE Access : practical research, open solutions, 8 (2020), pp. 27308-27317 - ISSN: 2169-3536
Is version of: 10.1109/ACCESS.2020.2971562
Year of first publication: 2020
Language(s): English
Licence: Creative Commons - CC BY - Attribution 4.0 International 
URN: urn:nbn:de:bvb:473-irb-537896
Abstract: 
Dimensionality reduction is a fundamental and important research topic in the field of machine learning. This paper focuses on a dimensionality reduction technique that exploits semi-supervision information in the form of pairwise constraints; specifically, these constraints specify whether two instances belong to the same class or not. We propose two dual linear methods to accomplish dimensionality reduction in this setting. These two methods overcome the difficulty of simultaneously maximizing the between-class difference and minimizing the within-class difference by transforming the original data into a new space in such a way that the bi-objective problem is (almost) equivalently reduced to a single-objective problem. Empirical evaluations on a broad range of public datasets show that the two proposed methods are superior to several existing methods for semi-supervised dimensionality reduction.
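
To illustrate the general setting described in the abstract (and only the setting; this is not the authors' compression-and-stretching method, whose details are in the linked paper), the sketch below learns a linear projection from must-link and cannot-link pairwise constraints by trading off two pairwise scatter matrices through a generalized eigenproblem, which is a common alternative formulation of the same bi-objective idea. All function and variable names are hypothetical.

```python
# Illustrative sketch only: a generic linear semi-supervised projection learned
# from pairwise constraints, NOT the paper's method. Must-link pairs should end
# up close after projection, cannot-link pairs far apart.
import numpy as np
from scipy.linalg import eigh

def pairwise_projection(X, must_link, cannot_link, n_components=2, reg=1e-6):
    """X: (n_samples, n_features); must_link/cannot_link: lists of index pairs."""
    d = X.shape[1]
    S_ml = np.zeros((d, d))   # scatter of must-link differences (to be minimized)
    S_cl = np.zeros((d, d))   # scatter of cannot-link differences (to be maximized)
    for i, j in must_link:
        diff = (X[i] - X[j])[:, None]
        S_ml += diff @ diff.T
    for i, j in cannot_link:
        diff = (X[i] - X[j])[:, None]
        S_cl += diff @ diff.T
    # Trade off the two objectives as a single generalized eigenproblem:
    # solve S_cl w = lambda (S_ml + reg I) w and keep the leading eigenvectors.
    evals, evecs = eigh(S_cl, S_ml + reg * np.eye(d))
    W = evecs[:, np.argsort(evals)[::-1][:n_components]]
    return X @ W, W

# Toy usage: two noisy clusters with a handful of labeled pairs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(3, 1, (20, 5))])
ml = [(0, 1), (2, 3), (20, 21), (22, 23)]   # same-class pairs
cl = [(0, 20), (1, 22), (3, 25)]            # different-class pairs
Z, W = pairwise_projection(X, ml, cl, n_components=2)
print(Z.shape)  # (40, 2)
```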
GND Keywords: Linear optimization; Task analysis; Cluster analysis; Eigenfunction; Eigenvalue distribution; Principal component analysis; Dimensionality reduction
Keywords: dimensionality reduction, principal component analysis, eigenvalues, eigenfunctions, task analysis, clustering algorithms, linear programming
DDC Classification: 004 Computer science  
RVK Classification: ST 300   
Type: Article
URI: https://fis.uni-bamberg.de/handle/uniba/53789
Release Date: 13 May 2022

File: fisba53789.pdf
Size: 9.29 MB
Format: PDF