Semi-Supervised Dimensionality Reduction by Linear Compression and Stretching
Long, Zhiguo; Meng, Hua; Sioutis, Michael (2022): Semi-Supervised Dimensionality Reduction by Linear Compression and Stretching, in: Bamberg: Otto-Friedrich-Universität, pp. 27308–27317.
Year of publication:
2022
Pages:
27308-27317
Source/Other editions:
IEEE Access : practical research, open solutions, 8 (2020), pp. 27308-27317. ISSN: 2169-3536
Year of first publication:
2020
Language:
English
Abstract:
Dimensionality reduction is a fundamental and important research topic in the field of machine learning. This paper focuses on a dimensionality reduction technique that exploits semi-supervised information in the form of pairwise constraints; specifically, these constraints specify whether or not two instances belong to the same class. We propose two dual linear methods to accomplish dimensionality reduction in this setting. These two methods overcome the difficulty of simultaneously maximizing the between-class difference and minimizing the within-class difference by transforming the original data into a new space in such a way that the bi-objective problem is (almost) equivalently reduced to a single-objective problem. Empirical evaluations on a broad range of public datasets show that the two proposed methods are superior to several existing methods for semi-supervised dimensionality reduction.
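The bi-objective described in the abstract (stretching apart instances linked by cannot-link constraints while compressing instances linked by must-link constraints) can be made concrete with a generic constraint-based linear projection. The sketch below is not the paper's compression-and-stretching method; it is a minimal Python illustration (assuming NumPy) of the common generalized-eigenvalue formulation that such semi-supervised methods refine, and the names pairwise_scatter and constrained_linear_dr are hypothetical.

import numpy as np

def pairwise_scatter(X, pairs):
    # Sum of outer products of differences over the given index pairs.
    S = np.zeros((X.shape[1], X.shape[1]))
    for i, j in pairs:
        d = X[i] - X[j]
        S += np.outer(d, d)
    return S

def constrained_linear_dr(X, must_link, cannot_link, k, eps=1e-6):
    # Project X (n x d) to k dimensions: stretch cannot-link pairs apart,
    # compress must-link pairs, via a generalized eigenvalue problem.
    S_ml = pairwise_scatter(X, must_link)      # within-class scatter (to minimize)
    S_cl = pairwise_scatter(X, cannot_link)    # between-class scatter (to maximize)
    # Regularize S_ml so the linear solve is well defined
    A = np.linalg.solve(S_ml + eps * np.eye(X.shape[1]), S_cl)
    eigvals, eigvecs = np.linalg.eig(A)
    order = np.argsort(-eigvals.real)[:k]      # top-k directions
    W = eigvecs[:, order].real                 # d x k projection matrix
    return X @ W, W

# Tiny usage example with random data and a handful of constraints
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
must_link = [(0, 1), (2, 3), (4, 5)]
cannot_link = [(0, 10), (2, 20), (4, 30)]
Z, W = constrained_linear_dr(X, must_link, cannot_link, k=2)
print(Z.shape)  # (50, 2)

In this baseline formulation the two objectives are combined as a ratio and solved jointly; the paper's contribution, per the abstract, is instead to transform the data so that the bi-objective problem reduces (almost) equivalently to a single-objective one.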
GND Keywords:
Lineare Optimierung
Aufgabenanalyse
Cluster-Analyse
Eigenfunktion
Eigenwertverteilung
Hauptkomponentenanalyse
Dimensionsreduktion
Keywords:
dimensionality reduction
principal component analysis
eigenvalues
eigenfunctions
task analysis
clustering algorithms
linear programming
Type:
Article
Activation date:
May 13, 2022
Permalink:
https://fis.uni-bamberg.de/handle/uniba/53789