Quantifying the Conceptual Error in Dimensionality Reduction
T. Hanika and J. Hirth. Accepted for publication at ICCS 2021 (2021). arXiv:2106.06815.
Abstract
Dimension reduction of data sets is a standard problem in the realm of
machine learning and knowledge reasoning. Such reductions affect patterns in
and dependencies on data dimensions and ultimately influence any
decision-making processes. Therefore, a wide variety of reduction procedures
are in use, each pursuing different objectives. A criterion not considered so
far is the conceptual continuity of the reduction mapping, i.e., the
preservation of the conceptual structure with respect to the original data
set. Based on the notion of scale-measure from formal concept analysis, we
present in this work a) the theoretical foundations to detect and quantify
conceptual errors in data scalings; and b) an experimental investigation of
our approach on eleven data sets, each treated with a variant of non-negative
matrix factorization.
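To make the setting concrete, here is a minimal sketch of the kind of reduction the abstract refers to: a plain multiplicative-update NMF (Lee–Seung style, an illustrative stand-in for the paper's NMF variant) applied to a small binary object-attribute table, followed by a crude proxy for conceptual error — the fraction of objects whose binarized attribute set (intent) changes after reconstruction. The `intent_change_rate` measure is a hypothetical simplification for illustration only; it is not the scale-measure machinery the paper actually develops.

```python
import numpy as np

def nmf(X, k, iters=200, seed=0):
    """Plain multiplicative-update NMF: X ~ W @ H with W, H >= 0."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, k)) + 1e-3
    H = rng.random((k, m)) + 1e-3
    for _ in range(iters):
        # Lee-Seung updates for the Frobenius objective
        H *= (W.T @ X) / (W.T @ W @ H + 1e-9)
        W *= (X @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

def intent_change_rate(X, X_hat, thresh=0.5):
    """Fraction of objects whose binarized attribute set (intent)
    differs after reconstruction -- a crude conceptual-error proxy,
    not the paper's scale-measure based quantification."""
    A = X >= thresh
    B = X_hat >= thresh
    return float(np.mean(np.any(A != B, axis=1)))

# toy binary object-attribute table (a small formal context)
X = np.array([[1., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 1., 1.],
              [0., 0., 1., 1.]])
W, H = nmf(X, k=2)
err = intent_change_rate(X, W @ H)
```

Any `err > 0` means at least one object's derived attribute set — and hence the concept lattice built from the data — differs between the original and the reduced representation, which is exactly the phenomenon the paper quantifies rigorously.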
@article{hanika2021quantifying,
abstract = {Dimension reduction of data sets is a standard problem in the realm of
machine learning and knowledge reasoning. Such reductions affect patterns in
and dependencies on data dimensions and ultimately influence any
decision-making processes. Therefore, a wide variety of reduction procedures
are in use, each pursuing different objectives. A criterion not considered so
far is the conceptual continuity of the reduction mapping, i.e., the
preservation of the conceptual structure with respect to the original data
set. Based on the notion of scale-measure from formal concept analysis, we
present in this work a) the theoretical foundations to detect and quantify
conceptual errors in data scalings; and b) an experimental investigation of
our approach on eleven data sets, each treated with a variant of non-negative
matrix factorization.},
added-at = {2021-06-15T20:48:56.000+0200},
author = {Hanika, Tom and Hirth, Johannes},
biburl = {https://www.bibsonomy.org/bibtex/23eecb775db5e27cb692accd96dbf464d/tomhanika},
description = {Quantifying the Conceptual Error in Dimensionality Reduction},
interhash = {65f7fa2dcb15148d8964c1ec67674e0c},
intrahash = {3eecb775db5e27cb692accd96dbf464d},
journal = {Accepted for publication at ICCS 2021},
keywords = {2021 closure concept continuous dimension fca myown preprint scaling},
note = {cite arxiv:2106.06815, Accepted for publication at ICCS 2021},
timestamp = {2021-09-23T12:57:34.000+0200},
title = {Quantifying the Conceptual Error in Dimensionality Reduction},
url = {http://arxiv.org/abs/2106.06815},
year = 2021
}