Generalization in Clustering with Unobserved
Features
E. Krupka, and N. Tishby. Advances in Neural Information Processing Systems 18, MIT Press, Cambridge, MA, (2006)
Abstract
We argue that when objects are characterized by many
attributes, clustering them on the basis of a
relatively small random subset of these attributes can
capture information on the unobserved attributes as
well. Moreover, we show that under mild technical
conditions, clustering the objects on the basis of such
a random subset performs almost as well as clustering
with the full attribute set. We prove finite sample
generalization theorems for this novel learning
scheme that extend analogous results from the
supervised learning setting. The scheme is demonstrated
for collaborative filtering of users with movie ratings
as attributes.
%0 Book Section
%1 krupka-generalization-clustering-unobserved-2005
%A Krupka, Eyal
%A Tishby, Naftali
%B Advances in Neural Information Processing Systems 18
%C Cambridge, MA
%D 2006
%E Weiss, Y.
%E Schölkopf, B.
%E Platt, J.
%I MIT Press
%K clustering subspace
%P 683--690
%T Generalization in Clustering with Unobserved
Features
%X We argue that when objects are characterized by many
attributes, clustering them on the basis of a
relatively small random subset of these attributes can
capture information on the unobserved attributes as
well. Moreover, we show that under mild technical
conditions, clustering the objects on the basis of such
a random subset performs almost as well as clustering
with the full attribute set. We prove finite sample
generalization theorems for this novel learning
scheme that extend analogous results from the
supervised learning setting. The scheme is demonstrated
for collaborative filtering of users with movie ratings
as attributes.
@incollection{krupka-generalization-clustering-unobserved-2005,
abstract = {We argue that when objects are characterized by many
attributes, clustering them on the basis of a
relatively small random subset of these attributes can
capture information on the unobserved attributes as
well. Moreover, we show that under mild technical
conditions, clustering the objects on the basis of such
a random subset performs almost as well as clustering
with the full attribute set. We prove finite sample
generalization theorems for this novel learning
scheme that extend analogous results from the
supervised learning setting. The scheme is demonstrated
for collaborative filtering of users with movie ratings
as attributes.},
added-at = {2011-10-20T15:20:36.000+0200},
address = {Cambridge, MA},
author = {Krupka, Eyal and Tishby, Naftali},
biburl = {https://www.bibsonomy.org/bibtex/2077bdfb9268f818d64749437b5979742/mhwombat},
booktitle = {Advances in Neural Information Processing Systems 18},
editor = {Weiss, Y. and Sch{\"{o}}lkopf, B. and Platt, J.},
interhash = {34e90a60b4504f416287fb85d9fa17ff},
intrahash = {077bdfb9268f818d64749437b5979742},
keywords = {clustering subspace},
pages = {683--690},
publisher = {MIT Press},
timestamp = {2016-07-12T19:25:30.000+0200},
title = {Generalization in Clustering with Unobserved
Features},
year = 2006
}