Abstract
To efficiently support personal desktop usage patterns, we have to unleash the power of implicit metadata, thereby giving local data a well-defined meaning. To achieve this, contextual information across heterogeneous media types, file formats, and applications should be annotated and linked. In this paper we present a lightweight system that monitors the file structure and automatically generates semantic metadata based on user activities. We underpin the utility of the extracted metadata by showing how it can be leveraged to enhance conventional full-text desktop search.