@inproceedings{BarzMoniriEtAl16UBICOMP,
  abstract     = {In this paper, we describe a multimodal-multisensor annotation tool for physiological computing to which, for example, mobile gesture-based interaction devices or health monitoring devices can be connected. It is intended as an expert authoring tool for annotating multiple video-based sensor streams for domain-specific activities. The resulting datasets can serve as supervised training data for new machine learning tasks. Our tool provides connectors to commercially available sensor systems (e.g., the Intel RealSense F200 3D camera, Leap Motion, and Myo) and a graphical user interface for annotation.},
address = {New York},
author = {Barz, Michael and Moniri, Mohammad Mehdi and Weber, Markus and Sonntag, Daniel},
booktitle = {Adjunct Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '16), Heidelberg, Germany},
doi = {10.1145/2968219.2971459},
isbn = {978-1-4503-4462-3},
  keywords     = {multimodal interaction, sensor data, annotation tool, machine learning},
pages = {17--20},
publisher = {ACM},
title = {Multimodal Multisensor Activity Annotation Tool},
year = 2016
}