
An automated pick-and-place benchmarking system in robotics

2018 IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC), pages 243-249, April 2018.
DOI: 10.1109/ICARSC.2018.8374190

Abstract

The reproducible and objective assessment of a robot's manipulation ability is a crucial topic in robotics, yet defining testing protocols and automating such test procedures is challenging. Most evaluation protocols in benchmarks and competitions involve interactions with a human referee that are not fully standardized and are further influenced by partly subjective criteria that vary across research areas. A promising and more objective solution is to automate benchmarking protocols for a range of reproducible manipulation tasks. We present an approach reduced to the essential elements, in which most of the procedure and evaluation is automated. The proposed system requires little cost and effort and allows researchers to compare their robots without placing them side by side. To allow a wide range of robots to be benchmarked, all parts will be released open source. In our system, an object is positioned in front of the benchmarked robot by a mobile mini-robot that localizes itself on a table according to each task's predefined positions. By automating most of the process, we ensure that the tested object locations and the timings are reproducible and well documented. This approach fosters the objective comparability of robot performances.
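To make the automated protocol concrete, below is a minimal Python sketch of the kind of trial loop the abstract describes: a mini-robot places the object at predefined poses, and each trial's object location, outcome, and timing are logged for reproducibility. All names, the pose values, and the two callback interfaces (place_object, attempt_pick) are illustrative assumptions, not the paper's actual software.

    import json
    import time
    from dataclasses import dataclass, asdict


    @dataclass
    class TrialRecord:
        pose_id: int       # index of the predefined object pose
        x_mm: float        # object position on the table (assumed units)
        y_mm: float
        success: bool      # did the benchmarked robot complete the pick?
        duration_s: float  # wall-clock time for the attempt


    # Predefined object poses (x, y) in mm -- illustrative values only.
    PREDEFINED_POSES = [(100.0, 50.0), (150.0, 80.0), (200.0, 120.0)]


    def run_benchmark(place_object, attempt_pick, log_path="benchmark_log.json"):
        """Run one trial per predefined pose and write a reproducible log.

        place_object(x, y): commands the mini-robot to position the object.
        attempt_pick(): triggers the benchmarked robot; returns True on success.
        Both callbacks are assumed interfaces, not part of the paper's system.
        """
        records = []
        for pose_id, (x, y) in enumerate(PREDEFINED_POSES):
            place_object(x, y)            # mini-robot sets up the trial
            start = time.monotonic()
            success = attempt_pick()      # benchmarked robot acts
            duration = time.monotonic() - start
            records.append(TrialRecord(pose_id, x, y, success, duration))

        # Persist locations, outcomes, and timings so runs can be compared
        # across labs without placing the robots side by side.
        with open(log_path, "w") as f:
            json.dump([asdict(r) for r in records], f, indent=2)
        return records

Because every trial is driven from the same predefined pose list and logged automatically, two labs running this loop would test identical object locations with documented timings, which is the comparability argument the abstract makes.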
