Scalable Task and Motion Planning for Multi-Robot Systems in Obstacle-Rich Environments
W. Hönig. Proceedings of the 17th International Conference on Autonomous Agents and MultiAgent Systems, pp. 1746–1751. Richland, SC, International Foundation for Autonomous Agents and Multiagent Systems, (2018)
Abstract
Motion planning problems have been studied in both the artificial intelligence (AI) and robotics communities. AI solvers can compute plans for hundreds of simple agents in minutes with suboptimality guarantees, while robotics solutions typically include richer kinodynamic models during planning, but are very slow when many robots and obstacles are taken into account. We combine the advantages of the two methods by using a two-step approach. First, we use and extend AI solvers for a simplified coordination problem. The output is a discrete plan that cannot be executed on real robots. Second, we apply a computationally efficient post-processing step that creates a continuous plan, taking kinodynamic constraints into account. We show examples for ground robots in a warehouse domain and quadrotors that are tasked with formation change.
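The two-step approach summarized above (a discrete AI solver for the coordination problem, followed by a post-processing step that produces an executable continuous plan) can be illustrated with a toy sketch. This is not the paper's actual algorithm: step one stands in for the AI solver with simple prioritized BFS on a grid using a vertex reservation table (edge-swap conflicts are ignored for brevity), and step two is a minimal post-processing pass that time-parameterizes the discrete plan under an assumed velocity limit `v_max`. All names and parameters here are illustrative assumptions.

```python
# Toy sketch of a two-step multi-robot planning pipeline (illustrative only;
# not the method from the paper). Step 1: discrete prioritized planning.
# Step 2: post-processing into a time-parameterized continuous plan.
from collections import deque

def discrete_plan(grid, starts, goals):
    """Prioritized BFS on a 4-connected grid (0 = free, 1 = obstacle).
    A reservation table avoids vertex conflicts; edge swaps are not checked."""
    reserved = set()  # (cell, timestep) pairs claimed by earlier agents
    plans = []
    rows, cols = len(grid), len(grid[0])
    horizon = rows * cols * 2  # generous cap on plan length
    for start, goal in zip(starts, goals):
        frontier = deque([(start, 0, [start])])
        seen = {(start, 0)}
        path = None
        while frontier:
            (r, c), t, p = frontier.popleft()
            if (r, c) == goal:
                path = p
                break
            if t >= horizon:
                continue
            # (0, 0) lets an agent wait in place for one timestep.
            for dr, dc in ((0, 0), (0, 1), (0, -1), (1, 0), (-1, 0)):
                nr, nc = r + dr, c + dc
                state = ((nr, nc), t + 1)
                if (0 <= nr < rows and 0 <= nc < cols
                        and grid[nr][nc] == 0
                        and state not in reserved
                        and state not in seen):
                    seen.add(state)
                    frontier.append(((nr, nc), t + 1, p + [(nr, nc)]))
        if path is None:
            raise RuntimeError("no conflict-free path found")
        for t, cell in enumerate(path):
            reserved.add((cell, t))
        plans.append(path)
    return plans

def continuous_plan(path, v_max=0.5, cell_size=1.0):
    """Post-process a discrete path into (time, x, y) samples: every step
    (move or wait) takes cell_size / v_max seconds, so a robot moving one
    cell per step never exceeds the speed limit v_max."""
    dt = cell_size / v_max
    return [(i * dt, float(r), float(c)) for i, (r, c) in enumerate(path)]
```

In this sketch the robots swap corners of an empty grid: the reservation table forces the second agent to route around the first, and the post-processing step stretches each discrete step to a fixed duration so the velocity bound holds. The real system replaces both steps with far richer machinery (suboptimality-bounded multi-agent solvers and kinodynamically feasible trajectory generation).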
@inproceedings{hoenig2018planning,
abstract = {Motion planning problems have been studied in both the artificial intelligence (AI) and robotics communities. AI solvers can compute plans for hundreds of simple agents in minutes with suboptimality guarantees, while robotics solutions typically include richer kinodynamic models during planning, but are very slow when many robots and obstacles are taken into account. We combine the advantages of the two methods by using a two-step approach. First, we use and extend AI solvers for a simplified coordination problem. The output is a discrete plan that cannot be executed on real robots. Second, we apply a computationally efficient post-processing step that creates a continuous plan, taking kinodynamic constraints into account. We show examples for ground robots in a warehouse domain and quadrotors that are tasked with formation change.},
address = {Richland, SC},
author = {H\"{o}nig, Wolfgang},
biburl = {https://www.bibsonomy.org/bibtex/248154ce85ac1e3e87a8326e4653b8107/porta},
booktitle = {Proceedings of the 17th International Conference on Autonomous Agents and MultiAgent Systems},
keywords = {agent i40 mas motion multi planning robotics system task},
location = {Stockholm, Sweden},
numpages = {6},
pages = {1746--1751},
publisher = {International Foundation for Autonomous Agents and Multiagent Systems},
series = {AAMAS '18},
title = {Scalable Task and Motion Planning for Multi-Robot Systems in Obstacle-Rich Environments},
year = 2018
}