@inproceedings{SSRR2022,
abstract = {Situational awareness in search and rescue missions is
key to successful operations, e.g., in collapsed
buildings, underground mine shafts, construction
sites, and underwater caves. LiDAR sensors in
robotics play an increasingly important role in this
context, as do robust and application-specific
algorithms for simultaneous localization and mapping
(SLAM). In many of these scenarios, mapping requires
a vertically descended scanning
system. This work presents a mobile system designed
to solve this task, including a SLAM approach for
descended LiDAR sensors with a small field of view
(FoV) that undergo uncontrolled rotation. The SLAM
approach is based on planar polygon matching and is
not limited to the presented scenario. We test the
system by lowering it from a crane inside a tall
building at a fire-fighter school, applying our
offline SLAM approach, and comparing the resulting
point clouds of the environment with ground truth
maps acquired by a terrestrial laser scanner
(TLS). We also compare the SLAM approach to a
state-of-the-art approach with respect to runtime
and accuracy of the resulting maps. Our solution
achieves comparable mapping accuracy at 0.2 of the
runtime.},
address = {Seville, Spain},
author = {Arzberger, F. and Zevering, J. and Bredenbeck, A. and Borrmann, D. and N{\"u}chter, A.},
booktitle = {Proceedings of the IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR '22)},
doi = {10.1109/SSRR56537.2022.10018586},
pages = {122--129},
title = {{M}obile 3D scanning and mapping for freely rotating and vertically descended {LiDAR}},
url = {https://robotik.informatik.uni-wuerzburg.de/telematics/download/ssrr2022.pdf},
year = 2022
}