Published in: Wiley, Journal of Field Robotics, 2(41), p. 327-346, 2023

DOI: 10.1002/rob.22259

Robot self‐calibration using actuated 3D sensors

Journal article published in 2023 by Arne Peters and Alois C. Knoll.
This paper was not found in any repository, but could be made available legally by the author.

Full text: Unavailable

Preprint: archiving allowed
Postprint: archiving restricted
Published version: archiving forbidden

Data provided by SHERPA/RoMEO

Abstract

Both robot and hand‐eye calibration have been objects of research for decades. While current approaches manage to identify the parameters of a robot's kinematic model precisely and robustly, they still rely on external devices such as calibration objects, markers, and/or external sensors. Instead of trying to fit recorded measurements to a model of a known object, this paper treats robot calibration as an offline SLAM problem, in which scanning poses are linked to a fixed point in space via a moving kinematic chain. This enables robot calibration using nothing but an arbitrary eye‐in‐hand depth sensor. To the authors' best knowledge, the presented framework is the first solution to three‐dimensional (3D) sensor‐based robot calibration that requires neither external sensors nor reference objects. Our novel approach utilizes a modified version of the Iterative Corresponding Point algorithm to run bundle adjustment on multiple 3D recordings, estimating the optimal parameters of the kinematic model. A detailed evaluation of the system on a real robot with various attached 3D sensors is presented. The results show that the system reaches a precision comparable to that of a dedicated external tracking system at a fraction of its cost.
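The core idea of the abstract — that scans of the same static scene, taken through the robot's kinematic chain at different poses, only coincide in the world frame when the calibration parameters are correct — can be illustrated with a deliberately tiny toy problem. The sketch below is not the paper's method (which uses a modified Iterative Corresponding Point algorithm and full bundle adjustment over many kinematic parameters): it shrinks the setup to a single revolute joint about z, with one unknown sensor-mounting translation, and replaces the optimizer with a grid search. All names and values are hypothetical.

```python
import numpy as np

def rot_z(theta):
    """Rotation matrix about the z-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Toy setup (hypothetical): one revolute joint rotates an eye-in-hand depth
# sensor about z. The unknown calibration parameter is the sensor's mounting
# translation t in the flange frame, so a world point p_w is seen by the
# sensor as p_s = R(a)^T @ p_w - t_true at joint angle a.
rng = np.random.default_rng(42)
scene = rng.uniform(-1.0, 1.0, size=(200, 3))   # fixed points in the world
t_true = np.array([0.10, -0.04, 0.0])           # ground-truth offset (toy)
angles = [0.2, 0.9, 1.7]                        # three scanning poses

# Simulated sensor-frame scans (row-vector convention: R^T p_w == p_w @ R).
scans = [scene @ rot_z(a) - t_true for a in angles]

def world_error(t_est):
    """Sum of squared mismatches between scans re-projected to the world
    frame with a candidate offset. Zero only when t_est is correct (in the
    observable x/y components), since p_w_est = R(a) @ (p_s + t_est)."""
    worlds = [(s + t_est) @ rot_z(a).T for s, a in zip(scans, angles)]
    return sum(np.sum((w - worlds[0]) ** 2) for w in worlds[1:])

# A coarse 2D grid search over (x, y) stands in for the paper's bundle
# adjustment; the z component is unobservable under pure z-rotations.
grid = np.linspace(-0.2, 0.2, 81)
best = min(((x, y) for x in grid for y in grid),
           key=lambda t: world_error(np.array([t[0], t[1], 0.0])))
```

With exact (noise-free) scans the grid search recovers the planar components of `t_true`, because the scans only align in the world frame at the true offset. The real system faces sensor noise, many more parameters, and unknown point correspondences, which is why it needs the ICP-style correspondence search and bundle adjustment described above.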