Wiley, Journal of Field Robotics, 41(2), pp. 327-346, 2023
DOI: 10.1002/rob.22259
Abstract
Both robot and hand‐eye calibration have been objects of research for decades. While current approaches manage to precisely and robustly identify the parameters of a robot's kinematic model, they still rely on external devices such as calibration objects, markers, and/or external sensors. Instead of trying to fit recorded measurements to a model of a known object, this paper treats robot calibration as an offline SLAM problem, where scanning poses are linked to a fixed point in space via a moving kinematic chain. As such, we enable robot calibration using nothing but an arbitrary eye‐in‐hand depth sensor. To the best of the authors' knowledge, the presented framework is the first solution to three‐dimensional (3D) sensor‐based robot calibration that requires neither external sensors nor reference objects. Our novel approach utilizes a modified version of the Iterative Corresponding Point algorithm to run bundle adjustment on multiple 3D recordings, estimating the optimal parameters of the kinematic model. A detailed evaluation of the system is presented on a real robot with various attached 3D sensors. The results show that the system reaches precision comparable to a dedicated external tracking system at a fraction of its cost.
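The core idea of the abstract can be illustrated with a toy simulation. The sketch below is not the paper's method: it uses a 2-link planar arm, a single fixed world point observed from several poses, and a coarse-to-fine grid search in place of the paper's modified Iterative Corresponding Point bundle adjustment. All numbers (link lengths, poses, the fixed point) are hypothetical. The shared principle is the one the abstract states: every scan of the fixed point must back-project to the same world location through the kinematic chain, so the kinematic parameters (here, the two link lengths) are those that minimize the scatter of the back-projected point.

```python
import math

# Toy 2-link planar arm. Nominal link lengths [m] and the "true" lengths
# the calibration should recover -- all values are hypothetical.
NOMINAL_L = (0.50, 0.40)
TRUE_L = (0.52, 0.38)

def sensor_pose(q1, q2, l1, l2):
    """Forward kinematics: position and heading of the hand-mounted sensor."""
    a1, a2 = q1, q1 + q2
    x = l1 * math.cos(a1) + l2 * math.cos(a2)
    y = l1 * math.sin(a1) + l2 * math.sin(a2)
    return x, y, a2

P_W = (0.2, 0.6)  # fixed world point seen in every scan (hypothetical)

def measure(q1, q2):
    """Simulated depth reading: the fixed point in the sensor frame,
    generated with the TRUE (unknown to the calibrator) link lengths."""
    x, y, th = sensor_pose(q1, q2, *TRUE_L)
    dx, dy = P_W[0] - x, P_W[1] - y
    c, s = math.cos(-th), math.sin(-th)
    return (c * dx - s * dy, s * dx + c * dy)

POSES = [(0.3, 0.9), (0.7, 0.5), (1.1, 0.8), (0.5, 1.2), (0.9, 1.0)]
SCANS = [measure(*q) for q in POSES]

def back_project(q, z, l1, l2):
    """Map a sensor-frame measurement to the world using candidate lengths."""
    x, y, th = sensor_pose(*q, l1, l2)
    c, s = math.cos(th), math.sin(th)
    return (x + c * z[0] - s * z[1], y + s * z[0] + c * z[1])

def spread(l1, l2):
    """Bundle-adjustment-style cost: scatter of the back-projected point
    across all scanning poses (zero iff all scans agree on one world point)."""
    pts = [back_project(q, z, l1, l2) for q, z in zip(POSES, SCANS)]
    mx = sum(p[0] for p in pts) / len(pts)
    my = sum(p[1] for p in pts) / len(pts)
    return sum((p[0] - mx) ** 2 + (p[1] - my) ** 2 for p in pts)

# Coarse-to-fine grid search stands in for the paper's ICP-based optimizer.
best, step = NOMINAL_L, 0.02
for _ in range(6):
    cands = [(best[0] + i * step, best[1] + j * step)
             for i in range(-4, 5) for j in range(-4, 5)]
    best = min(cands, key=lambda d: spread(*d))
    step /= 4

print(best)  # converges to roughly (0.52, 0.38), the simulated true lengths
```

Note that no external reference is needed: the "calibration object" is whatever fixed geometry the sensor happens to see, and consistency of the back-projections alone constrains the model parameters, mirroring the abstract's offline-SLAM formulation.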