Longchuan Niu: Leveraging 3D robotic vision techniques to improve the visual perception of heavy-duty manipulators in challenging application scenarios
In the remote handling scenario, replacing ITER fusion reactor components requires thousands of tool operations with clearances of only 3 mm. Safe and accurate remote handling operations require robust and precise pose estimation of the target objects. However, images captured from the scene may be corrupted by acquisition noise due to the harsh environment inside the ITER reactor.
“Consequently, the 3D point cloud reconstructed from stereo images may contain unexpected artefacts, even though the target objects were ITER reactor components whose 3D CAD models were available in advance. However, these components were subject to small drifts in their 6-DOF poses due to extreme heat and high magnetic fields inside the ITER reactor. Thus, a virtual reality representation of the remote handling environment may not reflect the actual scene accurately”, says Longchuan Niu.
To enable precise and reliable remote handling maintenance operations, Niu’s thesis charts the development of a robust and accurate 3D visual perception system. Because of the constraints imposed by the harsh ITER conditions, the research methods applicable in such contexts are limited, and the study had to overcome various challenges to improve the robustness of the target object pose estimation technique.
“To achieve this goal, several design efforts for the stereoscopic visual perception system were presented, such as an eye-in-hand configuration, a robust depth estimation method for the accurate 3D reconstruction of target objects from low-resolution grayscale stereo images, a variety of approaches for outlier removal, and a novel edge-point ICP algorithm for robust pose estimation”, Niu explains.
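To give a flavour of the edge-point ICP idea, the following is a minimal, self-contained sketch, not Niu's actual implementation: scene edge points are iteratively matched to CAD-model edge points by nearest neighbour, distant correspondences are rejected as outliers, and the rigid transform is solved with the standard SVD (Kabsch) method. All function names and the `reject_dist` threshold are illustrative assumptions.

```python
import numpy as np

def best_fit_transform(src, dst):
    """Least-squares rigid transform (Kabsch/SVD) mapping src onto dst."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t

def edge_point_icp(edge_pts, model_edge_pts, iters=30, reject_dist=0.005):
    """Toy edge-point ICP: align scene edge points to CAD-model edge points,
    discarding correspondences farther than reject_dist (outlier rejection)."""
    R_tot, t_tot = np.eye(3), np.zeros(3)
    pts = edge_pts.copy()
    for _ in range(iters):
        # brute-force nearest neighbours in the model edge cloud
        d2 = ((pts[:, None, :] - model_edge_pts[None, :, :]) ** 2).sum(-1)
        idx = d2.argmin(axis=1)
        dist = np.sqrt(d2[np.arange(len(pts)), idx])
        keep = dist < reject_dist
        if keep.sum() < 3:            # too few inliers to fit a transform
            break
        R, t = best_fit_transform(pts[keep], model_edge_pts[idx[keep]])
        pts = pts @ R.T + t           # apply the incremental transform
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
    return R_tot, t_tot
```

Restricting ICP to edge points, as the thesis describes, concentrates the alignment on geometrically distinctive features, which helps on low-texture metallic surfaces where dense point matches are unreliable.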
As a research outcome, demonstrations of demanding remote handling operations show that the developed system can cope with the limitations imposed by the harsh ITER environment, such as image acquisition with a low-resolution, grayscale, radiation-tolerant camera; a high level of image noise due to radiation; non-Lambertian reflectance from the shiny metallic surfaces of reactor elements; and deficient illumination of the scene due to constraints on the available light sources.
“Overall, the developed visual perception system meets generic ITER requirements and can significantly improve the RH operator experience. It not only assists the human operator in locating remote objects quickly and accurately, but also ensures that RH tasks are performed efficiently and safely, while reducing operator stress”, Niu says.
In the mining industry scenario, object detection methods that rely on predefined CAD models are infeasible, as rocks do not possess a regular shape or specific surface geometry. Moreover, developing an autonomous rock breaking system requires advanced robotic visual perception capable of instantly detecting and localizing overlapped rocks in a cluttered scene under dynamic outdoor conditions.
The study adopted a ZED stereo camera for its high resolution and compact size. It also leveraged recent advances in CNN-based deep learning models, which can aggregate the features of a full RGB image regardless of scene complexity, so that object detection can be performed at a granular, regional level of the image.

The thesis presents significant work in deploying a real-time 3D visual perception system (VPS) for an autonomous robotic application, involving data preparation, enhanced training of a deep learning model, the proposal and implementation of a novel 3D rock detection pipeline, and the design and implementation of an innovative rock breaking mechanism. Overall, the proposed robotic VPS meets the requirements of the mining industry with its average rock detection rate (97.61%), real-time performance (11.76 Hz), and capability for autonomous rock breaking without any human intervention. The results offer a clear indication of the technological readiness of such a system.
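One step in any such 3D rock detection pipeline is back-projecting the pixels of a detected rock mask into 3D camera coordinates using the stereo depth map. The sketch below is a hypothetical illustration of that step only, not Niu's pipeline; the function name, the pinhole intrinsics (`fx`, `fy`, `cx`, `cy`), and the use of the centroid as a candidate breaking point are all assumptions for the example.

```python
import numpy as np

def backproject_rock(mask, depth, fx, fy, cx, cy):
    """Back-project the pixels of a detected rock mask into 3D camera
    coordinates (pinhole model) and return the point cloud plus its
    centroid as a candidate breaking point."""
    v, u = np.nonzero(mask)               # pixel rows (v) and columns (u)
    z = depth[v, u]                       # metric depth per masked pixel
    valid = np.isfinite(z) & (z > 0)      # drop holes in the depth map
    u, v, z = u[valid], v[valid], z[valid]
    x = (u - cx) * z / fx                 # pinhole back-projection
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=1)
    return pts, pts.mean(axis=0)
```

In a real system the mask would come from the CNN's instance segmentation output and the depth map from the stereo camera's SDK; overlapping rocks in a cluttered pile are then separated per-instance before each is back-projected.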
The doctoral dissertation of MSc (Tech) Longchuan Niu in the field of Engineering Science titled Improving the Visual Perception of Heavy Duty Manipulators in Challenging Scenarios will be publicly examined in the Faculty of Engineering and Natural Sciences at Tampere University at 12 o’clock on Friday 4th of December 2020 on Hervanta Campus in Konetalo building, auditorium K1702 (Korkeakoulunkatu 6, Tampere). The Opponent will be Professor Arto Visala from Aalto University. The Custos will be Professor Jouni Mattila.
The event can be followed via remote connection.
Questions from the public are to be sent by text to +358407679904.
The dissertation is available online at http://urn.fi/URN:ISBN:978-952-03-1797-3