PointNetPGAP-SLC: a 3D LiDAR-based place recognition approach with segment-level consistency training for mobile robots in horticulture
3D LiDAR-based place recognition remains largely underexplored in horticultural environments, which present unique challenges due to their semi-permeable nature to laser beams. This characteristic often results in highly similar LiDAR scans from adjacent rows, leading to descriptor ambiguity and, consequently, compromised retrieval performance. In this work, we address the challenges of 3D LiDAR place recognition in horticultural environments, particularly inter-row ambiguity, by introducing three key contributions: (i) a novel model, PointNetPGAP, which combines the outputs of two statistically-inspired aggregators into a single descriptor; (ii) a Segment-Level Consistency (SLC) model, used exclusively during training to enhance descriptor robustness; and (iii) the HORTO-3DLM dataset, comprising LiDAR sequences from orchards and strawberry fields. Experimental evaluations conducted on the HORTO-3DLM and KITTI Odometry datasets demonstrate that PointNetPGAP outperforms state-of-the-art models, including OverlapTransformer and PointNetVLAD, particularly when the SLC model is applied. These results underscore the model's superiority, especially in horticultural environments, where it significantly improves retrieval performance in segments with higher ambiguity. The dataset and the code will be made publicly available at https://github.com/Cybonic/PointNetPGAP-SLC.git.
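To make the aggregation idea in the abstract concrete, below is a minimal sketch of a descriptor head that fuses two statistical aggregators over per-point features. It assumes, purely for illustration, that the two "statistically-inspired aggregators" are a first-order (mean) pooling and a second-order (covariance-like) pooling; the class name PGAPHead, all dimensions, and the fusion-by-concatenation step are assumptions of this sketch, not the paper's confirmed design. Consult the linked repository for the actual implementation.

# Hypothetical sketch: fusing a first-order (mean) and a second-order
# (covariance-like) aggregator into one place-recognition descriptor.
# Names and dimensions are illustrative, not taken from the paper.
import torch
import torch.nn as nn

class PGAPHead(nn.Module):
    """Aggregates per-point features into a single global descriptor."""
    def __init__(self, feat_dim: int = 64, out_dim: int = 256):
        super().__init__()
        # Project the concatenated statistics down to a fixed-size descriptor.
        self.proj = nn.Linear(feat_dim + feat_dim * feat_dim, out_dim)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, N, D) per-point features from a PointNet-style backbone.
        mu = feats.mean(dim=1)                         # (B, D) first-order statistic
        centered = feats - mu.unsqueeze(1)             # zero-mean features
        # (B, D, D) covariance-like second-order statistic.
        cov = torch.einsum("bnd,bne->bde", centered, centered) / feats.shape[1]
        desc = torch.cat([mu, cov.flatten(1)], dim=1)  # fuse the two aggregators
        return nn.functional.normalize(self.proj(desc), dim=1)  # unit-norm descriptor

# Usage: descriptors for a batch of 2 scans with 4096 points each.
head = PGAPHead()
point_feats = torch.randn(2, 4096, 64)
descriptor = head(point_feats)                         # shape: (2, 256)

Unit-normalizing the output lets retrieval reduce to a nearest-neighbor search under cosine similarity, which is the standard setup for place-recognition descriptors.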
School
- Aeronautical, Automotive, Chemical and Materials Engineering
Department
- Aeronautical and Automotive Engineering
Published in
- IEEE Robotics and Automation Letters
Volume
- 9
Issue
- 11
Pages
- 10471-10478
Publisher
- Institute of Electrical and Electronics Engineers (IEEE)
Version
- AM (Accepted Manuscript)
Rights holder
- © IEEE
Publisher statement
- This accepted manuscript is made available under the Creative Commons Attribution licence (CC BY) under the JISC UK green open access agreement.
Acceptance date
- 2024-09-18
Publication date
- 2024-10-07
Copyright date
- 2024
eISSN
- 2377-3766
Publisher version
Language
- en