Loughborough University

SG-Grasp: semantic segmentation guided robotic grasp oriented to weakly textured objects based on visual perception sensors

journal contribution
posted on 2024-02-07, 10:30 authored by Ling Tong, Kechen Song, Hongkun Tian, Yi Man, Yunhui Yan, Qinggang Meng
Weakly textured objects are frequently manipulated by industrial and domestic robots, and the two most common types are transparent and reflective objects; however, their unique visual properties present challenges even for advanced grasp detection algorithms. Many existing algorithms rely heavily on depth information, which ordinary red-green-blue and depth (RGB-D) sensors cannot provide accurately for transparent and reflective objects. To overcome this limitation, we propose an innovative solution that uses semantic segmentation to effectively segment weakly textured objects and guide grasp detection. Using only red-green-blue (RGB) images from RGB-D sensors, our segmentation algorithm (RTSegNet) achieves state-of-the-art performance on the newly proposed TROSD dataset. Importantly, our method enables robots to grasp transparent and reflective objects without requiring retraining of the grasp detection network (which is trained solely on the Cornell dataset). Real-world robot experiments demonstrate the robustness of our approach in grasping commonly encountered weakly textured objects; furthermore, results obtained from various datasets validate the effectiveness and robustness of our segmentation algorithm. Code and video are available at: https://github.com/meiguiz/SG-Grasp.

Funding

Research on 3D Dynamic Detection Theory and Identification Method for Surface Defects of Large High-temperature Structural Parts

National Natural Science Foundation of China


Chunhui Plan Cooperative Project of Ministry of Education under Grant HZKY20220433

111 Project under Grant B16009

History

School

  • Science

Department

  • Computer Science

Published in

IEEE Sensors Journal

Volume

23

Issue

22

Pages

28430-28441

Publisher

IEEE

Version

  • AM (Accepted Manuscript)

Rights holder

© IEEE

Publisher statement

© 2023 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

Acceptance date

2023-09-30

Publication date

2023-10-09

Copyright date

2023

ISSN

1530-437X

eISSN

1558-1748

Language

  • en

Depositor

Prof Qinggang Meng. Deposit date: 6 February 2024