A novel robotic grasp framework for accurate grasping under complex packaging factory environments

Guirong Dong, Fuqiang Zhang, Xin Li, Zonghui Yang, Dianzi Liu

Research output: Contribution to journal › Article › peer-review

Abstract

As grasping behaviors in real packaging scenarios are prone to various disturbances, visual grasp prediction systems have suffered from poor robustness and low detection accuracy. In this study, an intelligent robotic grasp framework (RTnet) underpinned by a linear global attention mechanism has been proposed to achieve highly robust robot grasp prediction in real packaging factory scenarios. First, to reduce computational resource requirements, an optimized linear attention mechanism has been developed for the robotic grasping process. Then, a local window-shifting algorithm has been adapted to collect local feature information, which is then integrated into global features through a hierarchical design of up- and down-sampling. To further equip the developed framework with the capability to mitigate noise interference, a self-normalizing feature architecture has been established to strengthen its robust learning capabilities. Moreover, a grasping dataset collected in a real operational environment (RealCornell) has been generated to enable the transition to real grasping scenarios. To evaluate the performance of the proposed model, its grasp prediction has been experimentally examined on the Cornell dataset, the RealCornell dataset, and in real scenarios. Results have shown that RTnet achieves a maximum accuracy of 98.31% on the Cornell dataset and 93.87% on the more complex RealCornell dataset. In real packaging situations, the proposed model has also demonstrated high levels of accuracy and robustness in grasping detection. In summary, RTnet provides valuable insight into the advanced deployment and implementation of robotic grasping in the packaging industry.
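The abstract refers to a linear global attention mechanism used to reduce computational cost. As a rough illustration of how linear (kernelized) attention replaces quadratic softmax attention with an O(n) formulation, the sketch below shows a generic PyTorch block. It is not the RTnet implementation described in the paper; the class name, layer sizes, head count, and the ELU+1 feature map are all illustrative assumptions.

```python
# Minimal sketch of a linear (kernelized) attention block in PyTorch.
# NOT the authors' RTnet code; all names and hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LinearAttention(nn.Module):
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.heads = heads
        self.to_qkv = nn.Linear(dim, dim * 3, bias=False)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim), e.g. flattened image patches
        b, n, d = x.shape
        h = self.heads
        q, k, v = self.to_qkv(x).chunk(3, dim=-1)
        # Split into heads: (b, h, n, d_head)
        q, k, v = (t.reshape(b, n, h, d // h).transpose(1, 2) for t in (q, k, v))
        # Positive feature map replaces the softmax, enabling linear complexity
        q, k = F.elu(q) + 1, F.elu(k) + 1
        # Aggregate keys and values first: (b, h, d_head, d_head)
        kv = torch.einsum("bhnd,bhne->bhde", k, v)
        # Per-query normalizer: (b, h, n)
        z = torch.einsum("bhnd,bhd->bhn", q, k.sum(dim=2)).clamp(min=1e-6)
        out = torch.einsum("bhnd,bhde->bhne", q, kv) / z.unsqueeze(-1)
        out = out.transpose(1, 2).reshape(b, n, d)
        return self.proj(out)
```

Because the key-value product is computed before interacting with the queries, memory and compute scale linearly with the number of tokens rather than quadratically, which is the general motivation for linear attention in resource-constrained settings such as the one described above.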
Original language: English
Journal: IEEE Access
Early online date: 24 Sep 2024
DOIs
Publication status: E-pub ahead of print - 24 Sep 2024
