Vision and Tactile Robotic System to Grasp Litter in Outdoor Environments

Please use this identifier to cite or link this item: http://hdl.handle.net/10045/138009
Item information
Title: Vision and Tactile Robotic System to Grasp Litter in Outdoor Environments
Author(s): Páez Ubieta, Ignacio de Loyola | Castaño Amorós, Julio | Puente Méndez, Santiago T. | Gil, Pablo
Research group(s) or GITE: Automática, Robótica y Visión Artificial
Centre, Department or Service: Universidad de Alicante. Departamento de Física, Ingeniería de Sistemas y Teoría de la Señal
Keywords: Litter detection | Object recognition | Tactile sensing | Tactile learning | Grasping
Publication date: 13-Oct-2023
Publisher: Springer Nature
Bibliographic citation: Journal of Intelligent & Robotic Systems. 2023, 109:36. https://doi.org/10.1007/s10846-023-01930-2
Abstract: The accumulation of litter is increasing in many places and is consequently becoming a problem that must be dealt with. In this paper, we present a robotic manipulation system to collect litter in outdoor environments. This system has three functionalities. Firstly, it uses colour images to detect and recognise litter made of different materials. Secondly, depth data are combined with the pixels of waste objects to compute a 3D location and segment the three-dimensional point cloud of each litter item in the scene. A grasp in 3 Degrees of Freedom (DoFs) is then estimated for a robot arm with a gripper from the segmented cloud of each waste instance. Finally, two tactile-based algorithms are implemented and employed in order to provide the gripper with a sense of touch. This work uses two low-cost vision-based tactile sensors at the fingertips. One of them addresses the detection of contact (obtained from tactile images) between the gripper and solid waste, while the other is designed to detect slippage in order to prevent grasped objects from falling. Our proposal was successfully tested through extensive experimentation with objects varying in size, texture, geometry and material in different outdoor environments (a tiled pavement, a stone/soil surface, and grass). Our system achieved an average detection and Collection Success Rate (CSR) of 94% for overall performance, and of 80% for collecting litter items at the first attempt.
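As a rough illustration of the 3D localisation step summarised in the abstract (combining the pixels of a detected waste object with depth data to obtain a 3D location and an object point cloud), the sketch below shows a standard pinhole-camera back-projection. The function names, camera intrinsics and synthetic data are hypothetical placeholders for illustration only, not the authors' implementation.

```python
import numpy as np

def deproject_pixel(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth in metres to a 3D point
    in the camera frame using the pinhole model."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

def segment_object_cloud(depth_image, mask, fx, fy, cx, cy):
    """Build the point cloud of one detected litter instance from its
    segmentation mask and the registered depth image."""
    vs, us = np.nonzero(mask)          # pixels belonging to the detected object
    depths = depth_image[vs, us]
    valid = depths > 0                 # discard missing depth readings
    pts = [deproject_pixel(u, v, d, fx, fy, cx, cy)
           for u, v, d in zip(us[valid], vs[valid], depths[valid])]
    return np.vstack(pts) if pts else np.empty((0, 3))

if __name__ == "__main__":
    # Hypothetical intrinsics and synthetic depth/mask data.
    fx = fy = 600.0
    cx, cy = 320.0, 240.0
    depth = np.full((480, 640), 0.75)          # flat surface at 0.75 m
    mask = np.zeros((480, 640), dtype=bool)
    mask[200:220, 300:330] = True              # fake litter detection
    cloud = segment_object_cloud(depth, mask, fx, fy, cx, cy)
    centroid = cloud.mean(axis=0)              # rough 3D location for grasping
    print("points:", cloud.shape[0], "centroid (m):", centroid)
```

In the paper's pipeline, the segmented cloud of each instance would then be used to estimate a 3-DoF grasp; the centroid computed here is only a simple stand-in for that step.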
Sponsor(s): Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature. This research was funded by the Valencian Regional Government and FEDER through the PROMETEO/2021/075 project. The computing facilities were provided through the IDIFEFER/2020/003 project.
URI: http://hdl.handle.net/10045/138009
ISSN: 0921-0296 (Print) | 1573-0409 (Online)
DOI: 10.1007/s10846-023-01930-2
Language: eng
Type: info:eu-repo/semantics/article
Rights: © The Author(s) 2023. Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Peer reviewed: yes
Publisher's version: https://doi.org/10.1007/s10846-023-01930-2
Appears in collections: INV - AUROVA - Artículos de Revistas

Files in this item:
File: Paez-Ubieta_etal_2023_JIntellRobotSyst.pdf | Size: 8,81 MB | Format: Adobe PDF


All documents in RUA are protected by copyright. Some rights reserved.