Vision and Tactile Robotic System to Grasp Litter in Outdoor Environments

Please use this identifier to cite or link to this item: http://hdl.handle.net/10045/138009

Full metadata record

DC Field: Value
dc.contributor: Automática, Robótica y Visión Artificial
dc.contributor.author: Páez Ubieta, Ignacio de Loyola
dc.contributor.author: Castaño Amorós, Julio
dc.contributor.author: Puente Méndez, Santiago T.
dc.contributor.author: Gil, Pablo
dc.contributor.other: Universidad de Alicante. Departamento de Física, Ingeniería de Sistemas y Teoría de la Señal
dc.date.accessioned: 2023-10-19T10:16:23Z
dc.date.available: 2023-10-19T10:16:23Z
dc.date.issued: 2023-10-13
dc.identifier.citation: Journal of Intelligent & Robotic Systems. 2023, 109:36. https://doi.org/10.1007/s10846-023-01930-2
dc.identifier.issn: 0921-0296 (Print)
dc.identifier.issn: 1573-0409 (Online)
dc.identifier.uri: http://hdl.handle.net/10045/138009
dc.description.abstract: The accumulation of litter is increasing in many places and is consequently becoming a problem that must be dealt with. In this paper, we present a robotic manipulator system to collect litter in outdoor environments. This system has three functionalities. Firstly, it uses colour images to detect and recognise litter made of different materials. Secondly, depth data are combined with the pixels of waste objects to compute a 3D location and segment the three-dimensional point cloud of each litter item in the scene. A grasp in 3 Degrees of Freedom (DoF) is then estimated for a robot arm with a gripper from the segmented cloud of each waste instance. Finally, two tactile-based algorithms are implemented and employed to provide the gripper with a sense of touch, using two low-cost visual-based tactile sensors at the fingertips. One algorithm detects contact between the gripper and solid waste from tactile images, while the other detects slippage to prevent the grasped objects from falling. Our proposal was successfully tested in extensive experiments with objects varying in size, texture, geometry and material in different outdoor environments (tiled pavement, a stone/soil surface, and grass). Our system achieved an average score of 94% for detection and Collection Success Rate (CSR) overall, and of 80% for the collection of litter items at the first attempt.
dc.description.sponsorship: Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature. Research work was funded by the Valencian Regional Government and FEDER through the PROMETEO/2021/075 project. The computer facilities were provided through the IDIFEFER/2020/003 project.
dc.language: eng
dc.publisher: Springer Nature
dc.rights: © The Author(s) 2023. Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
dc.subject: Litter detection
dc.subject: Object recognition
dc.subject: Tactile sensing
dc.subject: Tactile learning
dc.subject: Grasping
dc.title: Vision and Tactile Robotic System to Grasp Litter in Outdoor Environments
dc.type: info:eu-repo/semantics/article
dc.peerreviewed: si
dc.identifier.doi: 10.1007/s10846-023-01930-2
dc.relation.publisherversion: https://doi.org/10.1007/s10846-023-01930-2
dc.rights.accessRights: info:eu-repo/semantics/openAccess
Appears in collections: INV - AUROVA - Artículos de Revistas

Files in this item:

File: Paez-Ubieta_etal_2023_JIntellRobotSyst.pdf | Size: 8.81 MB | Format: Adobe PDF


All documents in RUA are protected by copyright. Some rights reserved.