UnrealROX+: An Improved Tool for Acquiring Synthetic Data from Virtual 3D Environments

Please use this identifier to cite or link to this item: http://hdl.handle.net/10045/114587
Full metadata record (DC field: value [language])
dc.contributor: 3D Perception Lab [es_ES]
dc.contributor.author: Martínez González, Pablo
dc.contributor.author: Oprea, Sergiu
dc.contributor.author: Castro-Vargas, John Alejandro
dc.contributor.author: Garcia-Garcia, Alberto
dc.contributor.author: Orts-Escolano, Sergio
dc.contributor.author: Garcia-Rodriguez, Jose
dc.contributor.author: Vincze, Markus
dc.contributor.other: Universidad de Alicante. Departamento de Tecnología Informática y Computación [es_ES]
dc.date.accessioned: 2021-04-28T16:34:37Z
dc.date.available: 2021-04-28T16:34:37Z
dc.date.created: 2020
dc.date.issued: 2021
dc.identifier.uri: http://hdl.handle.net/10045/114587
dc.description.abstract: Synthetic data generation has become essential in recent years for feeding data-driven algorithms, which have surpassed the performance of traditional techniques in almost every computer vision problem. Gathering and labelling the amount of data needed for these data-hungry models in the real world may become unfeasible and error-prone, while synthetic data offer the possibility of generating huge amounts of data with pixel-perfect annotations. However, most synthetic datasets lack sufficient realism in their rendered images. In that context, the UnrealROX generation tool was presented in 2019, allowing users to generate highly realistic data, at high resolutions and framerates, with an efficient pipeline based on Unreal Engine, a cutting-edge videogame engine. UnrealROX enabled robotic vision researchers to generate realistic and visually plausible data with full ground truth for a wide variety of problems such as class and instance semantic segmentation, object detection, depth estimation, visual grasping, and navigation. Nevertheless, its workflow was tightly coupled to generating image sequences from a robotic on-board camera, making it hard to generate data for other purposes. In this work, we present UnrealROX+, an improved version of UnrealROX whose decoupled and easy-to-use data acquisition system allows data to be designed and generated quickly in a much more flexible and customizable way. Moreover, it is packaged as an Unreal plug-in, which makes it more comfortable to use with existing Unreal projects, and it also includes new features such as albedo generation and a Python API for interacting with the virtual environment from Deep Learning frameworks. [es_ES]
dc.description.sponsorship: Spanish Government PID2019-104818RB-I00 grant for the MoDeaAS project, supported with FEDER funds. This work has also been supported by Spanish national grants for PhD studies FPU17/00166, ACIF/2018/197 and UAFPU2019-13. Experiments were made possible by a generous hardware donation from NVIDIA. [es_ES]
dc.language: eng [es_ES]
dc.rights: © Universitat d'Alacant / Universidad de Alicante. Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0) [es_ES]
dc.subject: Synthetic Data [es_ES]
dc.subject: Data Generation [es_ES]
dc.subject: Simulation [es_ES]
dc.subject: Deep Learning [es_ES]
dc.subject.other: Computer Science and Artificial Intelligence [es_ES]
dc.title: UnrealROX+: An Improved Tool for Acquiring Synthetic Data from Virtual 3D Environments [es_ES]
dc.type: software [es_ES]
dc.peerreviewed: no [es_ES]
dc.relation.publisherversion: https://arxiv.org/abs/2104.11776 [es_ES]
dc.rights.accessRights: info:eu-repo/semantics/openAccess [es_ES]
dc.relation.projectID: info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2017-2020/PID2019-104818RB-I00
dc.relation.projectID: info:eu-repo/grantAgreement/MECD//FPU17%2F00166
dc.rights.holder: Universidad de Alicante
Appears in collections: Registro de Programas de Ordenador y Bases de Datos

Files in this item:

File: UnrealRox+.pdf
Description: https://github.com/3dperceptionlab/unrealrox-plus#unrealrox
Size: 640.41 kB
Format: Adobe PDF


This item is licensed under a Creative Commons License.
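
Note: the abstract above mentions a Python API for interacting with the virtual environment from Deep Learning frameworks. The following is only a minimal, hypothetical sketch of what such a request/response interaction could look like; the class name SimulationClient, the command names, the port, and the JSON-per-line wire format are all assumptions of this sketch and are not taken from the UnrealROX+ code base (see the GitHub repository linked above for the actual API).

    # Hypothetical sketch only: names and protocol are assumptions, not the
    # UnrealROX+ API. It illustrates a training loop steering a virtual camera
    # and pulling annotated frames (RGB, depth, instance masks, albedo).
    import json
    import socket


    class SimulationClient:
        """Minimal JSON-over-TCP client for an assumed simulation endpoint."""

        def __init__(self, host: str = "127.0.0.1", port: int = 9000) -> None:
            self._sock = socket.create_connection((host, port))

        def _request(self, payload: dict) -> dict:
            # Assumed wire format: one JSON object per line in each direction.
            self._sock.sendall(json.dumps(payload).encode("utf-8") + b"\n")
            buffer = b""
            while not buffer.endswith(b"\n"):
                chunk = self._sock.recv(4096)
                if not chunk:  # server closed the connection
                    raise ConnectionError("simulation server closed the connection")
                buffer += chunk
            return json.loads(buffer)

        def set_camera(self, position: list, rotation: list) -> dict:
            # Move the (hypothetical) capture camera inside the virtual scene.
            return self._request({"cmd": "set_camera",
                                  "position": position, "rotation": rotation})

        def capture(self, modalities: list) -> dict:
            # Request a frame with the ground-truth modalities named in the
            # abstract (depth, instance masks, albedo) alongside the RGB image.
            return self._request({"cmd": "capture", "modalities": modalities})


    if __name__ == "__main__":
        client = SimulationClient()
        client.set_camera(position=[120.0, 0.0, 160.0], rotation=[0.0, -15.0, 0.0])
        frame = client.capture(["rgb", "depth", "instance_mask", "albedo"])
        print("received modalities:", sorted(frame))

A decoupled client of this kind is one way a Deep Learning framework could drive data acquisition without being embedded in the engine itself; the actual mechanism used by UnrealROX+ should be checked in its documentation.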