Few-Shot Symbol Classification via Self-Supervised Learning and Nearest Neighbor

Please use this identifier to cite or link to this item: http://hdl.handle.net/10045/131396
Item information
Title: Few-Shot Symbol Classification via Self-Supervised Learning and Nearest Neighbor
Author(s): Alfaro-Contreras, María | Ríos-Vila, Antonio | Valero-Mas, Jose J. | Calvo-Zaragoza, Jorge
Research group(s) or GITE: Reconocimiento de Formas e Inteligencia Artificial
Center, Department or Service: Universidad de Alicante. Departamento de Lenguajes y Sistemas Informáticos | Universidad de Alicante. Instituto Universitario de Investigación Informática
Keywords: Symbol Classification | Document Image Analysis | Self-Supervised Learning | Few-Shot Classification
Publication date: 24-Jan-2023
Publisher: Elsevier
Bibliographic citation: Pattern Recognition Letters. 2023, 167: 1-8. https://doi.org/10.1016/j.patrec.2023.01.014
Abstract: The recognition of symbols within document images is one of the most relevant steps in the Document Analysis field. While current state-of-the-art methods based on Deep Learning perform this task adequately, they generally require vast amounts of manually labeled data. In this paper, we propose a self-supervised learning-based method that trains a neural feature extractor on a set of unlabeled documents and then performs recognition using only a few reference samples. Experiments on different corpora comprising music, text, and symbol documents show that the proposal adequately tackles the task, with accuracy rates of up to 95% in few-shot settings. Moreover, the results show that the presented strategy outperforms baseline supervised learning approaches trained with the same amount of data, which in some cases even fail to converge. This approach therefore stands as a lightweight alternative for symbol classification with little annotated data. (An illustrative sketch of this pipeline is given at the end of this record.)
Sponsor(s): This paper is part of the project I+D+i PID2020-118447RA-I00 (MultiScore), funded by MCIN/AEI/10.13039/501100011033. The first author is supported by grant FPU19/04957 from the Spanish Ministerio de Universidades. The second and third authors are respectively supported by grants ACIF/2021/356 and APOSTD/2020/256 from the "Programa I+D+i de la Generalitat Valenciana".
URI: http://hdl.handle.net/10045/131396
ISSN: 0167-8655 (Print) | 1872-7344 (Online)
DOI: 10.1016/j.patrec.2023.01.014
Language: eng
Type: info:eu-repo/semantics/article
Rights: © 2023 The Author(s). Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
Peer reviewed: yes
Publisher's version: https://doi.org/10.1016/j.patrec.2023.01.014
Appears in collections: INV - GRFIA - Artículos de Revistas

Files in this item:
File: Alfaro-Contreras_etal_2023_PatternRecognLett.pdf | Size: 2.3 MB | Format: Adobe PDF


This item is licensed under a Creative Commons License.
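
Illustrative pipeline sketch: the abstract describes training a neural feature extractor on unlabeled document images via self-supervised learning and then labeling new symbols by nearest-neighbor search against a few labeled reference embeddings. The following Python/PyTorch sketch is a minimal illustration of that idea under assumed choices; the encoder architecture, the NT-Xent contrastive objective, and all names (SmallEncoder, nt_xent, knn_classify) and hyperparameters are hypothetical stand-ins, not the authors' actual implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallEncoder(nn.Module):
    """Toy convolutional feature extractor for grayscale symbol crops (hypothetical architecture)."""
    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            nn.Flatten(), nn.Linear(64, dim),
        )

    def forward(self, x):
        # L2-normalized embeddings so that dot products equal cosine similarities.
        return F.normalize(self.net(x), dim=-1)

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent contrastive loss between embeddings of two augmented views of the same batch."""
    z = torch.cat([z1, z2], dim=0)                 # (2B, d)
    sim = z @ z.t() / tau                          # pairwise cosine similarities
    sim.fill_diagonal_(float("-inf"))              # exclude self-similarity
    b = z1.size(0)
    targets = torch.cat([torch.arange(b, 2 * b), torch.arange(0, b)])  # index of each positive pair
    return F.cross_entropy(sim, targets)

@torch.no_grad()
def knn_classify(encoder, query_images, ref_images, ref_labels, k=1):
    """Assign each query the majority label of its k nearest reference embeddings."""
    q = encoder(query_images)                      # (Q, d)
    r = encoder(ref_images)                        # (R, d)
    dists = torch.cdist(q, r)                      # Euclidean distances between embeddings
    idx = dists.topk(k, largest=False).indices     # (Q, k) nearest references
    return torch.mode(ref_labels[idx], dim=1).values

# Typical use: optimize the encoder with nt_xent on augmented pairs of unlabeled symbol
# crops, then call knn_classify with a few labeled reference images per class (few-shot).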