Fusion of standard and ordinal dropout techniques to regularise deep models
- Research areas:
- Uncategorised
- Year:
- 2024
- Publication type:
- Article
- Keywords:
- Deep Learning, Dropout, Ordinal Classification, Ordinal Regression, Convolutional Neural Networks
- Authors:
- Journal:
- Information Fusion
- Volume:
- 106
- Month:
- February
- BibTex:
- Note:
- JCR 2023: 14.7, Position: 2/143 (Q1), Category: COMPUTER SCIENCE, THEORY & METHODS
- Abstract:
- Dropout is a popular regularisation tool for deep neural classifiers, but it is applied regardless of the nature of the classification task: nominal or ordinal. Consequently, the order relation between the class labels of ordinal problems is ignored. In this paper, we propose the fusion of standard dropout and a new dropout methodology for ordinal classification that regularises deep neural networks to avoid overfitting and improve generalisation, while taking into account the extra information of the ordinal task, which is exploited to improve performance. The correlation between the output of every neuron and the target labels is used to guide the dropout process: the more strongly a neuron is correlated with the expected labels, the lower its probability of being dropped. Given that randomness also plays a crucial role in the regularisation process, a balancing factor is added to the training process to determine the influence of the ordinality with respect to a constant probability, providing a hybrid ordinal regularisation method. An extensive battery of experiments shows that the new hybrid ordinal dropout methodology performs better than standard dropout, obtaining improved results for most evaluation metrics, including not only ordinal metrics but also nominal ones.
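To make the mechanism described in the abstract concrete, here is a minimal sketch of a correlation-guided dropout layer blended with standard dropout. It is based only on the abstract above: the class name `OrdinalDropout`, the parameters `p_base` and `balance`, and the use of Pearson correlation with inverted-dropout scaling are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch of hybrid ordinal dropout: drop probabilities depend on how strongly
# each neuron correlates with the (ordinal) target labels, blended with a
# constant drop rate via a balancing factor. Assumed names, not the paper's code.
import torch
import torch.nn as nn


class OrdinalDropout(nn.Module):
    def __init__(self, p_base: float = 0.5, balance: float = 0.5):
        super().__init__()
        self.p_base = p_base    # constant drop probability (standard dropout term)
        self.balance = balance  # assumed balancing factor between ordinal and constant terms

    def forward(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # x: (batch, features) activations; y: (batch,) ordinal target labels
        if not self.training:
            return x
        # Pearson correlation between each neuron's activations and the labels
        xc = x - x.mean(dim=0, keepdim=True)
        yc = (y.float() - y.float().mean()).unsqueeze(1)
        corr = (xc * yc).sum(dim=0) / (xc.norm(dim=0) * yc.norm() + 1e-8)
        # Highly correlated neurons get a lower drop probability
        p_ord = self.p_base * (1.0 - corr.abs())
        # Blend the ordinal term with the constant rate via the balancing factor
        p_drop = self.balance * p_ord + (1.0 - self.balance) * self.p_base
        mask = torch.bernoulli(1.0 - p_drop).unsqueeze(0)
        # Inverted-dropout scaling keeps the expected activation unchanged
        return x * mask / (1.0 - p_drop + 1e-8)
```

Under these assumptions, `balance` plays the role of the balancing factor mentioned in the abstract: `balance = 0` recovers standard dropout with a constant rate, while `balance = 1` makes each neuron's drop probability depend entirely on its correlation with the labels.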