IGNORE: Information Gap-based False Negative Loss Rejection for Single Positive Multi-Label Learning

Gyeong Ryeol Song, Noo-ri Kim, Jin-Seop Lee, Jee-Hyong Lee*

Abstract


"Single Positive Multi-Label Learning (SPML) is a method for a scarcely annotated setting, in which each image is assigned only one positive label while the other labels remain unannotated. Most approaches for SPML assume unannotated labels as negatives (“Assumed Negative”, AN). However, with this assumption, some positive labels are inevitably regarded as negative (false negative), resulting in model performance degradation. Therefore, identifying false negatives is the most important with AN assumption. Previous approaches identified false negative labels using the model outputs of assumed negative labels. However, models were trained with noisy negative labels, their outputs were not reliable. Therefore, it is necessary to consider effectively utilizing the most reliable information in SPML for identifying false negative labels. In this paper, we propose the Information Gap-based False Negative LOss REjection (IGNORE) method for SPML. We generate the masked image that all parts are removed except for the discriminative area of the single positive label. It is reasonable that when there is no information of an object in the masked image, the model’s logit for that object is low. Based on this intuition, we identify the false negative labels if they have a significant model’s logit gap between the masked image and the original image. Also, by rejecting false negatives in the model training, we can prevent the model from being biased to false negative labels, and build more reliable models. We evaluate our method on four datasets: Pascal VOC 2012, MS COCO, NUSWIDE, and CUB. Compared to previous state-of-the-art methods in SPML, our method outperforms them on most of the datasets."
