Guided Saliency Feature Learning for Person Re-identification in Crowded Scenes
Lingxiao He, Wu Liu
Abstract
Person Re-identification (Re-ID) in crowded scenes is a challenging problem, where people are frequently partially occluded by objects and by other people. However, few studies have provided flexible solutions for re-identifying people from images in which the body is partially occluded. In this paper, we propose a simple occlusion-aware approach to address the problem. The proposed method first leverages a fully convolutional network to generate spatial features. We then design a combination of pose-guided and mask-guided layers to generate a saliency heatmap that further guides discriminative feature learning. More importantly, we propose a new matching approach, called Guided Adaptive Spatial Matching (GASM), in which each spatial feature in the query adaptively matches the most similar spatial feature of a person in the gallery. In particular, we use the saliency heatmap to guide the adaptive spatial matching by adaptively assigning larger weights to foreground human parts. The effectiveness of the proposed GASM is demonstrated on two occluded person datasets, Crowd REID (51.52%) and Occluded REID (80.25%), and three benchmark person datasets: Market1501 (95.31%), DukeMTMC-reID (88.12%) and MSMT17 (79.52%). Additionally, GASM achieves good performance on cross-domain person Re-ID. The code and models are available at https://github.com/JDAI-CV/fast-reid/blob/master/projects/CrowdReID.
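To make the matching step concrete, the following is a minimal sketch of saliency-guided adaptive spatial matching as described in the abstract: each query spatial feature is matched to its most similar gallery spatial feature, and the per-location scores are aggregated with saliency weights. The function name, tensor shapes, and the max-then-weighted-sum aggregation are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of saliency-guided adaptive spatial matching (assumptions:
# L2-normalized spatial features, one saliency weight per query location).
import torch
import torch.nn.functional as F

def guided_adaptive_spatial_matching(query_feat, gallery_feat, query_saliency):
    """Similarity score between one query image and one gallery image.

    query_feat:     (Nq, C) spatial features of the query (Nq = H*W locations)
    gallery_feat:   (Ng, C) spatial features of the gallery image
    query_saliency: (Nq,)   saliency weights for the query locations
    """
    # Cosine similarity between every query and gallery spatial location.
    q = F.normalize(query_feat, dim=1)
    g = F.normalize(gallery_feat, dim=1)
    sim = q @ g.t()                              # (Nq, Ng)

    # Adaptive matching: each query location picks its best gallery location.
    best_sim, _ = sim.max(dim=1)                 # (Nq,)

    # Saliency guidance: foreground (human-part) locations get larger weights.
    w = query_saliency / (query_saliency.sum() + 1e-12)
    return (w * best_sim).sum()

if __name__ == "__main__":
    # Random tensors stand in for backbone features and the saliency heatmap.
    Nq, Ng, C = 24 * 8, 24 * 8, 256
    score = guided_adaptive_spatial_matching(
        torch.randn(Nq, C), torch.randn(Ng, C), torch.rand(Nq))
    print(float(score))
```

In this sketch, occluded or background locations contribute little to the final score because their saliency weights are small, which is the intuition behind letting the heatmap guide the matching.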