Adding Attentiveness to the Neurons in Recurrent Neural Networks
Pengfei Zhang, Jianru Xue, Cuiling Lan, Wenjun Zeng, Zhanning Gao, Nanning Zheng; The European Conference on Computer Vision (ECCV), 2018, pp. 135-151
Abstract
Recurrent neural networks (RNNs) are capable of modeling the temporal dynamics of complex sequential information. However, the structures of existing RNN neurons mainly focus on controlling the contributions of current and historical information but do not explore the different importance levels of different elements in an input vector of a time slot. We propose adding a simple yet effective Element-wise-Attention Gate (EleAttG) to an RNN block (e.g., all RNN neurons in a network layer) to empower the RNN neurons with the attentiveness capability. For an RNN block, an EleAttG is added to adaptively modulate the input by assigning different levels of importance, i.e., attention, to each element/dimension of the input. We refer to an RNN block equipped with an EleAttG as an EleAtt-RNN block. Specifically, the modulation of the input is content adaptive and is performed at fine granularity, being element-wise rather than input-wise. The proposed EleAttG, as an additional fundamental unit, is general and can be applied to any RNN structure, e.g., the standard RNN, Long Short-Term Memory (LSTM), or Gated Recurrent Unit (GRU). We demonstrate the effectiveness of the proposed EleAtt-RNN by applying it to action recognition tasks on both 3D human skeleton data and RGB videos. Experiments show that adding attentiveness through EleAttGs to RNN blocks significantly boosts the power of RNNs.
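As a minimal sketch of the mechanism described in the abstract (not the authors' released implementation), the EleAttG can be read as a sigmoid gate that produces one attention weight per input element and rescales the input before the usual recurrent update. The PyTorch module below wraps a standard GRU cell; the gate's parameterization (a linear map of the current input and previous hidden state) and all names are assumptions for illustration.

import torch
import torch.nn as nn

class EleAttGRUCell(nn.Module):
    # A GRU cell preceded by an Element-wise-Attention Gate (EleAttG).
    # The gate outputs one attention weight in (0, 1) per input
    # element/dimension and modulates the input before the GRU update.
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.att = nn.Linear(input_size + hidden_size, input_size)  # assumed gate parameterization
        self.gru = nn.GRUCell(input_size, hidden_size)

    def forward(self, x_t, h_prev):
        # a_t: element-wise importance of x_t, conditioned on x_t and h_{t-1}.
        a_t = torch.sigmoid(self.att(torch.cat([x_t, h_prev], dim=-1)))
        # Attended input, then the standard GRU recurrence.
        return self.gru(a_t * x_t, h_prev)

# Example: a skeleton sequence of 20 frames, batch of 8,
# 75 features per frame (e.g., 25 joints x 3D coordinates).
cell = EleAttGRUCell(input_size=75, hidden_size=128)
h = torch.zeros(8, 128)
for x_t in torch.randn(20, 8, 75):
    h = cell(x_t, h)

Because the gate only rescales the input and leaves the recurrent structure untouched, the same wrapper could equally precede a standard RNN or LSTM cell, consistent with the generality claimed in the abstract.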
Related Material
[pdf] [bibtex]
@InProceedings{Zhang_2018_ECCV,
author = {Zhang, Pengfei and Xue, Jianru and Lan, Cuiling and Zeng, Wenjun and Gao, Zhanning and Zheng, Nanning},
title = {Adding Attentiveness to the Neurons in Recurrent Neural Networks},
booktitle = {The European Conference on Computer Vision (ECCV)},
month = {September},
year = {2018}
}