Rethinking Bottleneck Structure for Efficient Mobile Network Design
Daquan Zhou, Qibin Hou, Yunpeng Chen, Jiashi Feng, Shuicheng Yan
Abstract
The inverted residual block has recently dominated the architecture design of mobile networks. It modifies the classic residual bottleneck by introducing two design rules: learning inverted residuals and using linear bottlenecks. In this paper, we rethink the necessity of this change and find that it may bring risks of information loss and gradient confusion. We thus propose to flip the structure and present a novel bottleneck design, called the sandglass block, which performs identity mapping and spatial transformation at higher dimensions and thus effectively alleviates information loss and gradient confusion. Extensive experiments demonstrate that, contrary to common belief, such a bottleneck structure is more beneficial for mobile networks than the inverted one. On ImageNet classification, simply replacing the inverted residual block with the sandglass block, without increasing parameters or computation, improves the classification accuracy by more than 1.7% over MobileNetV2. On the Pascal VOC 2007 test set, we also observe a 0.9% mAP improvement in object detection. We further verify the effectiveness of the sandglass block by adding it to the search space of the neural architecture search method DARTS. With a 25% parameter reduction, the classification accuracy is improved by 0.13% over previous DARTS models. Code will be made publicly available.
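To make the "without increasing parameters" claim concrete, the back-of-envelope sketch below compares the weight counts of the two block types. This is not code from the paper: the layer decompositions follow the standard inverted residual (pointwise expand, 3x3 depthwise, pointwise reduce) and a sandglass-style block (depthwise at the wide dimension, pointwise reduce, pointwise expand, depthwise again), while the example widths (a 96-channel bottleneck, expansion ratio 6, hence a 576-channel wide path) are illustrative assumptions, and batch norm and bias terms are ignored.

```python
def dw_params(channels, k=3):
    # 3x3 depthwise conv: one k*k filter per channel (bias/BN ignored)
    return k * k * channels

def pw_params(c_in, c_out):
    # 1x1 pointwise conv: a c_in -> c_out linear map per pixel
    return c_in * c_out

def inverted_residual_params(bottleneck, t):
    # MobileNetV2-style block: expand -> depthwise -> reduce;
    # the residual connects the narrow (bottleneck) ends.
    wide = bottleneck * t
    return (pw_params(bottleneck, wide)   # 1x1 expand
            + dw_params(wide)             # 3x3 depthwise at wide dim
            + pw_params(wide, bottleneck))  # 1x1 linear reduce

def sandglass_params(wide, t):
    # Sandglass-style block: depthwise -> reduce -> expand -> depthwise;
    # identity mapping and spatial (depthwise) convs stay at the wide dim.
    bottleneck = wide // t
    return (dw_params(wide)               # 3x3 depthwise at wide dim
            + pw_params(wide, bottleneck)  # 1x1 reduce
            + pw_params(bottleneck, wide)  # 1x1 expand
            + dw_params(wide))            # 3x3 depthwise at wide dim

# Same 96/576 channel pair in both blocks (hypothetical stage widths):
inv = inverted_residual_params(96, 6)   # 115776 weights
sg = sandglass_params(576, 6)           # 120960 weights
print(f"inverted residual: {inv}, sandglass: {sg}")
```

The point of the arithmetic: depthwise convolutions are so cheap (9 weights per channel) that moving both of them to the 576-dim path costs little, so the two blocks land at comparable parameter counts while the sandglass variant keeps the shortcut and spatial transforms at the higher dimension.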