NAM: Normalization-based Attention Module

Published in ImageNet PPF @ NeurIPS 2021

Recommended citation: Liu, Y., Shao, Z., Teng, Y., & Hoffmann, N. (2021). NAM: Normalization-based Attention Module. ImageNet PPF @ NeurIPS 2021. https://arxiv.org/abs/2111.12419

We propose a novel normalization-based attention module (NAM), which suppresses less salient weights. It applies a weight sparsity penalty to the attention modules, making them more computationally efficient while retaining similar performance.
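As a rough illustration of the channel-attention idea, here is a minimal PyTorch sketch: the batch-norm scale factors (gamma) serve as a measure of channel salience and are normalized into attention weights, so no extra fully connected layers are needed. Class and variable names are illustrative; see the paper for the exact formulation.

```python
import torch
import torch.nn as nn

class NAMChannelAttention(nn.Module):
    """Sketch of normalization-based channel attention (illustrative,
    not the official implementation)."""

    def __init__(self, channels: int):
        super().__init__()
        # Affine BN provides the per-channel scale factors gamma.
        self.bn = nn.BatchNorm2d(channels, affine=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        residual = x
        x = self.bn(x)
        # Normalize |gamma| into per-channel salience weights.
        gamma = self.bn.weight.abs()
        w = gamma / gamma.sum()
        # Re-weight channels, then gate the input with a sigmoid.
        x = x * w.view(1, -1, 1, 1)
        return torch.sigmoid(x) * residual

# Example usage on a batch of 8 feature maps with 64 channels:
att = NAMChannelAttention(64)
y = att(torch.randn(8, 64, 32, 32))
```

Because channels with small gamma contribute little after normalization, a sparsity penalty on these weights naturally suppresses the less salient ones.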

Download paper here