On the Effect of Cross-Channel Normalization on Wide-Shallow Convolutional Networks

Research output: Contribution to journal › Conference article › peer-review

Abstract

Cross-channel Normalization (CN) was first proposed in the AlexNet paper as a biologically inspired normalization process mimicking the lateral inhibition phenomenon in biological neurons. However, its effect was not well explored in the literature, largely because of the wide popularity of batch normalization. In this paper, we show that this type of normalization can significantly improve network accuracy when applied to wide-shallow convolutional networks. Wide-shallow networks are advantageous compared with deep CNNs because they substantially increase parallelism and reduce sequential computation across layers. Our experiments show that cross-channel normalization not only improves the accuracy of residual and standard convolutional networks on benchmark datasets but also significantly enhances performance when training on limited and reduced-representation examples.
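The cross-channel (local response) normalization discussed in the abstract, as introduced in the AlexNet paper, divides each activation by a term accumulated over a window of neighbouring channels at the same spatial position. A minimal NumPy sketch of that scheme follows; the parameter names and default values (n=5, k=2, alpha=1e-4, beta=0.75) are the ones reported for AlexNet, not values taken from this paper:

```python
import numpy as np

def cross_channel_norm(a, n=5, k=2.0, alpha=1e-4, beta=0.75):
    """Cross-channel (local response) normalization, AlexNet style.

    a: activations of shape (channels, height, width).
    Each activation a[i, x, y] is divided by
    (k + alpha * sum of squared activations over the n nearest
    channels at position (x, y)) ** beta.
    """
    C = a.shape[0]
    out = np.empty_like(a, dtype=float)
    for i in range(C):
        lo = max(0, i - n // 2)          # clamp window at the first channel
        hi = min(C, i + n // 2 + 1)      # clamp window at the last channel
        denom = (k + alpha * np.sum(a[lo:hi] ** 2, axis=0)) ** beta
        out[i] = a[i] / denom
    return out
```

With alpha=0 and k=1 the denominator reduces to 1 and the input passes through unchanged, which is a convenient sanity check; with the AlexNet defaults every activation is damped by its channel neighbourhood, the lateral-inhibition effect the abstract refers to.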

Original language: English
Pages (from-to): 154-161
Number of pages: 8
Journal: IET Conference Proceedings
Volume: 2024
Issue number: 10
DOIs
Publication status: Published - 2024
Event: 26th Irish Machine Vision and Image Processing Conference, IMVIP 2024 - Limerick, Ireland
Duration: 21 Aug 2024 – 23 Aug 2024

Keywords

  • Computer Vision
  • Deep Learning
  • Local Response Normalization
  • Wide-Shallow CNNs
