Research Article | Open Access

Analyzing Effect on Residual Learning by Gradual Narrowing Fully-Connected Layer Width and Implementing Inception Block in Convolution Layer

Saurabh Sharma¹
  • ¹ Department of Computer Science, Sir Padampat Singhania School of Engineering, India

Abstract

Research on advancing CNN architectures for computer vision problems has focused on strategically choosing and modifying convolution hyperparameters (kernel size, pooling, etc.). However, these works do not exploit the benefit of employing multiple fully-connected layers after the core schema to obtain further performance improvements; this is identified as the first research gap. Studies have also addressed the challenge of vanishing gradients in deep networks through residual learning via skip connections, and have lowered the computational cost of model training by using parallel rather than sequential convolution operations through inception blocks. These studies do not discuss in detail the impact of sparse features on feature learning; this is identified as the second research gap. Diagnosis of infectious patterns in chest X-rays using residual learning is chosen as the problem statement for this study. Results show that the ResNet50 architecture improves accuracy by 0.6218% and reduces the error rate by 2.6326% when gradually narrowing fully-connected layers are employed between the core residual-learning schema and the output layer. Independently, implementing inception blocks (GoogLeNet v2) before the skip connections in the ResNet50 architecture boosts accuracy by 0.961% and lowers the error rate by 4.2438%. These performance improvements were achieved without regularization and thus encourage future work in this direction.
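To make the first modification concrete, below is a minimal sketch (not the author's released code) of attaching gradually narrowing fully-connected layers between a ResNet50 backbone and the output layer, written in TensorFlow/Keras. The layer widths (1024 → 512 → 256), the 224×224 input size, and the two-class chest X-ray output are illustrative assumptions, and no regularization is applied, mirroring the setting described in the abstract.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def narrowing_head_resnet50(input_shape=(224, 224, 3), num_classes=2):
    # Core residual-learning schema: ResNet50 without its default classifier.
    backbone = tf.keras.applications.ResNet50(
        include_top=False, weights=None, input_shape=input_shape)

    x = layers.GlobalAveragePooling2D()(backbone.output)

    # Gradually narrowing fully-connected layers between the backbone and
    # the output layer; widths are assumptions for illustration only.
    for width in (1024, 512, 256):
        x = layers.Dense(width, activation="relu")(x)

    outputs = layers.Dense(num_classes, activation="softmax")(x)
    return models.Model(backbone.input, outputs)

model = narrowing_head_resnet50()
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

The second modification (inserting inception blocks before the skip connections) would require rebuilding the residual blocks themselves and is not shown here.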

Journal of Computer Science
Volume 18 No. 5, 2022, 339-349

DOI: https://doi.org/10.3844/jcssp.2022.339.349

Submitted On: 12 December 2021
Published On: 24 May 2022

How to Cite: Sharma, S. (2022). Analyzing Effect on Residual Learning by Gradual Narrowing Fully-Connected Layer Width and Implementing Inception Block in Convolution Layer. Journal of Computer Science, 18(5), 339-349. https://doi.org/10.3844/jcssp.2022.339.349


Keywords

  • Fully-Connected Layer
  • Neuron Layer Width
  • ResNet50
  • Residual Network
  • Skip-Connections
  • Inception Blocks