A summary of the modified LeNet-5 architecture used in this work as the base architecture. IC refers to the independent component layer (Chen et al. 2019) and BN denotes batch normalization.
| Layer No. | Layer type | Input channels | Output channels | Kernel size | Stride | Activation | Regularization |
|---|---|---|---|---|---|---|---|
| 1 | Convolutional | 1 | 6 | 5 | 1 | ReLU | IC/BN |
| 2 | Max pooling | 6 | 6 | 2 | 2 | | |
| 3 | Convolutional | 6 | 16 | 5 | 1 | ReLU | IC/BN |
| 4 | Max pooling | 16 | 16 | 2 | 2 | | |
| 5 | Convolutional | 16 | 120 | 5 | 1 | ReLU | IC/BN |
| 5′ | Squeeze layer 5 outputs | | | | | | |
| 6 | Fully connected | 120 × down-sampled neuron number | 120 | | | ReLU | Dropout |
| 7 | Fully connected | 120 | 84 | | | ReLU | Dropout |
| 8 | Fully connected | 84 | 2 | | | Softmax | |
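For readers who want to reproduce the layer configuration above, a minimal PyTorch sketch is given below. It is an illustration of the table, not the authors' released code: the input size, the dropout rates, and the ordering of ReLU and the IC layer (batch normalization followed by dropout, per Chen et al. 2019) are assumptions, and the flattened feature size after layer 5′ is inferred lazily because it depends on the down-sampled input.

```python
import torch
import torch.nn as nn


class ICLayer(nn.Module):
    """Independent-component (IC) layer: batch normalization followed by
    dropout (Chen et al. 2019). The dropout rate is an assumed value."""

    def __init__(self, channels, p=0.05):
        super().__init__()
        self.bn = nn.BatchNorm2d(channels)
        self.drop = nn.Dropout2d(p)

    def forward(self, x):
        return self.drop(self.bn(x))


class ModifiedLeNet5(nn.Module):
    """Sketch of the modified LeNet-5 summarized in the table."""

    def __init__(self, num_classes=2, dropout=0.5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5, stride=1), nn.ReLU(), ICLayer(6),      # layer 1
            nn.MaxPool2d(kernel_size=2, stride=2),                                # layer 2
            nn.Conv2d(6, 16, kernel_size=5, stride=1), nn.ReLU(), ICLayer(16),    # layer 3
            nn.MaxPool2d(kernel_size=2, stride=2),                                # layer 4
            nn.Conv2d(16, 120, kernel_size=5, stride=1), nn.ReLU(), ICLayer(120), # layer 5
        )
        self.classifier = nn.Sequential(
            nn.LazyLinear(120), nn.ReLU(), nn.Dropout(dropout),   # layer 6: in-features inferred from input size
            nn.Linear(120, 84), nn.ReLU(), nn.Dropout(dropout),   # layer 7
            nn.Linear(84, num_classes),                           # layer 8: softmax applied outside the model
        )

    def forward(self, x):
        x = self.features(x)
        x = torch.flatten(x, 1)   # layer 5': squeeze/flatten the layer-5 feature maps
        return self.classifier(x)


if __name__ == "__main__":
    model = ModifiedLeNet5()
    logits = model(torch.randn(4, 1, 64, 64))  # assumed example input size
    probs = torch.softmax(logits, dim=1)       # layer-8 softmax over the 2 classes
    print(probs.shape)                         # torch.Size([4, 2])
```

In this sketch the final softmax is applied at inference time rather than inside the network, which is the usual PyTorch convention when training with a cross-entropy loss.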