Compelling Advantages
- Smaller CNNs require less communication across servers during distributed training.
- Smaller CNNs require less bandwidth to export a new model from the cloud to an autonomous car.
- Smaller CNNs are more feasible to deploy on FPGAs and other hardware with limited memory.
- SqueezeNet achieves AlexNet-level accuracy on ImageNet with 50x fewer parameters.
Architectural design strategies
- Replace 3x3 filters with 1x1 filters
- Decrease the number of input channels to 3x3 filters
- Downsample late in the network so that convolution layers have large activation maps
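To make strategies 1 and 2 concrete, the parameter savings can be checked with a quick back-of-the-envelope count. The sketch below compares a plain 3x3 convolution against a Fire-module-style layer (a 1x1 squeeze followed by mixed 1x1/3x3 expand filters); the channel counts here are illustrative, not taken from the paper.

```python
def conv_params(k, c_in, c_out):
    """Weight count of a k x k convolution (biases ignored)."""
    return k * k * c_in * c_out

def fire_params(c_in, s1x1, e1x1, e3x3):
    """Fire-module-style layer: 1x1 squeeze, then 1x1 and 3x3 expand filters."""
    squeeze = conv_params(1, c_in, s1x1)
    expand = conv_params(1, s1x1, e1x1) + conv_params(3, s1x1, e3x3)
    return squeeze + expand

# Illustrative channel counts (hypothetical, not from the paper):
c_in, c_out = 128, 128
plain = conv_params(3, c_in, c_out)                  # 9 * 128 * 128 = 147456
fire = fire_params(c_in, s1x1=16, e1x1=64, e3x3=64)  # 2048 + 1024 + 9216 = 12288

print(plain, fire, plain / fire)  # the Fire-style layer is ~12x smaller
```

The squeeze layer cuts the number of input channels seen by the expensive 3x3 filters (strategy 2), and half of the expand filters are cheap 1x1s (strategy 1), which is where the bulk of the reduction comes from.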
Methods

Architecture

Other SqueezeNet details

Experiments


Others
- If early layers in the network have large strides, then most layers will have small activation maps.
- Applying delayed downsampling to four different CNN architectures led, in each case, to higher classification accuracy.
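The effect of downsampling early versus late can be illustrated by computing activation-map sizes layer by layer. The sketch below assumes "same" padding, so each layer's spatial size is simply the previous size divided by its stride (rounded up); the two 6-layer stride schedules are hypothetical examples, not the paper's architectures.

```python
import math

def activation_sizes(input_size, strides):
    """Spatial size after each layer, assuming 'same' padding
    (output = ceil(input / stride))."""
    sizes = []
    size = input_size
    for s in strides:
        size = math.ceil(size / s)
        sizes.append(size)
    return sizes

# Hypothetical stride schedules for a 224x224 input:
early = activation_sizes(224, [2, 2, 2, 1, 1, 1])  # downsample early
late  = activation_sizes(224, [1, 1, 1, 2, 2, 2])  # downsample late

print(early)  # [112, 56, 28, 28, 28, 28]
print(late)   # [224, 224, 224, 112, 56, 28]
```

Both schedules end at the same 28x28 resolution (same total downsampling), but the late schedule keeps large activation maps through most of the network, which is the intuition behind strategy 3.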