Table 3 Performance of pruning techniques from the literature in terms of rank-1 accuracy and computational complexity (memory (M): number of parameters; time (T): time required for one forward pass). To ease comparison, we include our results produced with ThiNet (a channel pruning method)

From: Exploiting prunability for person re-identification

Dataset: ResNet56 trained on CIFAR10

                              Original               Pruned
Algorithm                     rank-1   T      M      rank-1   T      M
L1 [33]                       93.04    0.125  0.85   93.06    0.091  0.73
Auto-Balanced [70]            93.93    0.142  N/D    92.94    0.055  N/D
Redundant channel [67]        93.39    0.125  0.85   93.12    0.091  0.65
Play and Prune [69]           93.39    0.125  0.85   93.09    0.039  N/D
FPGM [68]                     93.39    0.125  0.85   92.73    0.059  N/D
Dataset: VGG16 trained on ImageNet

                              Original               Pruned
Algorithm                     rank-1   T      M      rank-1   T      M
ThiNet [71]                   90.01    30.94  138.34 89.41    9.58   131.44
Taylor [32]                   89.30    30.96  N/D    87.06    11.5   N/D
HaoLi [33]                    90.01    30.94  138.34 89.13    9.58   130.87
Channel Pruning [35]          90.01    30.94  138.34 88.10    7.03   131.44
Dataset: ResNet50 trained on ImageNet

                              Original               Pruned
Algorithm                     rank-1   T      M      rank-1   T      M
Entropy [34]                  72.88    3.86   25.56  70.84    2.52   17.38
ThiNet [71]                   75.30    7.72   25.56  72.03    3.41   138.00
FPGM [68]                     75.30    7.72   25.56  74.83    3.58   N/D
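To compare rows in the tables above, the T and M columns can be turned into speedup and compression ratios (original divided by pruned). A minimal sketch; the helper names are ours, not from the paper, and the example numbers are the ThiNet VGG16 row:

```python
def speedup(t_orig: float, t_pruned: float) -> float:
    """Forward-pass speedup achieved by pruning (higher is better)."""
    return t_orig / t_pruned

def compression(m_orig: float, m_pruned: float) -> float:
    """Parameter-count compression ratio achieved by pruning (higher is better)."""
    return m_orig / m_pruned

# Example: ThiNet [71] on VGG16/ImageNet, values taken from the table above
print(round(speedup(30.94, 9.58), 2))       # → 3.23 (forward-pass speedup)
print(round(compression(138.34, 131.44), 2))  # → 1.05 (parameter compression)
```

This makes the trade-off explicit: ThiNet roughly triples VGG16's forward-pass speed while shrinking its parameter count only slightly, at a cost of 0.6 points of rank-1 accuracy.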