Reading List: Papers on Pruning
1990
Yann LeCun, et al. Optimal brain damage. NIPS. 1990.
1993
Babak Hassibi, et al. Second order derivatives for network pruning: Optimal brain surgeon. NIPS. 1993.
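These two classics rank weights by a second-order estimate of how much the loss grows when a weight is removed. For quick reference, the two saliency measures (notation follows the papers; H is the Hessian of the loss with respect to the weights):

```latex
% OBD (LeCun et al., 1990): under a diagonal Hessian approximation,
% the saliency of weight w_i is
s_i = \tfrac{1}{2} H_{ii} \, w_i^2

% OBS (Hassibi et al., 1993): with the full inverse Hessian, deleting
% weight w_q costs
s_q = \frac{w_q^2}{2 \, [H^{-1}]_{qq}}
% and the surviving weights are adjusted by
\delta \mathbf{w} = -\frac{w_q}{[H^{-1}]_{qq}} \, H^{-1} \mathbf{e}_q
```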
2015
Song Han, et al. Learning both weights and connections for efficient neural networks. NIPS. 2015.
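Han et al. learn which connections matter by zeroing low-magnitude weights and retraining the survivors, iterating until the target sparsity is reached. A minimal PyTorch sketch of that prune-and-retrain idea (the helper names, the 0.9 sparsity, and the per-tensor thresholding are illustrative assumptions, not the paper's exact per-layer settings):

```python
import torch
import torch.nn as nn

def magnitude_prune(model: nn.Module, sparsity: float = 0.9) -> dict:
    """Zero the smallest-magnitude weights in each weight tensor and
    return binary masks. `sparsity` is the fraction of weights removed
    per tensor (0.9 is an illustrative value)."""
    masks = {}
    for name, param in model.named_parameters():
        if param.dim() < 2:  # skip biases and norm parameters
            continue
        k = max(1, int(sparsity * param.numel()))
        # magnitude of the k-th smallest weight becomes the threshold
        threshold = param.abs().flatten().kthvalue(k).values
        mask = (param.abs() > threshold).float()
        param.data.mul_(mask)  # prune in place
        masks[name] = mask
    return masks

def apply_masks(model: nn.Module, masks: dict) -> None:
    """Re-apply masks so pruned weights stay at zero during retraining."""
    with torch.no_grad():
        for name, param in model.named_parameters():
            if name in masks:
                param.mul_(masks[name])
```

During retraining, call apply_masks after every optimizer step; otherwise gradient updates revive the pruned connections.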
2016
Hao Zhou, et al. Less is more: Towards compact CNNs. ECCV. 2016.
Song Han, et al. Deep compression: Compressing deep neural networks with pruning, trained quantization and Huffman coding. ICLR. 2016.
2017
Tien-Ju Yang, et al. Designing energy-efficient convolutional neural networks using energy-aware pruning. CVPR. 2017.
Jian-Hao Luo, et al. ThiNet: A filter level pruning method for deep neural network compression. ICCV. 2017.
Zhuang Liu, et al. Learning efficient convolutional networks through network slimming. ICCV. 2017.
Yihui He, et al. Channel pruning for accelerating very deep neural networks. ICCV. 2017.
Hao Li, et al. Pruning filters for efficient ConvNets. ICLR. 2017. (see the sketch after this year's entries)
Pavlo Molchanov, et al. Pruning convolutional neural networks for resource efficient inference. ICLR. 2017.
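Several of this year's entries prune whole filters or channels rather than individual weights, so the pruned network is genuinely smaller and faster without sparse kernels. Li et al., for instance, score each conv filter by the L1 norm of its weights and drop the lowest-scoring ones. A hedged PyTorch sketch of that ranking step (the keep ratio and the example layer sizes are assumptions for illustration):

```python
import torch
import torch.nn as nn

def l1_filter_ranking(conv: nn.Conv2d, keep_ratio: float = 0.7) -> torch.Tensor:
    """Return indices of the output filters to keep, ranked by L1 norm,
    following the scoring rule of Li et al. (ICLR 2017); `keep_ratio`
    is an illustrative choice."""
    # conv.weight has shape (out_channels, in_channels, kH, kW);
    # sum |w| over everything except the output-channel axis
    scores = conv.weight.detach().abs().sum(dim=(1, 2, 3))
    n_keep = max(1, int(keep_ratio * conv.out_channels))
    keep = torch.argsort(scores, descending=True)[:n_keep]
    return torch.sort(keep).values  # preserve original channel order

# Rebuild a physically smaller layer from the kept filters.
conv = nn.Conv2d(64, 128, kernel_size=3)
kept = l1_filter_ranking(conv)
pruned = nn.Conv2d(conv.in_channels, len(kept), kernel_size=3)
pruned.weight.data = conv.weight.data[kept].clone()
pruned.bias.data = conv.bias.data[kept].clone()
```

What this sketch omits is where the papers spend most of their effort: fixing up the input channels of the following layer and choosing per-layer pruning ratios.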
2019
Zhuang Liu, et al. Rethinking the value of network pruning. ICLR. 2019.
2020
Chaoqi Wang, et al. Picking winning tickets before training by preserving gradient flow. ICLR. 2020.
Jonathan Frankle, et al. The early phase of neural network training. ICLR. 2020.
Haoran You, et al. Drawing early-bird tickets: Toward more efficient training of deep networks. ICLR. 2020.
Alex Renda, et al. Comparing rewinding and fine-tuning in neural network pruning. ICLR. 2020. (see the sketch after this year's entries)
Jonathan Frankle, et al. Linear mode connectivity and the lottery ticket hypothesis. ICML. 2020.
Pedro Savarese, et al. Winning the lottery with continuous sparsification. NeurIPS. 2020.
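Much of the 2019–2020 work above revolves around iterative magnitude pruning (IMP) with weight rewinding: train, prune, rewind the surviving weights to an early-training snapshot, and repeat. A compact sketch of that loop under stated assumptions (`train_fn` and `prune_fn` are caller-supplied placeholders, e.g. a standard training loop and the magnitude_prune sketch above; the round count and per-round fraction are illustrative):

```python
import copy
from typing import Callable, Dict

import torch
import torch.nn as nn

def imp_with_rewinding(model: nn.Module,
                       train_fn: Callable[[nn.Module, Dict], None],
                       prune_fn: Callable[[nn.Module, float], Dict],
                       rounds: int = 5,
                       prune_frac: float = 0.2):
    """Iterative magnitude pruning with weight rewinding.

    `train_fn(model, masks)` trains to completion while keeping masked
    weights at zero; `prune_fn(model, total_frac)` prunes the model to
    the given overall sparsity and returns the masks. The snapshot below
    is taken at entry, so callers should first train briefly to reach
    the "early phase" rewind point studied by Frankle et al. (2020).
    """
    rewind_state = copy.deepcopy(model.state_dict())

    masks: Dict = {}
    remaining = 1.0
    for _ in range(rounds):
        train_fn(model, masks)                    # train to completion
        remaining *= 1.0 - prune_frac             # e.g. 20% more weights go
        masks = prune_fn(model, 1.0 - remaining)  # rank by trained magnitudes
        model.load_state_dict(rewind_state)       # rewind ALL weights ...
        with torch.no_grad():                     # ... then re-zero pruned ones
            for name, param in model.named_parameters():
                if name in masks:
                    param.mul_(masks[name])
    return model, masks
```

Renda et al. (2020) compare this rewinding step against fine-tuning from the final weights and report that rewinding gives the stronger retraining schedule.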