Group Fisher Pruning for Practical Network Compression (ICML 2021)

Overview

Group Fisher Pruning for Practical Network Compression (ICML 2021)

By Liyang Liu*, Shilong Zhang*, Zhanghui Kuang, Jing-Hao Xue, Aojun Zhou, Xinjiang Wang, Yimin Chen, Wenming Yang, Qingmin Liao, Wayne Zhang

Updates

  • All one-stage detection models have been released (21/6/2021)

NOTES

All detection models have been released. The classification models will be released later, because we want to refactor all our code into a Hook so that it becomes a more general tool for all tasks in OpenMMLab.
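
For reference, a custom hook in OpenMMLab registers itself with mmcv's HOOKS registry and overrides runner callbacks. The sketch below is only a minimal illustration of that mechanism, assuming mmcv's Hook/HOOKS API; the class name and pruning logic are placeholders, not the released implementation.

from mmcv.runner import HOOKS, Hook

@HOOKS.register_module()
class GroupFisherPruningHook(Hook):  # hypothetical name, for illustration only
    """Sketch of a hook that could run a pruning step during training."""

    def __init__(self, interval=1):
        self.interval = interval

    def after_train_iter(self, runner):
        # Accumulate Fisher information / prune channels every `interval` iterations.
        if self.every_n_iters(runner, self.interval):
            pass  # pruning logic would live here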

We will continue to improve this method and apply it to other tasks, such as segmentation and pose estimation.

The layer grouping algorithm is implemented on top of PyTorch's autograd. If you are not familiar with this feature and you can read Chinese, these materials may be helpful to you (a minimal traversal sketch follows the list below).

  1. AutoGrad in Pytorch

  2. Hook of MMCV
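
As a minimal illustration of the mechanism the layer grouping relies on (not the repository's actual grouping code), the snippet below builds a tiny model and walks the grad_fn graph that PyTorch's autograd records for its output:

import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1),
    nn.ReLU(),
    nn.Conv2d(8, 4, 3, padding=1),
)
out = net(torch.randn(1, 3, 16, 16)).sum()

def walk(fn, depth=0):
    # Recursively print the backward graph nodes recorded by autograd.
    if fn is None:
        return
    print("  " * depth + type(fn).__name__)
    for next_fn, _ in fn.next_functions:
        walk(next_fn, depth + 1)

walk(out.grad_fn)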

Introduction

1. Compares favorably with state-of-the-art methods.

2. Can be applied to various complicated structures and various tasks.

3. Boosts inference speed on GPU under the same FLOPs.

Get Started

1. Create a basic environment with PyTorch 1.3.0 and mmcv-full

Due to frequent changes in the autograd interface, we only guarantee that the code works with pytorch==1.3.0.

  1. Create the environment
conda create -n open-mmlab python=3.7 -y
conda activate open-mmlab
  2. Install PyTorch 1.3.0 and the corresponding torchvision.
conda install pytorch=1.3.0 cudatoolkit=10.0 torchvision=0.2.2 -c pytorch
  3. Build mmcv-full from source with PyTorch 1.3.0 and CUDA 10.0

Please use gcc-5.4 and nvcc 10.0

 git clone https://github.com/open-mmlab/mmcv.git
 cd mmcv
 MMCV_WITH_OPS=1 pip install -e .

2. Install the corresponding codebase in OpenMMLab.

e.g., MMDetection:

pip install mmdet==2.13.0

3. Pruning the model.

e.g. Detection

cd detection

Modify load_from in xxxx_pruning.py to point to the baseline model.
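
For reference, the relevant line in the pruning config looks roughly like the following; the checkpoint path is a placeholder for your own baseline:

# in configs/retina/retina_pruning.py
load_from = 'path/to/retinanet_baseline.pth'  # baseline checkpoint to prune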

# for slurm train
sh tools/slurm_train.sh PARTITION_NAME JOB_NAME configs/retina/retina_pruning.py work_dir
# for slurm test
sh tools/slurm_test.sh PARTITION_NAME JOB_NAME configs/retina/retina_pruning.py PATH_CKPT --eval bbox
# for torch.dist
# sh tools/dist_train.sh configs/retina/retina_pruning.py 8

4. Finetune the model.

e.g. Detection

cd detection

Modify deploy_from in the custom_hooks of xxxx_finetune.py to point to the pruned model.
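
For reference, the entry looks roughly like the following; the hook type name and checkpoint path are placeholders, so check the provided finetune config for the exact fields:

# in configs/retina/retina_finetune.py
custom_hooks = [
    dict(
        type='FisherPruningHook',  # placeholder type name; see the actual config
        deploy_from='path/to/pruned_model.pth',  # pruned checkpoint from the previous step
    )
]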

# for slurm train
sh tools/slurm_train.sh PARTITION_NAME JOB_NAME configs/retina/retina_finetune.py work_dir
# for slurm test
sh tools/slurm_test.sh PARTITION_NAME JOB_NAME configs/retina/retina_finetune.py PATH_CKPT --eval bbox
# for torch.dist
# sh tools/dist_train.sh configs/retina/retina_finetune.py 8

Models

Detection

Method     Backbone   Baseline (mAP)   Finetuned (mAP)   Download
RetinaNet  R-50-FPN   36.5             36.5              Baseline / Pruned / Finetuned
ATSS*      R-50-FPN   38.1             37.9              Baseline / Pruned / Finetuned
PAA*       R-50-FPN   39.0             39.4              Baseline / Pruned / Finetuned
FSAF       R-50-FPN   37.4             37.4              Baseline / Pruned / Finetuned

* indicates no Group Normalization in the heads.

Classification

Coming soon.

Please cite our paper in your publications if it helps your research.

@InProceedings{liu2021group,
  title     = {Group Fisher Pruning for Practical Network Compression},
  author    = {Liu, Liyang and Zhang, Shilong and Kuang, Zhanghui and Zhou, Aojun and Xue, Jing-Hao and Wang, Xinjiang and Chen, Yimin and Yang, Wenming and Liao, Qingmin and Zhang, Wayne},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  year      = {2021},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
}