EigenDamage: Structured Pruning in the Kronecker-Factored Eigenbasis

Overview

This repo contains the official implementations of EigenDamage: Structured Pruning in the Kronecker-Factored Eigenbasis (ICML 2019). Details are listed below:

  1. The config files for the experiments are under the directory configs/.
  2. The pruning algorithms are in pruner/. Please note that:
    (1) fisher_diag_pruner.py implements C-OBD.
    (2) kfac_eigen_pruner.py implements EigenDamage (a conceptual sketch of the eigenbasis step is given after this list).
    (3) kfac_full_pruner.py implements C-OBS.
    (4) kfac_OBD_F2.py implements kron-OBD.
    (5) kfac_OBS_F2.py implements kron-OBS.
    (6) kfac_eigen_svd_pruner.py implements the depthwise separable variant of EigenDamage.
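
Conceptually, EigenDamage rotates a layer's weights into the eigenbasis of the Kronecker factors of the KFAC-approximated Fisher, where the Fisher is roughly diagonal, removes the structures that cause the least OBD-style damage there, and rotates back. The sketch below illustrates this for a single fully-connected layer. It is a minimal illustration under assumed inputs (W, acts, grads, keep_ratio), not the repo's implementation; see pruner/kfac_eigen_pruner.py for the real thing.

# Minimal sketch of structured pruning in the Kronecker-factored eigenbasis.
# The function and its variable names are illustrative assumptions.
import torch

def eigendamage_sketch(W, acts, grads, keep_ratio=0.5):
    # KFAC factors: input covariance A (in x in), output-gradient covariance S (out x out).
    A = acts.t() @ acts / acts.shape[0]
    S = grads.t() @ grads / grads.shape[0]
    # Eigendecompose both factors (torch.symeig matches the pinned PyTorch 0.4.1;
    # use torch.linalg.eigh on newer versions).
    lam_A, Q_A = torch.symeig(A, eigenvectors=True)
    lam_S, Q_S = torch.symeig(S, eigenvectors=True)
    # Rotate the weights into the eigenbasis, where the Fisher is approximately
    # diagonal with entries lam_S[i] * lam_A[j].
    W_rot = Q_S.t() @ W @ Q_A
    # OBD-style damage of each rotated weight; sum over rows so that whole
    # eigen-filters are pruned (structured pruning).
    damage = 0.5 * (W_rot ** 2) * torch.ger(lam_S, lam_A)
    row_score = damage.sum(dim=1)
    _, keep = row_score.topk(int(keep_ratio * W.shape[0]))
    mask = torch.zeros(W.shape[0])
    mask[keep] = 1.0
    # Zero the pruned rows in the eigenbasis, then rotate back.
    return Q_S @ (W_rot * mask.unsqueeze(1)) @ Q_A.t()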

Requirements

Python 3.6, PyTorch 0.4.1

pip install https://download.pytorch.org/whl/cu90/torch-0.4.1-cp36-cp36m-linux_x86_64.whl
pip install torchvision
pip install tqdm
pip install tensorflow
pip install tensorboardX
pip install easydict
pip install scikit-tensor

Dataset

  1. Download Tiny ImageNet from "https://tiny-imagenet.herokuapp.com" and place it in ../data/tiny_imagenet. Make sure there are two folders, train and val, under ../data/tiny_imagenet; each of them contains 200 folders storing the images of each category (a quick sanity check for this layout is sketched after this list).

  2. The CIFAR datasets will be downloaded automatically.
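
A minimal sanity check for the expected Tiny-ImageNet layout; the path comes from the instructions above, everything else is illustrative:

# Verify that ../data/tiny_imagenet/{train,val} each hold 200 class folders.
import os
from torchvision import datasets, transforms

root = '../data/tiny_imagenet'
for split in ('train', 'val'):
    path = os.path.join(root, split)
    classes = [d for d in os.listdir(path) if os.path.isdir(os.path.join(path, d))]
    assert len(classes) == 200, '%s has %d class folders, expected 200' % (split, len(classes))
    # ImageFolder expects exactly this one-folder-per-category layout.
    ds = datasets.ImageFolder(path, transform=transforms.ToTensor())
    print('%s: %d images in %d classes' % (split, len(ds), len(classes)))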

How to run?

1. Pretrain model

Train a model from scratch with the commands below, or download a pretrained model from https://drive.google.com/file/d/1hMxj6NUCE1RP9p_ZZpJPhryk2RPU4I-_/view?usp=sharing.

# for pretraining CIFAR10/CIFAR100
$ python main_pretrain.py --learning_rate 0.1 --weight_decay 0.0002 --dataset cifar10 --epoch 200

# for pretraining Tiny-ImageNet
$ python main_pretrain.py --learning_rate 0.1 --weight_decay 0.0002 --dataset tiny_imagenet --epoch 300

2. Pruning

# for pruning with EigenDamage, CIFAR10, VGG19 (one pass)
$ python main_prune.py --config ./configs/exp_for_cifar/cifar10/vgg19/one_pass/base/kfacf_eigen_base.json

# for pruning with EigenDamage, CIFAR100, VGG19
$ python main_prune.py --config ./configs/exp_for_cifar/cifar100/vgg19/one_pass/base/kfacf_eigen_base.json

# for pruning with EigenDamage, TinyImageNet, VGG19
$ python main_prune.py --config ./configs/exp_for_tiny_imagenet/tiny_imagenet/vgg19/one_pass/base/kfacf_eigen_base.json

# for pruning with EigenDamage + Depthwise separable, CIFAR100, VGG19
$ python main_prune_separable.py --config ./configs/exp_for_svd/cifar100/vgg19/one_pass/base/svd_eigendamage.json
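
Each --config argument above points to a JSON file under configs/. Since easydict is listed in the requirements, a plausible way to inspect such a config is sketched below; whether the repo loads configs exactly this way is an assumption.

# Hedged sketch: load a pruning config into an attribute-style dict.
import json
from easydict import EasyDict

path = './configs/exp_for_cifar/cifar10/vgg19/one_pass/base/kfacf_eigen_base.json'
with open(path) as f:
    config = EasyDict(json.load(f))

# Print the top-level keys rather than guessing at the schema.
print(sorted(config.keys()))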

Contact

If you have any questions or suggestions about the code or the paper, please do not hesitate to contact Chaoqi Wang ([email protected] or [email protected]) or Guodong Zhang ([email protected] or [email protected]).

Citation

To cite this work, please use

@InProceedings{wang2019eigen,
  title     = {{E}igen{D}amage: Structured Pruning in the {K}ronecker-Factored Eigenbasis},
  author    = {Wang, Chaoqi and Grosse, Roger and Fidler, Sanja and Zhang, Guodong},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {6566--6575},
  year      = {2019},
  volume    = {97},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/wang19g/wang19g.pdf},
  url       = {http://proceedings.mlr.press/v97/wang19g.html},
}
