Diverse Image Generation via Self-Conditioned GANs

Project | Paper

Steven Liu, Tongzhou Wang, David Bau, Jun-Yan Zhu, Antonio Torralba
MIT, Adobe Research
In CVPR 2020.

Teaser

Our proposed self-conditioned GAN model learns to perform clustering and image synthesis simultaneously. The model training requires no manual annotation of object classes. Here, we visualize several discovered clusters for both Places365 (top) and ImageNet (bottom). For each cluster, we show both real images and the generated samples conditioned on the cluster index.

Getting Started

Installation

  • Clone this repo:
git clone https://github.com/stevliu/self-conditioned-gan.git
cd self-conditioned-gan
  • Install the dependencies:
conda create --name selfcondgan python=3.6
conda activate selfcondgan
conda install --file requirements.txt
conda install -c conda-forge tensorboardx

Training and Evaluation

  • Train a model on CIFAR:
python train.py configs/cifar/selfcondgan.yaml
  • Visualize samples and inferred clusters:
python visualize_clusters.py configs/cifar/selfcondgan.yaml --show_clusters

The samples and clusters will be saved to output/cifar/selfcondgan/clusters. If this directory is served by an Apache server, you can open output/cifar/selfcondgan/clusters/+lightbox.html in a browser to view all samples and clusters on one webpage.
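If you do not have an Apache server available, a quick alternative (not part of the original instructions) is to serve the output directory with Python's built-in static file server and open the same page locally:

cd output/cifar/selfcondgan/clusters
python -m http.server 8000
# then open http://localhost:8000/+lightbox.html in a browser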

  • Evaluate the model's FID. You will first need to gather a set of ground-truth training set images to compute metrics against:
python utils/get_gt_imgs.py --cifar
python metrics.py configs/cifar/selfcondgan.yaml --fid --every -1

You can also evaluate with other metrics by appending additional flags, such as Inception Score (--inception), the number of covered modes and reverse-KL divergence (--modes), and cluster metrics (--cluster_metrics).
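For example, appending several of these flags computes the corresponding metrics in a single run (combining them like this is an assumption; each flag can also be passed on its own):

python metrics.py configs/cifar/selfcondgan.yaml --inception --modes --cluster_metrics --every -1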

Pretrained Models

You can load and evaluate pretrained models on ImageNet and Places. If you have access to the ImageNet or Places datasets, first fill in the paths to your ImageNet and/or Places dataset directories in configs/imagenet/default.yaml and configs/places/default.yaml, respectively. You can then use the following config files with the evaluation scripts, and the code will automatically download the appropriate models.

configs/pretrained/imagenet/selfcondgan.yaml
configs/pretrained/places/selfcondgan.yaml

configs/pretrained/imagenet/conditional.yaml
configs/pretrained/places/conditional.yaml

configs/pretrained/imagenet/baseline.yaml
configs/pretrained/places/baseline.yaml
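For example, to evaluate the FID of the pretrained self-conditioned GAN on Places using one of the config files above (the particular flag choice here is just an illustration):

python metrics.py configs/pretrained/places/selfcondgan.yaml --fid --every -1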

Evaluation

Visualizations

To visualize generated samples and inferred clusters, run

python visualize_clusters.py config-file

You can set the flag --show_clusters to also visualize the real inferred clusters, but this requires that you have a path to training set images.
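For example, to visualize samples from the pretrained Places model along with its real inferred clusters (this assumes the Places dataset path has been filled in as described above):

python visualize_clusters.py configs/pretrained/places/selfcondgan.yaml --show_clusters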

Metrics

To obtain generation metrics, fill in paths to your ImageNet or Places dataset directories in utils/get_gt_imgs.py and then run

python utils/get_gt_imgs.py --imagenet --places

to precompute batches of GT images for FID/FSD evaluation.

Then, you can use

python metrics.py config-file

with the appropriate flags to compute the FID (--fid), FSD (--fsd), Inception Score (--inception), the number of covered modes and reverse-KL divergence (--modes), and clustering metrics (--cluster_metrics) for each of the checkpoints.
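For example, to compute FID and FSD for the checkpoints of an ImageNet model (this particular flag combination is just an illustration):

python metrics.py configs/imagenet/selfcondgan.yaml --fid --fsd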

Training models

To train a model, set up a configuration file (examples in /configs), and run

python train.py config-file

An example config for self-conditioned GAN on ImageNet is configs/imagenet/selfcondgan.yaml, and on Places is configs/places/selfcondgan.yaml.

Some models may be too large to fit on one GPU, so you may want to add --devices DEVICE_NUMBERS as an additional flag for multi-GPU training; see the example below.
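A minimal sketch of such a run, assuming DEVICE_NUMBERS is a space-separated list of GPU indices (check train.py's argument parser for the exact format it expects):

python train.py configs/imagenet/selfcondgan.yaml --devices 0 1 2 3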

2D-experiments

For synthetic dataset experiments, first go into the 2d_mix directory.

To train a self-conditioned GAN on the 2D-ring and 2D-grid datasets, run

python train.py --clusterer selfcondgan --data_type ring
python train.py --clusterer selfcondgan --data_type grid

You can test several other configurations via the command line arguments.
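To see which options are available, you can list the script's command-line arguments from within the 2d_mix directory (this assumes the script exposes a standard argparse-style --help flag):

python train.py --help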

Acknowledgments

This code is heavily based on the GAN-stability code base. Our FSD code is taken from the GANseeing work. To compute Inception Score, we use the code provided by Shichang Tang. To compute FID, we use the code provided by TTUR. We also use pretrained classifiers provided by the pytorch-playground.

We thank all the authors for their useful code.

Citation

If you use this code for your research, please cite the following work.

@inproceedings{liu2020selfconditioned,
 title={Diverse Image Generation via Self-Conditioned GANs},
 author={Liu, Steven and Wang, Tongzhou and Bau, David and Zhu, Jun-Yan and Torralba, Antonio},
 booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
 year={2020}
}