Overview

Sylvester normalizing flows for variational inference

PyTorch implementation of Sylvester normalizing flows, based on our paper:

Sylvester normalizing flows for variational inference (UAI 2018)
Rianne van den Berg*, Leonard Hasenclever*, Jakub Tomczak, Max Welling

*Equal contribution

Requirements

The latest release of the code is compatible with:

  • PyTorch 1.0.0

  • Python 3.7

Thanks to Martin Engelcke for adapting the code to provide this compatibility.

Version v0.3.0_2.7 is compatible with:

  • PyTorch 0.3.0. WARNING: more recent versions of PyTorch use different default flags for the binary cross-entropy loss module nn.BCELoss(). You have to adapt the appropriate flags if you want to port this code to a later version; a porting sketch follows this list.

  • Python 2.7
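
For porting, a minimal sketch of the flag change, assuming the loss is meant to sum over elements (the exact flags used in this repository's loss code may differ): PyTorch 0.3.0 controlled the reduction with size_average, while PyTorch 0.4 and later use reduction.

```python
import torch
import torch.nn as nn

# PyTorch 0.3.0 style: nn.BCELoss(size_average=False) summed the loss
# over all elements. In PyTorch >= 0.4 the same behaviour is requested
# explicitly, since the default reduction='mean' averages instead:
bce = nn.BCELoss(reduction='sum')

x_recon = torch.sigmoid(torch.randn(8, 784))  # dummy reconstructions in (0, 1)
x_true = torch.rand(8, 784).round()           # dummy binary targets
loss = bce(x_recon, x_true)
```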

Data

The experiments can be run on the following datasets:

  • static MNIST: the dataset is included in the data folder;
  • OMNIGLOT: the dataset can be downloaded from link;
  • Caltech 101 Silhouettes: the dataset can be downloaded from link;
  • Frey Faces: the dataset can be downloaded from link.

Usage

Below are example commands for running experiments on static MNIST with the different types of Sylvester normalizing flows, each using 4 flows:

Orthogonal Sylvester flows
This example uses a bottleneck of size 8 (Q has 8 columns containing orthonormal vectors).

python main_experiment.py -d mnist -nf 4 --flow orthogonal --num_ortho_vecs 8 
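
For intuition, a minimal sketch of a single Sylvester transformation from the paper, z' = z + Q R1 h(R2 Qᵀ z + b), where Q has num_ortho_vecs orthonormal columns and R1, R2 are upper triangular. The names and shapes below are illustrative, not the repository's exact implementation:

```python
import torch

def sylvester_step(z, Q, R1, R2, b):
    """One Sylvester flow step: z' = z + Q R1 tanh(R2 Q^T z + b).

    z       : (batch, D) latent variables
    Q       : (D, M) matrix with M orthonormal columns (the bottleneck)
    R1, R2  : (M, M) upper-triangular matrices
    b       : (M,) bias
    """
    pre = z @ Q @ R2.t() + b                      # (batch, M)
    z_new = z + torch.tanh(pre) @ R1.t() @ Q.t()  # (batch, D)
    # The Jacobian determinant reduces to a product over the M diagonal
    # entries: prod_i (1 + tanh'(pre_i) * R1_ii * R2_ii).
    h_prime = 1 - torch.tanh(pre) ** 2            # derivative of tanh
    diag = torch.diagonal(R1) * torch.diagonal(R2)
    log_det = torch.log(torch.abs(1 + h_prime * diag) + 1e-8).sum(dim=1)
    return z_new, log_det
```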

Householder Sylvester flows
This example uses 8 Householder reflections per orthogonal matrix Q.

python main_experiment.py -d mnist -nf 4 --flow householder --num_householder 8
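
For reference, a small sketch of how an orthogonal matrix can be assembled from Householder reflections H = I − 2 v vᵀ / ‖v‖²; the repository's batched construction differs, but the idea is the same:

```python
import torch

def householder_q(vs):
    """Build an orthogonal Q as a product of Householder reflections.

    vs: (num_householder, D) unnormalized reflection vectors. Each
    H = I - 2 v v^T / ||v||^2 is orthogonal, so the product is too.
    """
    D = vs.shape[1]
    Q = torch.eye(D)
    for v in vs:
        v = v / v.norm()
        Q = Q @ (torch.eye(D) - 2.0 * v.unsqueeze(1) * v.unsqueeze(0))
    return Q

Q = householder_q(torch.randn(8, 64))  # 8 reflections in a 64-dim space
assert torch.allclose(Q @ Q.t(), torch.eye(64), atol=1e-5)
```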

Triangular Sylvester flows

python main_experiment.py -d mnist -nf 4 --flow triangular 

To run an experiment with other types of normalizing flows or just with a factorized Gaussian posterior, see below.


Factorized Gaussian posterior

python main_experiment.py -d mnist --flow no_flow

Planar flows

python main_experiment.py -d mnist -nf 4 --flow planar

Inverse Autoregressive flows
This example uses MADEs with 320 hidden units.

python main_experiment.py -d mnist -nf 4 --flow iaf --made_h_size 320
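
For intuition, a rough sketch of a single IAF step with the gated update of Kingma et al. (2016); made below stands for an autoregressive network such as a MADE with made_h_size hidden units, and is an assumption about the interface rather than this repository's exact code:

```python
import torch

def iaf_step(z, made):
    """One inverse autoregressive flow step (sketch).

    `made(z)` returns a shift m and pre-scale s, each of shape (batch, D),
    where output dimension i depends only on z_<i, so the Jacobian of the
    transformation is triangular and its determinant is cheap.
    """
    m, s = made(z)
    sigma = torch.sigmoid(s)             # gate in (0, 1)
    z_new = sigma * z + (1 - sigma) * m
    log_det = torch.log(sigma).sum(dim=1)
    return z_new, log_det
```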

More information about additional argument options can be found by running `python main_experiment.py -h`.

Cite

Please cite our paper if you use this code in your own work:

@inproceedings{vdberg2018sylvester,
  title={Sylvester normalizing flows for variational inference},
  author={van den Berg, Rianne and Hasenclever, Leonard and Tomczak, Jakub and Welling, Max},
  booktitle={Proceedings of the Conference on Uncertainty in Artificial Intelligence (UAI)},
  year={2018}
}
Comments
  • about log_p_zk

    Hi Rianne, this is great code, and I have a little question about log p(zk). We hope that p(zk) in a VAE can be a distribution whose form is not fixed, but the calculation of log p(zk) in line 81 of loss.py seems to imply that p(zk) is a standard Gaussian. Is there a mistake in my understanding?
    Thank you for this code

    opened by Archer666 10
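
    For readers following this thread, the standard-normal log-density the comment refers to looks roughly like this; a sketch, not necessarily line 81 of loss.py:

    ```python
    import math
    import torch

    def log_standard_normal(z):
        """log N(z; 0, I), summed over the latent dimensions."""
        return (-0.5 * (z ** 2 + math.log(2 * math.pi))).sum(dim=1)
    ```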
  • loss = bce + beta * kl

    Hello Rianne: Thanks very much. I am a bit confused by line 44 in loss.py: loss = bce + beta * kl. Based on equation 3 in Tomczak's paper (Improving Variational Auto-Encoders using Householder Flow), shouldn't it be loss = bce - beta * kl? Also, why use -ELBO instead of ELBO when reporting your metrics? Thanks

    opened by tumis1946 4
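
    A note on the sign convention raised here: with BCE equal to the negative reconstruction log-likelihood, bce + beta * kl is the negative ELBO, so minimizing it maximizes the ELBO; a sketch:

    ```python
    def neg_elbo(bce, kl, beta=1.0):
        """-ELBO = -E_q[log p(x|z)] + KL(q(z|x) || p(z)) = BCE + KL.

        Minimizing this quantity maximizes the ELBO; beta = 1 recovers
        the plain ELBO, and -ELBO (a positive number) is what gets
        reported as the metric.
        """
        return bce + beta * kl
    ```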
  • PyTorch_v1 and Python3 compatibility

    Hi Rianne,

    This PR contains a 'minimal' set of changes to run the code with the latest PyTorch versions and Python 3 ( #1 #2 )

    It is 'minimal' in the sense that I only made changes that affect functionality. There are additional cosmetic changes that could be made; e.g. Variable(), the volatile flag, and F.sigmoid() have been deprecated but they should not affect functionality.

    I tested the changes with PyTorch 1.0.0 and Python 3.7 on MNIST and Freyfaces, giving me similar results for the baseline VAE without any flows.

    I am not sure whether more rigorous tests should be done, and whether you want to merge this into master or keep it in a separate branch.

    Best, Martin

    opened by martinengelcke 1
  • PR for PyTorch 1.+ and Python 3 support

    Hi Rianne,

    Thank you for this really nice code release :)

    I cloned the repo and made some changes so that it runs with PyTorch 1.+ and Python 3. Also solved the issue mentioned in #1 . I tested the changes on MNIST (binary input) and Freyfaces (multinomial input), giving similar results to the original code.

    If you are interested in reviewing and potentially adding this to the repo, I would be happy to clean things up and make a PR.

    Best, Martin

    opened by martinengelcke 1
  • RuntimeError in default main experiment

    Hi Rianne,

    I'm trying to run the default experiment on cpu with a small latent space dimension (z=5):

    python main_experiment.py -d mnist --flow no_flow -nc --z_size 5

    Which unfortunately gives the following error:

    Traceback (most recent call last):
      File "main_experiment.py", line 278, in <module>
        run(args, kwargs)
      File "main_experiment.py", line 189, in run
        tr_loss = train(epoch, train_loader, model, optimizer, args)
      File ".../sylvester-flows/optimization/training.py", line 39, in train
        loss.backward()
      File "//anaconda/envs/dl/lib/python3.6/site-packages/torch/tensor.py", line 102, in backward
        torch.autograd.backward(self, gradient, retain_graph, create_graph)
      File "//anaconda/envs/dl/lib/python3.6/site-packages/torch/autograd/__init__.py", line 90, in backward
        allow_unreachable=True)  # allow_unreachable flag
    RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation
    

    I am using PyTorch version 1.0.0 and did not modify the code.

    opened by trdavidson 1
  • How to sample from latent distribution

    Hello,

    I was wondering how I can generate samples using the decoder network after training. In a VAE, I would just sample from the prior distribution z~N(0,1) and generate a data point using the decoder. In TriangularSylvesterVAE, however, I also have to provide hyperparameters lambda(x) that depend on the input. How can I sample from my latent distribution and generate samples from it?

    I am new to normalizing flows in general and would appreciate any help.

    opened by crlz182 2
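
    On the sampling question above: since the flow only transforms the approximate posterior q(z|x), the prior stays a standard Gaussian, so unconditional generation needs only the prior and the decoder. A self-contained sketch with a stand-in decoder (the real model's decoder and z_size will differ):

    ```python
    import torch
    import torch.nn as nn

    z_size = 64
    decoder = nn.Sequential(              # stand-in for the trained decoder
        nn.Linear(z_size, 300), nn.Tanh(),
        nn.Linear(300, 784), nn.Sigmoid(),
    )

    with torch.no_grad():
        z = torch.randn(16, z_size)  # sample from the prior p(z) = N(0, I)
        x = decoder(z)               # decode to data space; no flow needed
    ```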
Releases
  • v1.0.0_3.7(Jul 5, 2019)

    Sylvester Normalizing Flow repository compatible with PyTorch 1.0.0 and Python 3.7. Thanks to martinengelcke for taking care of this compatibility.

  • v0.3.0_2.7(Jul 5, 2019)

Owner
Rianne van den Berg
Senior researcher at Microsoft Research Amsterdam. Formerly at Google Brain and the University of Amsterdam.