Overview

Fast Training of Neural Lumigraph Representations using Meta Learning

Project Page | Paper | Data

Alexander W. Bergman, Petr Kellnhofer, Gordon Wetzstein, Stanford University.
Official Implementation for Fast Training of Neural Lumigraph Representations using Meta Learning.

Usage

To get started, create a conda environment with all dependencies:

conda env create -f environment.yml
conda activate metanlrpp

Code Structure

The code is organized as follows:

  • experiment_scripts: directory containing scripts for training and testing MetaNLR++ models
    • pretrain_features.py: pre-train encoder and decoder networks
    • train_sdf_ibr_meta.py: train meta-learned initialization for encoder, decoder, aggregation fn, and neural SDF
    • test_sdf_ibr_meta.py: specialize meta-learned initialization to a specific scene
    • train_sdf_ibr.py: train NLR++ model from scratch without meta-learned initialization
    • test_sdf_ibr.py: evaluate performance on withheld views
  • configs: directory containing configs to reproduce experiments in the paper
    • nlrpp_nlr.txt: configuration for training NLR++ on the NLR dataset
    • nlrpp_dtu.txt: configuration for training NLR++ on the DTU dataset
    • nlrpp_nlr_meta.txt: configuration for training the MetaNLR++ initialization on the NLR dataset
    • nlrpp_dtu_meta.txt: configuration for training the MetaNLR++ initialization on the DTU dataset
    • nlrpp_nlr_metaspec.txt: configuration for training MetaNLR++ on the NLR dataset using the learned initialization
    • nlrpp_dtu_metaspec.txt: configuration for training MetaNLR++ on the DTU dataset using the learned initialization
  • data_processing: directory containing utility functions for processing data
  • torchmeta: torchmeta library for meta-learning
  • utils: directory containing various utility functions for rendering and visualization
  • loss_functions.py: file containing loss functions for evaluation
  • meta_modules.py: contains meta-learning wrappers around standard modules using torchmeta (see the sketch after this list)
  • modules.py: contains standard modules for coordinate-based networks
  • modules_sdf.py: extends standard modules for coordinate-based network representations of signed distance functions
  • modules_unet.py: contains encoder and decoder modules used for image-space feature processing
  • scheduler.py: utilities for training schedule
  • training.py: training script
  • sdf_rendering.py: functions for rendering SDF
  • sdf_meshing.py: functions for meshing SDF
  • checkpoints: contains checkpoints for some pre-trained models (additional/ablation models available by request)
  • assets: contains paths to checkpoints that are used as assets, as well as pre-computed buffers shared across multiple runs (if necessary)
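
For orientation, here is a minimal sketch of the torchmeta pattern that meta_modules.py builds on: a MetaModule can be run with an explicit parameter dictionary, so a few inner-loop gradient steps can specialize shared weights to a scene without modifying the meta-parameters in place. TinySDF below is a toy stand-in, not the actual MetaNLR++ architecture.

import torch
import torch.nn.functional as F
from torchmeta.modules import MetaModule, MetaSequential, MetaLinear
from torchmeta.utils.gradient_based import gradient_update_parameters

class TinySDF(MetaModule):
    """Toy coordinate network standing in for the real SDF module."""
    def __init__(self):
        super().__init__()
        self.net = MetaSequential(
            MetaLinear(3, 64), torch.nn.ReLU(),
            MetaLinear(64, 1))

    def forward(self, x, params=None):
        return self.net(x, params=self.get_subdict(params, 'net'))

model = TinySDF()
coords = torch.randn(128, 3)                      # random query points
target = coords.norm(dim=-1, keepdim=True) - 0.5  # SDF of a toy sphere

# Inner loop: a few gradient steps adapt the shared initialization to one
# "scene" (here, the sphere); the meta-parameters themselves stay untouched.
params = None
for _ in range(3):
    loss = F.mse_loss(model(coords, params=params), target)
    params = gradient_update_parameters(model, loss, params=params,
                                        step_size=1e-2, first_order=True)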

Getting Started

Pre-training Encoder and Decoder

Pre-train the encoder and decoder using the FlyingChairs2 training dataset as follows:

python experiment_scripts/pretrain_features.py --experiment_name XXX --batch_size X --dataset_path /path/to/FlyingChairs2/train

Alternatively, use the checkpoint in the checkpoints directory.
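
As a rough illustration of what this stage does, the sketch below trains a toy encoder/decoder pair so that decoded features reproduce the input image. The networks, data, and loss here are hypothetical stand-ins; the real U-Net architectures and objectives live in modules_unet.py, pretrain_features.py, and loss_functions.py.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Small conv stacks as stand-ins for the U-Net encoder/decoder.
encoder = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                        nn.Conv2d(16, 16, 3, padding=1))
decoder = nn.Sequential(nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
                        nn.Conv2d(16, 3, 3, padding=1))
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()),
                       lr=1e-4)

for step in range(100):
    img = torch.rand(4, 3, 128, 128)   # stand-in for FlyingChairs2 crops
    recon = decoder(encoder(img))
    loss = F.l1_loss(recon, img)       # features must remain decodable to RGB
    opt.zero_grad(); loss.backward(); opt.step()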

Training NLR++

Train an NLR++ model using the following command:

python experiment_scripts/train_sdf_ibr.py --config_filepath configs/nlrpp_dtu.txt --experiment_name XXX --dataset_path /path/to/dtu/scanXXX --checkpoint_img_encoder /path/to/pretrained/encdec

Note that we have uploaded our processed version of the DTU data here, and the NLR data can be found here.

Meta-learned Initialization (MetaNLR++)

Meta-learn the initialization for the encoder, decoder, aggregation function, and neural SDF using the following command:

python experiment_scripts/train_sdf_ibr_meta.py --config_filepath configs/nlrpp_dtu_meta.txt --experiment_name XXX --dataset_path /path/to/dtu/meta/training --reference_view 24 --checkpoint_img_encoder /path/to/pretrained/encdec

Some optimized initializations for the DTU and NLR datasets can be found in the data directory. Additional models can be provided upon request.
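
To illustrate the idea, the sketch below uses a Reptile-style recipe: adapt a copy of the shared weights to one scene, then nudge the meta-weights toward the adapted ones. It is a toy example under stated assumptions, not the exact scheme implemented in train_sdf_ibr_meta.py.

import copy
import torch
import torch.nn.functional as F

meta_model = torch.nn.Linear(3, 1)   # stand-in for the full MetaNLR++ pipeline
meta_lr = 0.1

def adapt_to_scene(model, coords, target, steps=5, lr=1e-2):
    """Inner loop: specialize a copy of the meta-weights to one scene."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(steps):
        loss = F.mse_loss(model(coords), target)
        opt.zero_grad(); loss.backward(); opt.step()
    return model

for scene in range(10):               # iterate over meta-training scenes
    coords = torch.randn(64, 3)
    target = coords.norm(dim=-1, keepdim=True) - torch.rand(1)  # toy "scene"
    adapted = adapt_to_scene(copy.deepcopy(meta_model), coords, target)
    with torch.no_grad():             # Reptile update toward adapted weights
        for p_meta, p_task in zip(meta_model.parameters(), adapted.parameters()):
            p_meta += meta_lr * (p_task - p_meta)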

Training MetaNLR++ from Initialization

Use the meta-learned initialization to specialize to a specific scene using the following command:

python experiment_scripts/test_sdf_ibr_meta.py --config_filepath configs/nlrpp_dtu_metaspec.txt --experiment_name XXX --dataset_path /path/to/dtu/scanXXX --reference_view 24 --meta_initialization /path/to/learned/meta/initialization

Evaluation

Test the converged scene on withheld views using the following command:

python experiment_scripts/test_sdf_ibr.py --config_filepath configs/nlrpp_dtu.txt --experiment_name XXX --dataset_path /path/to/dtu/scanXXX --checkpoint_path_test /path/to/checkpoint/to/evaluate
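
For reference, view-synthesis quality on withheld views is commonly summarized by PSNR; a minimal implementation is sketched below, assuming images normalized to [0, 1]. The metrics the test script actually reports are those implemented in the repository.

import torch

def psnr(pred, gt):
    """PSNR in dB for images in [0, 1]."""
    mse = torch.mean((pred - gt) ** 2)
    return -10.0 * torch.log10(mse)

pred = torch.rand(3, 256, 256)   # rendered withheld view (stand-in)
gt = torch.rand(3, 256, 256)     # ground-truth image (stand-in)
print(f"PSNR: {psnr(pred, gt):.2f} dB")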

Citation & Contact

If you find our work useful in your research, please cite

@inproceedings{bergman2021metanlr,
  author = {Bergman, Alexander W. and Kellnhofer, Petr and Wetzstein, Gordon},
  title = {Fast Training of Neural Lumigraph Representations using Meta Learning},
  booktitle = {NeurIPS},
  year = {2021},
}

If you have any questions, or would like access to specific ablations or baselines presented in the paper or supplement (the code released here is only a subset of the source code used to generate the results), please feel free to contact the authors. Alex can be reached via e-mail at [email protected].
