Planning from Pixels in Environments with Combinatorially Hard Search Spaces -- NeurIPS 2021

Overview

PPGS: Planning from Pixels in Environments with Combinatorially Hard Search Spaces

[Figure: PPGS overview]

Environment Setup

  • We recommend pipenv for creating and managing virtual environments (dependencies for other environment managers can be found in the Pipfile; see the sketch after the commands below):
git clone https://github.com/martius-lab/PPGS
cd PPGS
pipenv install
pipenv shell
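
If you prefer a different environment manager, one possible route is to export the locked dependencies from the Pipfile into a plain virtual environment. This is a sketch, not an officially supported setup; it assumes a recent pipenv for the export step:

pipenv requirements > requirements.txt  # on older pipenv versions: pipenv lock -r > requirements.txt
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt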
  • For simplicity, this codebase is ready for training on two of the three environments (IceSlider and DigitJump). They are part of the puzzlegen package, which we also provide and which can be installed with
pip install -e git+https://github.com/martius-lab/puzzlegen#egg=puzzlegen
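
To sanity-check the installation, a minimal import test (the module name puzzlegen is taken from the dataset-generation commands below):

python -c "import puzzlegen; print(puzzlegen.__file__)"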
  • Offline datasets can be generated for training and validation. In the case of IceSlider we can use the commands below (a DigitJump variant is sketched right after them):
python -m puzzlegen.extract_trajectories --record-dir /path/to/train_data --env-name ice_slider --start-level 0 --number-levels 1000 --max-steps 20 --n-repeat 20 --random 1
python -m puzzlegen.extract_trajectories --record-dir /path/to/test_data --env-name ice_slider --start-level 1000 --number-levels 1000 --max-steps 20 --n-repeat 5 --random 1
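
For DigitJump, the same script should work with the environment name swapped. The flag values below simply mirror the IceSlider commands and are not taken from the authors' exact settings:

python -m puzzlegen.extract_trajectories --record-dir /path/to/train_data --env-name digit_jump --start-level 0 --number-levels 1000 --max-steps 20 --n-repeat 20 --random 1
python -m puzzlegen.extract_trajectories --record-dir /path/to/test_data --env-name digit_jump --start-level 1000 --number-levels 1000 --max-steps 20 --n-repeat 5 --random 1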
  • Finally, we can add the paths to the extracted datasets in default_params.json as data_params.train_path and data_params.test_path. We should also set the name of the environment for validation in data_params.env_name ("ice_slider" for IceSlider or "digit_jump" for DigitJump).
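
Assuming the dotted parameter names in the configuration table below map to nested JSON objects, the relevant portion of default_params.json should look roughly like this (paths are placeholders):

{
  "data_params": {
    "train_path": "/path/to/train_data",
    "test_path": "/path/to/test_data",
    "env_name": "ice_slider"
  }
}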

  • Training and evaluation are performed sequentially by running

python main.py
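
According to the configuration table below, checkpoints and results are written to working_dir (default "results/ppgs"), so after a run they can be inspected with, e.g.:

ls results/ppgs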

Configuration

All settings can be handled by editing default_params.json.

| Param | Default | Info |
| --- | --- | --- |
| optimizer_params.eps | 1e-05 | epsilon for Adam |
| train_params.seed | null | seed for training |
| train_params.epochs | 40 | # of training epochs |
| train_params.batch_size | 128 | batch size for training |
| train_params.save_every_n_epochs | 5 | how often to save models |
| train_params.val_every_n_epochs | 2 | how often to perform validation |
| train_params.lr_dict | - | dictionary of learning rates for each component |
| train_params.loss_weight_dict | - | dictionary of weights for the three loss functions |
| train_params.margin | 0.1 | latent margin epsilon |
| train_params.hinge_params | - | hyperparameters for the margin loss |
| train_params.schedule | [] | learning rate schedule |
| model_params.name | 'ppgs' | name of the model to train, in ['ppgs', 'latent'] |
| model_params.load_model | true | whether to load the saved model if present |
| model_params.filters | [64, 128, 256, 512] | encoder filters |
| model_params.embedding_size | 16 | dimensionality of the latent space |
| model_params.normalize | true | whether to normalize embeddings |
| model_params.forward_layers | 3 | layers in the MLP forward model for the 'latent' world model |
| model_params.forward_units | 256 | units in the MLP forward model for the 'latent' world model |
| model_params.forward_ln | true | layer normalization in the MLP forward model for the 'latent' world model |
| model_params.inverse_layers | 1 | layers in the MLP inverse model |
| model_params.inverse_units | 32 | units in the MLP inverse model |
| model_params.inverse_ln | true | layer normalization in the MLP inverse model |
| data_params.train_path | '' | path to the training dataset |
| data_params.test_path | '' | path to the validation dataset |
| data_params.env_name | 'ice_slider' | name of the environment ('ice_slider' for IceSlider, 'digit_jump' for DigitJump) |
| data_params.seq_len | 2 | number of steps for the multi-step loss |
| data_params.shuffle | true | whether to shuffle datasets |
| data_params.normalize | true | whether to normalize observations |
| data_params.encode_position | false | enables positional encoding |
| data_params.env_params | {} | params to pass to the environment |
| eval_params.evaluate_losses | true | whether to compute evaluation losses |
| eval_params.evaluate_rollouts | true | whether to compute solution rates |
| eval_params.eval_at | [1,3,4] | # of steps to evaluate at |
| eval_params.latent_eval_at | [1,5,10] | K for latent metrics |
| eval_params.seeds | [2000] | starting seed for evaluation levels |
| eval_params.num_levels | 100 | # of evaluation levels |
| eval_params.batch_size | 128 | batch size for latent metrics evaluation |
| eval_params.planner_params.batch_size | 256 | cutoff for graph search |
| eval_params.planner_params.margin | 0.1 | latent margin for reidentification |
| eval_params.planner_params.early_stop | true | whether to stop when the goal is found |
| eval_params.planner_params.backtrack | false | enables the backtracking algorithm |
| eval_params.planner_params.penalize_visited | false | penalizes visited vertices in graph search |
| eval_params.planner_params.eps | 0 | epsilon for epsilon-greedy action selection (0 disables it) |
| eval_params.planner_params.max_steps | 256 | maximal solution length |
| eval_params.planner_params.replan_horizon | 10 | T_max for the full planner |
| eval_params.planner_params.snap | false | snaps new vertices to visited ones |
| working_dir | "results/ppgs" | directory for checkpoints and results |
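
As an example, a run that trains longer and enables the backtracking planner could be configured with an edit like the following (illustrative values only, same nesting assumption as in the sketch above):

{
  "train_params": {
    "epochs": 80
  },
  "eval_params": {
    "planner_params": {
      "backtrack": true
    }
  },
  "working_dir": "results/ppgs_backtrack"
}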