This repository is dedicated to developing and maintaining code for experiments with wide neural networks.

Overview

Wide-Networks

This repository contains the code of various experiments on wide neural networks. In particular, we implement classes for abc-parameterizations of NNs as defined by Yang & Hu (2021). Although an equivalent description can be given using only ac-parameterizations, we keep the three scales (a, b and c) in the code to allow more flexibility depending on how we want to approach the problem of dealing with infinitely wide NNs.
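
For reference, in an abc-parameterization of a network of width n, the weights of layer l are written W^l = n^{-a_l} w^l, the trainable parameters are initialized i.i.d. as w^l_{ij} ~ N(0, n^{-2 b_l}), and the learning rate of layer l is eta * n^{-c_l} for some base learning rate eta (Yang & Hu 2021).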

Structure of the code

The BaseModel class

All the code related to neural networks is in the directory pytorch. The different models we have implemented are in this directory, along with the base class, found in the file base_model.py, which implements the generic attributes and methods all our NN classes share.

The BaseModel class inherits from the PyTorch Lightning module and essentially defines the attributes any NN needs to work properly, namely the architecture (defined in the _build_model() method), the activation function (we use the same activation function at each layer), the loss function, the optimizer, and the initializer for the parameters of the network.
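
As an illustrative sketch of how a concrete model might subclass BaseModel (not the repository's actual code: the class name, import path, and attribute names such as self.config and self.activation are assumptions):

import torch.nn as nn

from pytorch.base_model import BaseModel  # assumed import path

class TwoLayerNet(BaseModel):  # hypothetical example model
    def _build_model(self):
        # Architecture definition: a single hidden layer of the configured width.
        self.input_layer = nn.Linear(self.config.input_size, self.config.width)
        self.output_layer = nn.Linear(self.config.width, self.config.output_size)

    def forward(self, x):
        # The same activation function is applied at each layer, as described above.
        return self.output_layer(self.activation(self.input_layer(x)))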

Optionally, the BaseModel class can define attributes for the normalization (e.g. BatchNorm or LayerNorm) and the scheduler, and any of the aforementioned attributes (optional or not) can be customized as needed (see the examples of the scheduler of ipllr and the initializer of abc_param).

The ModelConfig class

All the hyper-parameters which define the model (depth, width, activation function name, loss name, optimizer name, etc.) have to be passed as an argument to __init__() as an object of the class ModelConfig (pytorch/configs/model.py). This class reads from a yaml config file which defines all the objects a NN needs (see examples in pytorch/configs). Essentially, ModelConfig exists so that one only has to set up the yaml config file properly; the attributes of BaseModel are then populated correctly from it.
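
A minimal usage sketch, assuming an illustrative config file path (the exact constructor arguments of ModelConfig may differ):

from pytorch.configs.model import ModelConfig

config = ModelConfig(config_file="pytorch/configs/fully_connected.yaml")  # illustrative path
model = TwoLayerNet(config)  # the config populates the attributes defined in BaseModel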

abc-parameterizations

The code for abc-parameterizations (Yang & Hu 2021) can be found in pytorch/abc_params. There we define the base class for abc-parameterizations, which mainly sets the layer, init and lr scales from the values of a, b, c, and defines the initial parameters through Gaussians whose variance depends on the value of b and on the activation function.

Everything that is architecture-specific (fully-connected, conv, residual, etc.) is left out of this base class and has to be implemented in the _build_model() method of the child class (see examples in pytorch/abc_params/fully_connected). We also define there the base classes for the ntk, muP (Yang & Hu 2021), ip and ipllr parameterizations, with their fully-connected implementations in pytorch/abc_params/fully_connected.
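
As a rough sketch of the scalings such a base class computes for one layer (variable names are illustrative, not the repository's actual API):

import torch

n = 1024                     # layer width
a, b, c = 0.5, 0.5, 0.0      # illustrative values of the abc scales for one layer

layer_scale = n ** (-a)      # multiplies the layer's weights in the forward pass
init_std = n ** (-b)         # std of the Gaussian used to draw the initial parameters
lr_scale = n ** (-c)         # multiplies the base learning rate for this layer

w = init_std * torch.randn(n, n)  # initial parameters w ~ N(0, n^{-2b})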

Experiment runs

Setup

Before running any experiment, make sure you first install all the necessary packages:

pip3 install -r requirements.txt

You can optionally create a virtual environment through

python3 -m venv your_env_dir

then activate it with

source your_env_dir/bin/activate

and then install the requirements once the environment is activated. If you haven't installed the wide-networks library in site-packages, make sure you add it to the PYTHONPATH before running any experiment, using the command

export PYTHONPATH=$PYTHONPATH:"$PWD"

from the root directory (wide-networks/.) where the wide-networks library is located.

Python jobs

The directory jobs contains Python jobs which can be run from the command line with arguments. Mainly, those jobs launch a training / val / test pipeline for a given model using the Lightning module, and the results are collected in a dictionary which is saved to a pickle file at the end of training for later examination. Additionally, metrics are logged in TensorBoard and can be visualized during training with the command

tensorboard --logdir=your_experiment_dir
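
To inspect the results dictionary after training, a minimal sketch (the exact file name and location depend on the experiment directory created by the job):

import pickle

with open("your_experiment_dir/results.pickle", "rb") as f:  # illustrative path
    results = pickle.load(f)
print(results.keys())  # e.g. the logged training / val / test metrics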

We have written jobs to launch experiments on MNIST and CIFAR-10 with the fully-connected versions of different models such as muP (Yang & Hu 2021), IP-LLR and Naive-IP, which can be found in jobs/abc_parameterizations. Arguments can be passed to those Python scripts through the command line, but they are optional: the default values are used for any parameters that are not set manually. For example, the command

python3 jobs/abc_parameterizations/fc_muP_run.py --activation="relu" --n_steps=600 --dataset="mnist"

will launch a training / val / test pipeline with ReLU as the activation function, 600 SGD steps and the MNIST dataset. The other parameters of the run (e.g. the base learning rate and batch size) keep their default values. The jobs automatically create a directory (and potentially subdirectories) for the experiment and save there the Python logs, the TensorBoard events, the results dictionary (as a pickle file), and the network checkpoints.

Visualizing results

To visualize the results after training for a given experiment, one can launch the notebook experiments-results.ipynb located in pytorch/notebooks/training/abc_parameterizations, and simply change the arguments in the "Set variables" cell to load the results from the corresponding experiment. Then running all the cells will produce (and save) some figures related to the training phase (e.g. loss vs. steps).

Owner
Karl Hajjar
PhD student at Laboratoire de Mathématiques d'Orsay