LyaNet: A Lyapunov Framework for Training Neural ODEs

Overview

Provide the model type via --config-name to train and test models configured as those shown in the paper.

Classification Training

The code assumes the project root is the current directory.

Example commands:

python sl_pipeline.py --config-name classical +dataset=MNIST
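
The same pattern applies to the other model and dataset combinations; for example, assuming the config and dataset names listed under "Models Supported" and "Datasets Supported" below map directly onto the flags:

python sl_pipeline.py --config-name lyapunov +dataset=CIFAR10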

TensorBoard logs are saved to run_data/tensorboards and can be viewed by running:

tensorboard --logdir ./run_data/tensorboards --reload_multifile True

Only the model with the best validation error is saved. To quickly verify the test error of this model, run the adversarial robustness script; it prints the nominal test error before performing the attack.
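
Checkpoints are written under run_data/tensorboards, following the run-name/default/version_0/checkpoints layout shown in the commands below; assuming that layout, the saved checkpoints can be listed with, for example:

ls run_data/tensorboards/*/default/version_0/checkpoints/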

Adversarial Robustness

These commands assume the current directory is robustness/. Note that the model file name will differ depending on the dataset and model combination you have run; the paths below illustrate the directory structure where models are stored.

These scripts print the test error, followed by the test error under an adversarial attack. Note that adversarial testing requires significantly more compute.

L2 Adversarial Robustness Experiments

PYTHONPATH=../ python untargeted_robustness.py --config-name classical norm="2" \
+dataset=MNIST \
"+model_file='../run_data/tensorboards/d.MNIST_m.ClassicalModule(RESNET18)_b.128_lr.0.01_wd.0.0001_mepoch120._sd0/default/version_0/checkpoints/epoch=7-step=3375.ckpt'"

L-Infinity Adversarial Robustness Experiments

PYTHONPATH=../ python untargeted_robustness.py --config-name classical \
norm="inf"  +dataset=MNIST \
"+model_file='../run_data/tensorboards/d.MNIST_m.ClassicalModule(RESNET18)_b.128_lr.0.01_wd.0.0001_mepoch120._sd0/default/version_0/checkpoints/epoch=7-step=3375.ckpt'"

Datasets Supported

  • MNIST
  • FashionMNIST
  • CIFAR10
  • CIFAR100

Models Supported

  • anode: Data-controlled dynamics with a ResNet18 component, trained through solution differentiation
  • classical: ResNet18
  • lyapunov: Data-controlled dynamics with a ResNet18 component, trained with LyaNet
  • continuous_net: ContinuousNet from [1], trained through solution differentiation
  • continuous_net_lyapunov: ContinuousNet from [1], trained with LyaNet
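
Any of the model config names above can be paired with any supported dataset, both for training with sl_pipeline.py and for evaluation with untargeted_robustness.py; for example, assuming the names above map directly to the --config-name and +dataset flags:

python sl_pipeline.py --config-name continuous_net_lyapunov +dataset=FashionMNIST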

References

  1. Continuous-in-Depth Neural Networks Code
  2. Learning by Turning: Neural Architecture Aware Optimisation Code
Owner

Ivan Dario Jimenez Rodriguez