Generic template to bootstrap your PyTorch project with PyTorch Lightning, Hydra, W&B, and DVC.

Overview

NN Template

PyTorch Lightning · Conf: hydra · Logging: wandb · Code style: black

Generic template to bootstrap your PyTorch project. Click on Use this Template and avoid writing boilerplate code for:

  • PyTorch Lightning, lightweight PyTorch wrapper for high-performance AI research.
  • Hydra, a framework for elegantly configuring complex applications.
  • DVC, track large files, directories, or ML models. Think "Git for data".
  • Weights and Biases, organize and analyze machine learning experiments. (educational account available)

nn-template is opinionated so you don't have to be. If you use this template, please add a badge to your README.

Usage Examples

Check out the mwe branch to view a minimum working example on MNIST.

Structure

.
├── conf                # Hydra compositional config
│   ├── default.yaml    # current experiment configuration
│   ├── data
│   ├── hydra
│   ├── logging
│   ├── model
│   ├── optim
│   └── train
├── data                # datasets
├── experiments         # local logs
├── README.md
├── requirements.txt    # basic requirements
└── src
    ├── common          # common Python modules
    ├── pl_data         # PyTorch Lightning datamodules and datasets
    ├── pl_modules      # PyTorch Lightning modules
    └── run.py          # entry point to run current conf

Data Version Control

DVC runs alongside git and uses the current commit hash to version control the data.

Initialize the dvc repository:

$ dvc init

To start tracking a file or directory, use dvc add:

$ dvc add data/ImageNet

DVC stores information about the added file (or a directory) in a special .dvc file named data/ImageNet.dvc, a small text file with a human-readable format. This file can be easily versioned like source code with Git, as a placeholder for the original data (which gets listed in .gitignore):

$ git add data/ImageNet.dvc data/.gitignore
$ git commit -m "Add raw data"

Making changes

When you make a change to a file or directory, run dvc add again to track the latest version:

$ dvc add data/ImageNet

Switching between versions

The regular workflow is to use git checkout first to switch branches or check out a commit (or a revision of a .dvc file), and then run dvc checkout to sync the data:

$ git checkout <...>
$ dvc checkout

Read more in the docs!

Weights and Biases

Weights & Biases helps you keep track of your machine learning projects. Use tools to log hyperparameters and output metrics from your runs, then visualize and compare results and quickly share findings with your colleagues.

This is an example of a simple dashboard.

Quickstart

Log in to your wandb account by running wandb login once. Configure the logging in conf/logging/*.


Read more in the docs. Particularly useful is the log method, accessible from inside a PyTorch Lightning module through self.logger.experiment.log.

W&B is our logger of choice, but that is a purely subjective decision. Since we are using Lightning, you can replace wandb with the logger you prefer (you can even build your own). More about Lightning loggers here.
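
For instance, a metric can be logged both through Lightning's self.log (routed to whatever logger is configured) and through the raw wandb run exposed by self.logger.experiment. The snippet below is only a minimal sketch with a hypothetical module and layer, not code from the template:

import pytorch_lightning as pl
import torch
import torch.nn.functional as F

class LitClassifier(pl.LightningModule):  # hypothetical module, not part of the template
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Linear(28 * 28, 10)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.cross_entropy(self.net(x.flatten(1)), y)
        # Standard Lightning logging: picked up by the configured logger (wandb here).
        self.log("train/loss", loss)
        # Direct access to the underlying wandb run, e.g. for custom or rich media logging.
        self.logger.experiment.log({"train/loss_raw": loss.item()})
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)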

Hydra

Hydra is an open-source Python framework that simplifies the development of research and other complex applications. The key feature is the ability to dynamically create a hierarchical configuration by composition and override it through config files and the command line. The name Hydra comes from its ability to run multiple similar jobs - much like a Hydra with multiple heads.

The basic functionality is intuitive: it is enough to change the configuration files in conf/* according to your preferences. Everything will be logged to wandb automatically.

Consider creating new root configurations conf/myawesomeexp.yaml instead of always using the default conf/default.yaml.
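
As a rough sketch of how the pieces fit together (the template's actual src/run.py is more elaborate), a Hydra entry point composes the configuration tree under conf/ and receives it as a DictConfig:

import hydra
import omegaconf

@hydra.main(config_path="../conf", config_name="default")
def main(cfg: omegaconf.DictConfig) -> None:
    # cfg is the fully composed configuration (data, model, optim, train, logging, ...).
    print(omegaconf.OmegaConf.to_yaml(cfg))

if __name__ == "__main__":
    main()

Assuming such an entry point, an alternative root configuration like conf/myawesomeexp.yaml can be selected at launch time with Hydra's --config-name flag, e.g. PYTHONPATH=. python src/run.py --config-name myawesomeexp.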

Sweeps

You can easily perform hyperparameter sweeps, which override the configuration defined in conf/*.

The easiest one is the grid search: it executes the code with every possible combination of the specified hyperparameters:

PYTHONPATH=. python src/run.py -m optim.optimizer.lr=0.02,0.002,0.0002 optim.lr_scheduler.T_mult=1,2 optim.optimizer.weight_decay=0,1e-5

You can explore aggregate statistics or compare and analyze each run in the W&B dashboard.


We recommend going through at least the Basic Tutorial, and the docs about Instantiating objects with Hydra.
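
In short, a config node with a _target_ key can be turned into a live object through hydra.utils.instantiate. The fragment below is hypothetical and only illustrates the mechanism, mirroring what a conf/optim/*.yaml entry might contain:

import torch
from hydra.utils import instantiate
from omegaconf import OmegaConf

# Hypothetical fragment, standing in for a conf/optim/*.yaml entry.
cfg = OmegaConf.create(
    {"optimizer": {"_target_": "torch.optim.Adam", "lr": 0.001, "weight_decay": 1e-5}}
)

model = torch.nn.Linear(8, 2)
# instantiate() builds the object named by _target_; extra keyword arguments
# (here the parameters to optimize) are forwarded to its constructor.
optimizer = instantiate(cfg.optimizer, params=model.parameters())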

PyTorch Lightning

Lightning makes coding complex networks simple. It is not a high-level framework like Keras, but it enforces a neat code organization and encapsulation.

You should be somewhat familiar with PyTorch and PyTorch Lightning before using this template.
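
To give a flavour of the encapsulation Lightning enforces (and of what the pl_data package is meant to contain), here is a hypothetical, self-contained datamodule; it is only a sketch and does not mirror the template's actual classes:

import pytorch_lightning as pl
import torch
from torch.utils.data import DataLoader, TensorDataset

class RandomDataModule(pl.LightningDataModule):
    """Hypothetical datamodule: dataset creation and loaders live in one place."""

    def __init__(self, batch_size: int = 32):
        super().__init__()
        self.batch_size = batch_size

    def setup(self, stage=None):
        # A toy in-memory dataset standing in for a real one (e.g. stored under data/).
        self.train_set = TensorDataset(torch.randn(256, 28 * 28), torch.randint(0, 10, (256,)))

    def train_dataloader(self):
        return DataLoader(self.train_set, batch_size=self.batch_size, shuffle=True)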

Environment Variables

System-specific variables (e.g. absolute paths to datasets) should not be under version control; otherwise, there will be conflicts between different users.

The best way to handle system-specific variables is through environment variables.

You can define new environment variables in a .env file in the project root. A copy of this file (e.g. .env.template) can be under version control to ease new project configurations.

To define a new variable, write inside .env:

export MY_VAR=/home/user/my_system_path

You can dynamically resolve the variable name from Python code with:

get_env('MY_VAR')

and in the Hydra .yaml configuration files with:

${env:MY_VAR}
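
The get_env helper ships with the template; as a rough idea of what such a helper can look like, the sketch below (an assumption, not the template's actual code) loads the .env file with python-dotenv and fails loudly when a variable is missing:

import os
from dotenv import load_dotenv  # assumes python-dotenv is installed

# Read the variables declared in the project-root .env file into the process environment.
load_dotenv()

def get_env(name: str) -> str:
    """Hypothetical helper: return an environment variable, failing loudly if it is missing."""
    value = os.environ.get(name)
    if value is None:
        raise KeyError(f"Environment variable '{name}' is not defined; check your .env file.")
    return value

dataset_root = get_env("MY_VAR")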

Comments
  • Curious if you checked out DAGsHub

    Hi @lucmos, this looks like an awesome repo. I stumbled on it while doing some research on project templates for ML projects. I'm one of the creators of DAGsHub which is a platform built on Git, DVC, and MLflow. It integrates with GitHub and provides a free DVC remote and MLflow server so that you can track experiments and share your data & models in one UI.

    Here's an example project to showcase the abilities: https://dagshub.com/OperationSavta/SavtaDepth

    It seems really in line with what you're creating here, and I would love to hear your thoughts about it.

    opened by deanp70 4
  • Streamlit UI - Weights and Biases login

    The template is really awesome.

    I had a small issue: when I run the Streamlit UI without being logged in to Weights and Biases, the UI just hangs on the loading status without giving me any feedback about what is happening, so I have to log in to wandb first, manually from the console. Is there any way to solve this issue, for example by having the UI give feedback when I'm not logged in?

    Thanks!

    opened by andreim14 1
  • load_model

    Hi, this issue concerns the function from nn_core.serialization import load_model. Suppose I train a PyTorch model with class MyLightningModule, and that I saved the checkpoint in model_path. Suppose now that the class MyLightningModule has received some minor changes, e.g. a new class variable has been added; let's call this version MyLightningModuleV2. When I load a model using this function, like:

    self.model = load_model(module_class=MyLightningModuleOld, checkpoint_path=Path(model_path), map_location=self.device).to(self.device).eval()

    I get an error because the checkpoint refers to the model of class MyLightningModule and therefore the new variable is (obviously) missing. To make it work, I need to load the model with the old version of the class, that is, MyLightningModule, and then manually set "model.new_variable" to the value I want, like the following:

    self.model = load_model(module_class=MyLightningModuleOld, checkpoint_path=Path(model_path), map_location=self.device).to(self.device).eval()
    self.model.new_variable = False
    

    It would be nice to have this option in the load_model function to avoid creating multiple versions of the same class.

    opened by framolfese 0
  • PyTorch Lightning EcoCI integration to check for compatibility with latest & upcoming releases

    Hey Valentino & Luca,

    I am just catching up with some bookmarks and remembered your repo here :). As someone who constantly fusses about the ideal project structure, that's actually pretty cool. I have been using an adapted version of the data science cookiecutter for generic ML projects, but nothing as sophisticated as this, with code stubs.

    Haven't thoroughly played with it yet, though, besides creating an example folder and looking at the pl_module.py and datamodule.py files, which look good to me!

    In any case, long story short, I was wondering if you'd be interested in PyTorch Lightning's ecosystem CI to make sure that the template stays fresh and relevant with respect to upcoming version releases (it comes with free CPU and multi-GPU CI tests): https://devblog.pytorchlightning.ai/stay-ahead-of-breaking-changes-with-the-new-lightning-ecosystem-ci-b7e1cf78a6c7

    If you are interested in that, I am sure my colleague @Borda would be happy to assist with questions & technical details -- he built this thing, so he probably knows best :)

    opened by rasbt 4
Releases (0.2.3)
  • 0.2.3 (Dec 15, 2022)

    What's Changed

    • Bump dependency versions by @lucmos in https://github.com/grok-ai/nn-template/pull/79
    • Version 0.2.3 by @lucmos in https://github.com/grok-ai/nn-template/pull/80

    Full Changelog: https://github.com/grok-ai/nn-template/compare/0.2.2...0.2.3

  • 0.2.2 (Jun 13, 2022)

    What's Changed

    • Update README.md by @Flegyas in https://github.com/grok-ai/nn-template/pull/70
    • Improve documentation by @Flegyas in https://github.com/grok-ai/nn-template/pull/71
    • Update documentation by @Flegyas in https://github.com/grok-ai/nn-template/pull/72
    • Add asciinema gif in the README and docs by @lucmos in https://github.com/grok-ai/nn-template/pull/74
    • Add papers by @lucmos in https://github.com/grok-ai/nn-template/pull/76
    • Update precommits versions by @lucmos in https://github.com/grok-ai/nn-template/pull/75
    • Version 0.2.2 by @lucmos in https://github.com/grok-ai/nn-template/pull/77

    Full Changelog: https://github.com/grok-ai/nn-template/compare/0.2.1...0.2.2

  • 0.2.1 (Mar 1, 2022)

    Changelog for nn-template 0.2.1 (2022-03-01)

    What's Changed

    • Fix status badge in the documentation by @lucmos in https://github.com/grok-ai/nn-template/pull/64
    • Minor fixes post release by @lucmos in https://github.com/grok-ai/nn-template/pull/65
    • Fix typos in the documentation by @mikcnt in https://github.com/grok-ai/nn-template/pull/67
    • Fix broken relative links due to mike root folder by @lucmos in https://github.com/grok-ai/nn-template/pull/68
    • Version 0.2.1 by @lucmos in https://github.com/grok-ai/nn-template/pull/69

    New Contributors

    • @mikcnt made their first contribution in https://github.com/grok-ai/nn-template/pull/67

    Full Changelog: https://github.com/grok-ai/nn-template/compare/0.2.0...0.2.1

  • 0.2.0 (Mar 1, 2022)

    We are very pleased to present NN Template 0.2.0!

    Changelog for nn-template 0.2.0 (2022-03-01)

    Summary

    • Cookiecutter parametrization
    • CI/CD Integration via GitHub Actions
    • Automate testing of your projects
    • Logic decoupling thanks to nn-template-core
    • Advanced restore options for trainings
    • Documentation website
    • Support for Python logging (with colors!)

    What's Changed

    • Refactor configuration by @lucmos in https://github.com/grok-ai/nn-template/pull/8
    • Refactor project to a python package by @lucmos in https://github.com/grok-ai/nn-template/pull/10
    • Add tooling configuration by @lucmos in https://github.com/grok-ai/nn-template/pull/9
    • Refactor codebase to be compliant to the pre-commits by @lucmos in https://github.com/grok-ai/nn-template/pull/11
    • Refactor the project root management by @lucmos in https://github.com/grok-ai/nn-template/pull/12
    • Added wandb to .gitignore by @Flegyas in https://github.com/grok-ai/nn-template/pull/14
    • Refactor logging by @lucmos in https://github.com/grok-ai/nn-template/pull/15
    • Enable pin-memory if not on CPU by @lucmos in https://github.com/grok-ai/nn-template/pull/16
    • Factor our PyTorch Module from the Lightning Module by @lucmos in https://github.com/grok-ai/nn-template/pull/17
    • Force the .cache folder to be in the PROJECT_ROOT by @lucmos in https://github.com/grok-ai/nn-template/pull/19
    • Add the configuration to the Lightning checkpoints by @lucmos in https://github.com/grok-ai/nn-template/pull/20
    • Use extend-ignore instead of ignore in .flake8 by @lucmos in https://github.com/grok-ai/nn-template/pull/21
    • Fix formatting by @lucmos in https://github.com/grok-ai/nn-template/pull/22
    • Log the code used in the current experiment to wandb by @lucmos in https://github.com/grok-ai/nn-template/pull/18
    • Functionalities decoupling via external library (nn-core). by @Flegyas in https://github.com/grok-ai/nn-template/pull/23
    • Add tests by @lucmos in https://github.com/grok-ai/nn-template/pull/24
    • Implement resuming behaviour by @lucmos in https://github.com/grok-ai/nn-template/pull/25
    • Refactor NNLogger usages by @lucmos in https://github.com/grok-ai/nn-template/pull/27
    • Add CI on pre-commits and tests by @lucmos in https://github.com/grok-ai/nn-template/pull/26
    • Remove some trigger from the Test Suite workflow by @lucmos in https://github.com/grok-ai/nn-template/pull/28
    • Overwrite Lightning logging configuration by @lucmos in https://github.com/grok-ai/nn-template/pull/29
    • Ensure tags are defined asking interactively for them by @lucmos in https://github.com/grok-ai/nn-template/pull/30
    • Introduce the seed index concept by @lucmos in https://github.com/grok-ai/nn-template/pull/31
    • Force execution of __init__.py on direct execution by @lucmos in https://github.com/grok-ai/nn-template/pull/33
    • Move functions from template to core by @lucmos in https://github.com/grok-ai/nn-template/pull/34
    • Add functionality to upload the run files in the storage to wandb by @lucmos in https://github.com/grok-ai/nn-template/pull/35
    • Move ui_utils entirely to nn-core by @lucmos in https://github.com/grok-ai/nn-template/pull/36
    • Add dynamic parametrized badges for the Test Suite and docs by @lucmos in https://github.com/grok-ai/nn-template/pull/45
    • Fix files hashing in workflow cache keys by @lucmos in https://github.com/grok-ai/nn-template/pull/46
    • Add seed_index determinism test by @lucmos in https://github.com/grok-ai/nn-template/pull/44
    • Refactor references to organization name into grok-ai by @lucmos in https://github.com/grok-ai/nn-template/pull/48
    • Push the default version in mike on release by @lucmos in https://github.com/grok-ai/nn-template/pull/49
    • Improve docs status badge to monitor the github-pages environment by @lucmos in https://github.com/grok-ai/nn-template/pull/50
    • Fix mike rebasing and pushing logic on release by @lucmos in https://github.com/grok-ai/nn-template/pull/51
    • Add a DAG in the post hook interactive setup by @lucmos in https://github.com/grok-ai/nn-template/pull/47
    • Skip test if no dataset is provided by @Flegyas in https://github.com/grok-ai/nn-template/pull/52
    • Fix remote parametrization in the README by @lucmos in https://github.com/grok-ai/nn-template/pull/53
    • Fix type hint in dataset.py by @lucmos in https://github.com/grok-ai/nn-template/pull/55
    • Improve the "add git remote" message in the post hook by @lucmos in https://github.com/grok-ai/nn-template/pull/54
    • Update nn-template-core dependency to 0.0.7 by @lucmos in https://github.com/grok-ai/nn-template/pull/56
    • Update docs by @lucmos in https://github.com/grok-ai/nn-template/pull/57
    • Add custom collate function by @Flegyas in https://github.com/grok-ai/nn-template/pull/58
    • Set metadata as a cached property in DataModule by @Flegyas in https://github.com/grok-ai/nn-template/pull/59
    • Pass run tags to the WandbLogger by @Flegyas in https://github.com/grok-ai/nn-template/pull/60
    • Feature/bump core by @Flegyas in https://github.com/grok-ai/nn-template/pull/61
    • Version 0.2.0 by @Flegyas in https://github.com/grok-ai/nn-template/pull/62

    Full Changelog: https://github.com/grok-ai/nn-template/compare/0.1.0...0.2.0

Owner
Luca Moschella
PhD student at University of Rome La Sapienza in Computer Science.