
Learning to Bootstrap for Combating Label Noise

This repo is the official implementation of our paper "Learning to Bootstrap for Combating Label Noise".

Citation

If you use this code for your research, please cite our paper:

@misc{zhou2022learning,
      title={Learning to Bootstrap for Combating Label Noise}, 
      author={Yuyin Zhou and Xianhang Li and Fengze Liu and Xuxi Chen and Lequan Yu and Cihang Xie and Matthew P. Lungren and Lei Xing},
      year={2022},
      eprint={2202.04291},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

Requirements

Python >= 3.6.4
PyTorch >= 1.6.0
higher == 0.2.1
tensorboardX == 2.4.1
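
The Python dependencies can usually be installed with pip (a minimal sketch; package names are assumed to match the PyPI distributions, and you may prefer the official PyTorch installation selector for your CUDA version):

pip install "torch>=1.6.0" higher==0.2.1 tensorboardX==2.4.1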

Training

First, create a folder to store checkpoints with the following command.

mkdir checkpoint
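
Training curves are written with tensorboardX. Assuming the logs end up in the default runs/ directory (the actual log path is not stated in this README), they can be monitored with:

tensorboard --logdir runs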

CIFAR-10

To reproduce the CIFAR results from our paper, follow the commands and hyper-parameters below.

First, adjust corruption_prob and corruption_type to obtain different noise rates and noise types.

Second, the reweight_label flag indicates that you are using our L2B method. You can change it to baseline or mixup to run those variants.

python main.py --arch res18 --dataset cifar10 --num_classes 10 --exp L2B --train_batch_size 512 \
 --corruption_prob 0.2 --reweight_label --lr 0.15 --clipping_norm 0.25 --num_epochs 300 --scheduler cos \
 --corruption_type unif --warm_up 10 --seed 0
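
For example, a hypothetical run with 40% uniform noise only changes corruption_prob; everything else stays as above (this exact setting is an illustration, not a configuration verified from the paper):

python main.py --arch res18 --dataset cifar10 --num_classes 10 --exp L2B --train_batch_size 512 \
 --corruption_prob 0.4 --reweight_label --lr 0.15 --clipping_norm 0.25 --num_epochs 300 --scheduler cos \
 --corruption_type unif --warm_up 10 --seed 0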

CIFAR-100

Most settings are the same as for CIFAR-10. To reproduce the results, use the following command.

python main.py --arch res18 --dataset cifar100 --num_classes 100 --exp L2B --train_batch_size 256 \
--corruption_prob 0.2 --reweight_label --lr 0.15 --clipping_norm 0.80 --num_epochs 300 --scheduler cos \
--corruption_type unif --warm_up 10 --seed 0

ISIC2019

For the ISIC 2019 dataset, first download the data with the following commands.

wget https://isic-challenge-data.s3.amazonaws.com/2019/ISIC_2019_Training_Input.zip
wget https://isic-challenge-data.s3.amazonaws.com/2019/ISIC_2019_Training_GroundTruth.csv
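
After downloading, extract the images so they sit under the path used by the training command below (a sketch, assuming the isic_data/ layout that command expects):

mkdir -p isic_data
unzip ISIC_2019_Training_Input.zip -d isic_data/

This yields isic_data/ISIC_2019_Training_Input, matching the --data_path argument in the command.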

Then reproduce the results with the following command.

python main.py --arch res50 --dataset ISIC --data_path isic_data/ISIC_2019_Training_Input --num_classes 8 \
--exp L2B --train_batch_size 64 --corruption_prob 0.2 --lr 0.01 --clipping_norm 0.80 --num_epochs 30 \
--temperature 10.0 --wd 5e-4 --scheduler cos --reweight_label --norm_type softmax --warm_up 1

Clothing-1M

First, num_batch and train_batch_size determine how many training images are used (we sample a balanced training set for each epoch).

Second, you can adjust num_meta to sample a different number of validation images for the meta set. By default we use the whole validation set as the meta set.

data_path is where you store the data and the key-label lists; also update data_path on line 20 of main.py accordingly. If you have trouble downloading the dataset, please feel free to contact us.

Then reproduce the results with the following command.

python main.py --arch res18_224 --num_batch 250 --dataset clothing1m \
--exp L2B_clothing1m_one_stage_multi_runs --train_batch_size 256 --lr 0.005 \
--num_epochs 300 --reweight_label --wd 5e-4 --scheduler cos --warm_up 0 \
--data_path /data1/data/clothing1m/clothing1M --norm_type org --num_classes 14 \
--multi_runs 3 --num_meta 14313

Contact

Yuyin Zhou

Xianhang Li

If you have any questions about the code or data, please contact us directly.
