I-SECRET: Importance-guided fundus image enhancement via semi-supervised contrastive constraining


I-SECRET

This is the implementation of the MICCAI 2021 paper "I-SECRET: Importance-guided fundus image enhancement via semi-supervised contrastive constraining".

Data preparation

  1. First, download the EyeQ dataset from EyeQ.
  2. Split the dataset into train/val/test according to the EyePACS challenge.
  3. Run
python tools/degrade_eyeq.py --degrade_dir ${DATA_PATH} --output_dir ${OUTPUT_PATH} --mask_dir ${MASK_PATH} --gt_dir ${GT_PATH}

Note that this script should also be applied to the usable subset for the cropping pre-processing.

  4. Organize the EyeQ directory as follows:
.
├── train
│   ├── crop_good
│   ├── degrade_good
│   └── crop_usable
├── val
│   ├── crop_good
│   ├── degrade_good
│   └── crop_usable
└── test
    ├── crop_good
    ├── degrade_good
    └── crop_usable

Here, crop_good is the ${GT_PATH} from step 3, and degrade_good is the ${OUTPUT_PATH} from step 3.
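For example, assuming the raw good-quality training images live in ./EyeQ_raw/train/good and the masks in ./EyeQ_raw/train/mask (both hypothetical paths), step 3 can be invoked so that its outputs land directly in the layout above:

python tools/degrade_eyeq.py --degrade_dir ./EyeQ_raw/train/good --output_dir ./EyeQ/train/degrade_good --mask_dir ./EyeQ_raw/train/mask --gt_dir ./EyeQ/train/crop_good

Repeat the call for the val and test splits and, as noted above, for the usable subset, whose cropped output goes to crop_usable.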

Package installation

Run

pip install -r requirements.txt

Run pipeline

Run the baseline model

python main.py --model i-secret --lambda_rec 1 --lambda_gan 1 --data_root_dir ${DATA_DIR} --gpu ${GPU_INDEXES} --batch_size ${BATCH_SIZE} --name baseline --experiment_root_dir ${LOG_DIR}

Run the model with IS-loss

python main.py --model i-secret --lambda_is 1 --lambda_gan 1 --data_root_dir ${DATA_DIR} --gpu ${GPU_INDEXES} --batch_size ${BATCH_SIZE} --name is_loss --experiment_root_dir ${LOG_DIR}
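For reference, the IS-loss weights the per-pixel reconstruction error by an importance map that the network predicts alongside the enhanced image, in the style of aleatoric-uncertainty weighting. Below is a minimal PyTorch sketch under that assumption; the function and tensor names are illustrative, not the repository's API, and the exact formulation is given in the paper and the repository code.

import torch

def importance_weighted_loss(pred, target, log_importance):
    # Sketch of an importance-guided supervised (IS) loss.
    # pred, target:   enhanced and ground-truth images, shape (B, C, H, W)
    # log_importance: per-pixel map predicted by the network, shape (B, 1, H, W)
    per_pixel = (pred - target) ** 2                   # squared reconstruction error
    weighted = torch.exp(-log_importance) * per_pixel  # down-weight unreliable pixels
    regularizer = log_importance                       # keep the map from marking everything unimportant
    return (weighted + regularizer).mean()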

Run the I-SECRET model

python main.py --model i-secret --lambda_is 1 --lambda_icc 1 --lambda_gan 1 --data_root_dir ${DATA_DIR} --gpu ${GPU_INDEXES} --batch_size ${BATCH_SIZE} --name i-secret --experiment_root_dir ${LOG_DIR}
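The ICC term builds on the patch NCE loss acknowledged below: features of corresponding patches from the degraded input and the enhanced output are pulled together while the other patches act as negatives, and each patch's contribution is weighted by its importance. A rough PyTorch sketch under these assumptions, with illustrative names only; see the paper for the actual constraint.

import torch
import torch.nn.functional as F

def importance_weighted_patch_nce(feat_src, feat_out, importance, tau=0.07):
    # Sketch of an importance-guided contrastive constraint (ICC).
    # feat_src, feat_out: encoder features of matching patches, shape (N, D)
    # importance:         per-patch weights, shape (N,)
    feat_src = F.normalize(feat_src, dim=1)
    feat_out = F.normalize(feat_out, dim=1)
    logits = feat_out @ feat_src.t() / tau                        # similarity of each output patch to every input patch
    labels = torch.arange(logits.size(0), device=logits.device)   # positive = the spatially corresponding patch
    per_patch = F.cross_entropy(logits, labels, reduction='none')
    return (importance * per_patch).mean()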

Visualization

Go to ${LOG_DIR}/${EXPERIMENT_NAME}/checkpoint and run

tensorboard --logdir ./ --port ${PORT}

then go to localhost:${PORT} for detailed logging and visualization.

Test and evaluation

Run

python main.py --test --resume 0 --test_dir ${INPUT_PATH} --output_dir ${OUTPUT_PATH} --name ${EXPERIMENT_NAME} --gpu ${GPU_INDEXES} --batch_size ${BATCH_SIZE}

Please note that the metrics output by the test script are computed under the PyTorch pre-processing (resizing, etc.), so they are not precise. Therefore, run the evaluation script for the final evaluation:

python tools/evaluate.py --test_dir ${OUTPUT_PATH} --gt_dir ${GT_PATH}
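For intuition, a full-resolution evaluation in the spirit of tools/evaluate.py would compare each enhanced image with its ground truth at the original size, e.g. via PSNR and SSIM. The sketch below uses scikit-image and assumes matching filenames in the two folders; the actual script may differ in metrics and preprocessing.

import os
import numpy as np
from skimage.io import imread
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def evaluate_folder(test_dir, gt_dir):
    # Average PSNR/SSIM between enhanced images and their ground truths.
    psnrs, ssims = [], []
    for name in sorted(os.listdir(test_dir)):
        pred = imread(os.path.join(test_dir, name))
        gt = imread(os.path.join(gt_dir, name))  # assumes the same filename in gt_dir
        psnrs.append(peak_signal_noise_ratio(gt, pred, data_range=255))
        ssims.append(structural_similarity(gt, pred, channel_axis=-1, data_range=255))
    print(f"PSNR: {np.mean(psnrs):.2f}  SSIM: {np.mean(ssims):.4f}")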

Vessel segmentation

We apply the IterNet framework and simply replace its test set with the degraded/enhanced images. For more details, please refer to IterNet.

Future Plan

  • Clean up the code
  • More SOTA backbones (ResNeSt, ...)
  • WGAN loss
  • Internal evaluations for down-sampling tasks

Acknowledgment

Thanks to CutGAN for the implementation of the patch NCE loss, to EyeQ_Enhancement for the degradation code, and to SlowFast for the distributed training code.
