Node-level Graph Regression with Deep Gaussian Process Models

Overview

This repository provides the implementation of the DGPG model for node-level graph regression with deep Gaussian processes, together with the code and notebooks for all experiments.

Prerequisites

Our implementation is mainly based on TensorFlow 1.x and GPflow 1.x:

Python 3.x (3.7 tested)

```
conda install tensorflow-gpu==1.15
pip install keras==2.3.1
pip install gpflow==1.5
pip install gpuinfo
```

In addition, some basic packages such as NumPy are required. Porting the code to TensorFlow 2.x and GPflow 2 should be straightforward, but it has not been tested yet.
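A quick way to verify the environment before running the notebooks is a version check. This is a minimal sketch; the expected versions are simply the ones listed above:

```python
# Sanity check for the environment (a sketch; adjust to your setup).
import tensorflow as tf
import gpflow
import keras
import numpy as np

# The code is developed against TF 1.15, Keras 2.3.1 and GPflow 1.5;
# TF 2.x / GPflow 2 would need some wrapping and are untested.
print("tensorflow:", tf.__version__)   # expected 1.15.x
print("gpflow:", gpflow.__version__)   # expected 1.5.x
print("keras:", keras.__version__)     # expected 2.3.1
print("numpy:", np.__version__)
print("GPU available:", tf.test.is_gpu_available())
```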

Specification

Both the source code and the experiment results are provided. Unzip the two archive files before using the experiment notebooks.
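For example, a minimal sketch for unpacking both archives with the Python standard library (extracting into the current directory is an assumption; any unzip tool works equally well):

```python
# Unpack the two archives shipped with the repository.
# Assumes experiments.zip and results.zip sit in the repository root.
import zipfile

for archive in ("experiments.zip", "results.zip"):
    with zipfile.ZipFile(archive) as zf:
        zf.extractall()  # extracts into the current directory
    print("extracted", archive)
```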

Files

  • dgp_graph/: core code of the DGPG model.
    • impl_parallel.py: a fast implementation that parallelizes node-level computation; used by all experiments.
    • my_op.py: custom TensorFlow operations used in the implementation.
    • impl.py: a basic loop-based implementation; easy to understand but not practical, kept only for calibration.
  • data/: datasets.
  • doubly_stochastic_dgp/: code from the DGP repository.
  • compatible/: code that makes the DGP source compatible with GPflow 1.5.
  • gpflow_monitor/: monitoring tool for GPflow models, taken from this repo.
  • GRN inference: code and data for the GRN inference experiment.
  • demo_city45.ipynb: Jupyter notebook for the city45 dataset experiment.
  • experiments.zip: Jupyter notebooks for the other experiments.
  • results.zip: the original Jupyter notebook results, exported as HTML files for archiving.
  • run_toy.sh: shell script to run the additional experiment.
  • toy_main.py: code for the additional experiment (traditional ML methods and DGPG with a linear kernel).
  • ER-0.1.ipynb: example script for analyzing time-varying graph structures.

Experiments

The experiments are implemented as Python source files and demonstrated with Jupyter notebooks. The source of each experiment is provided in experiments.zip, and the corresponding result is exported as a static HTML file stored in results.zip. They are organized by dataset name:
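If you prefer to re-run a notebook headlessly rather than interactively, the sketch below uses nbconvert's Python API; the notebook name, kernel name, and output path are assumptions for illustration:

```python
# Execute one experiment notebook non-interactively and save the executed copy.
# Assumes jupyter/nbconvert are installed in the same environment.
import nbformat
from nbconvert.preprocessors import ExecutePreprocessor

nb = nbformat.read("demo_city45.ipynb", as_version=4)       # any notebook listed below
ep = ExecutePreprocessor(timeout=None, kernel_name="python3")
ep.preprocess(nb, {"metadata": {"path": "."}})               # run cells in the repo root

with open("demo_city45_executed.ipynb", "w", encoding="utf-8") as f:
    nbformat.write(nb, f)
```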

  1. Synthetic Datasets

For theoretical analysis.

  • demo_toy_run1.ipynb
  • demo_toy_run2.ipynb
  • demo_toy_run3.ipynb
  • demo_toy_run4.ipynb
  • demo_toy_run5.ipynb

For graph signal analysis on time-varying graphs.

  • ER-0.05.ipynb
  • ER-0.2.ipynb
  • RWP-0.1.ipynb
  • RWP-0.2.ipynb
  • RWP-0.3.ipynb

  2. Small Datasets
  • demo_city45.ipynb
  • demo_city45_linear.ipynb (linear kernel)
  • demo_city45_baseline.ipynb (traditional regression methods)
  • demo_etex.ipynb
  • demo_etex_linear.ipynb
  • demo_etex_baseline.ipynb
  • demo_fmri.ipynb
  • demo_fmri_linear.ipynb
  • demo_fmri_baseline.ipynb
  3. Large Datasets (traffic flow prediction)
  • LA
    • demo_la_15min.ipynb
    • demo_la_30min.ipynb
    • demo_la_60min.ipynb
  • BAY
    • demo_bay_15min.ipynb
    • demo_bay_30min.ipynb
    • demo_bay_60min.ipynb