[NeurIPS '21] Adversarial Attacks on Graph Classification via Bayesian Optimisation (GRABNEL)

Overview

Adversarial Attacks on Graph Classification via Bayesian Optimisation @ NeurIPS 2021

[Figure: overall pipeline of GRABNEL]

This repository contains the official implementation of GRABNEL, a Bayesian optimisation-based adversarial agent for attacking graph classification models. GRABNEL currently supports various topological attacks, such as edge flipping (including both addition and deletion), node injection and edge swapping; a minimal sketch of a single edge flip is given after the list of victim models below. We also include implementations of a number of baseline methods, including random search, a genetic algorithm [1] and a gradient-based white-box attacker (available for some victim model choices). We also implement a number of victim models, namely:

  • Graph convolution networks (GCN) [2]
  • Graph isomorphism networks (GIN) [3]
  • ChebyGIN [4] (only for MNIST-75sp task)
  • Graph U-Net [5]
  • S2V (only for the ER Graph task in [1])
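To make the perturbation space concrete, the sketch below shows what a single edge flip on a DGL graph could look like. This is an illustrative toy example under the assumption that DGL and PyTorch are installed; the flip_edge helper is hypothetical and is not the repository's own perturbation routine.

```python
# Illustrative sketch only -- not the repository's perturbation code.
import dgl
import torch

def flip_edge(g: dgl.DGLGraph, u: int, v: int) -> dgl.DGLGraph:
    """Return a copy of g with edge (u, v) deleted if present, added otherwise."""
    if g.has_edges_between(u, v):
        return dgl.remove_edges(g, g.edge_ids(u, v))
    return dgl.add_edges(g, torch.tensor([u]), torch.tensor([v]))

# Toy directed 4-cycle.
g = dgl.graph((torch.tensor([0, 1, 2, 3]), torch.tensor([1, 2, 3, 0])))
perturbed = flip_edge(g, 0, 2)                 # edge (0, 2) is absent, so it is added
print(g.num_edges(), perturbed.num_edges())    # 4 5
```

An attacker such as GRABNEL searches over sequences of such perturbations, subject to a budget, with the aim of flipping the victim model's prediction.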

For details please take a look at our paper: abstract / pdf.

The code repository also contains instructions for using the TU datasets [6] via the DGL framework, as well as the MNIST-75sp dataset from [4]. For the Twitter dataset we used for the node injection task, we are not authorised to redistribute the data, so you will have to ask the authors of [7] for permission directly.

If you find our work to be useful for your research, please consider citing us:

Wan, Xingchen, Henry Kenlay, Binxin Ru, Arno Blaas, Michael A. Osborne, and Xiaowen Dong. "Adversarial Attacks on Graph Classifiers via Bayesian Optimisation." In Thirty-Fifth Conference on Neural Information Processing Systems. 2021.

Or in bibtex:

@inproceedings{wan2021adversarial,
  title={Adversarial Attacks on Graph Classifiers via Bayesian Optimisation},
  author={Wan, Xingchen and Kenlay, Henry and Ru, Binxin and Blaas, Arno and Osborne, Michael and Dong, Xiaowen},
  booktitle={Thirty-Fifth Conference on Neural Information Processing Systems},
  year={2021}
}

Instructions for use

  1. Install the required packages in requirements.txt
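For example, with pip (assuming a working Python 3 environment):
pip install -r requirements.txt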

For TU Dataset(s):

  1. Train a selected victim architecture (GCN/GIN). For example, to train a GCN on the PROTEINS dataset, run the command below. By default, DGL downloads the requested dataset under the ~/.dgl directory; if this throws an error, you may have to download the dataset manually and place it in the appropriate directory.
python3 train_model.py --dataset PROTEINS --model gcn --seed $YOUR_SEED 

This by default deposits the trained victim model under src/output/models and the training log under src/output/training_logs.
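If you would like to inspect the data before training, the snippet below loads PROTEINS through DGL's TUDataset class. This is a sketch only; the repository's own training script may use a different loader (e.g. LegacyTUDataset), and the exact node features available depend on the dataset.

```python
# Sketch: peeking at the PROTEINS dataset via DGL (not the repo's own loading code).
from dgl.data import TUDataset

dataset = TUDataset('PROTEINS')      # DGL downloads to ~/.dgl by default
graph, label = dataset[0]
print(f'{len(dataset)} graphs; first graph has {graph.num_nodes()} nodes, '
      f'{graph.num_edges()} edges, class label {label.item()}')
```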

  2. Evaluate the victim model on a separate test set. Run
python3 evaluate_model.py --dataset PROTEINS --seed $YOUR_SEED  --model gcn

This by default will create evaluation logs under src/output/evaluation_logs.
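For reference, a generic test-accuracy loop over a graph classification dataset looks like the sketch below. This is not the repository's evaluate_model.py; the model(batched_graph, node_features) call signature and the 'feat' node-feature key are assumptions that depend on how the victim model was implemented.

```python
# Sketch of a generic evaluation loop for a graph classifier.
# The model signature and the 'feat' feature key are assumptions.
import torch
from dgl.dataloading import GraphDataLoader

@torch.no_grad()
def test_accuracy(model, test_set, batch_size=32):
    loader = GraphDataLoader(test_set, batch_size=batch_size, shuffle=False)
    correct = total = 0
    for batched_graph, labels in loader:
        logits = model(batched_graph, batched_graph.ndata['feat'])
        correct += (logits.argmax(dim=-1) == labels.view(-1)).sum().item()
        total += len(labels)
    return correct / total
```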

  3. Run the attack algorithm.
cd scripts && python3 run_bo_tu.py --dataset PROTEINS --save_path $YOUR_SAVE_PATH --model_path $YOUR_MODEL_PATH --seed $YOUR_SEED --model gcn

With no method specified, the script runs GRABNEL by default. You may use the -m flag to specify a different method if, for example, you'd like to run one of the baseline methods mentioned above instead.
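For instance, a baseline run might look like the command below; note that the value passed to -m here is only a placeholder, and the identifiers actually accepted are defined in run_bo_tu.py's argument parser.
cd scripts && python3 run_bo_tu.py --dataset PROTEINS --save_path $YOUR_SAVE_PATH --model_path $YOUR_MODEL_PATH --seed $YOUR_SEED --model gcn -m random  # "random" is a placeholder method name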

For the MNIST-75sp task:

For MNIST-75sp, we use the pre-trained model released by the authors of [4] as the victim model, so there is no need to train a victim model separately (unless you wish to).

  1. Generate the MNIST-75sp dataset. Here we use a script adapted from [4], with an added converter to ensure that the generated dataset complies with the rest of our code base (DGL-compliant, etc.). You need to download the MNIST dataset beforehand (or use the torchvision download facility; either is fine).
cd data && python3 build_mnist.py -D mnist -d $YOUR_DATA_PATH -o $YOUR_SAVE_PATH  

The output should be a pickle file mnist_75sp.p. Place it under $PROJECT_ROOT/src/data/
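If you use the torchvision download facility mentioned above, the standard call is shown below; the root path is a placeholder and should match the -d argument passed to build_mnist.py.

```python
# Download raw MNIST via torchvision; the root path is a placeholder.
from torchvision.datasets import MNIST

MNIST(root='YOUR_DATA_PATH', train=True, download=True)
MNIST(root='YOUR_DATA_PATH', train=False, download=True)
```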

  2. Download the pretrained model from https://github.com/bknyaz/graph_attention_pool. The pretrained checkpointed model we use is checkpoint_mnist-75sp_139255_epoch30_seed0000111.pth.tar. Deposit the model under src/output/models.

  3. Run the attack algorithm.

cd scripts && python3 run_bo_image_classification.py --dataset mnist

References

[1] Dai, Hanjun, Hui Li, Tian Tian, Xin Huang, Lin Wang, Jun Zhu, and Le Song. "Adversarial attack on graph structured data." In International conference on machine learning, pp. 1115-1124. PMLR, 2018.

[2] Kipf, Thomas N., and Max Welling. "Semi-supervised classification with graph convolutional networks." arXiv preprint arXiv:1609.02907 (2016).

[3] Xu, Keyulu, Weihua Hu, Jure Leskovec, and Stefanie Jegelka. "How powerful are graph neural networks?." arXiv preprint arXiv:1810.00826 (2018).

[4] Knyazev, Boris, Graham W. Taylor, and Mohamed R. Amer. "Understanding attention and generalization in graph neural networks." NeurIPS (2019).

[5] Gao, Hongyang, and Shuiwang Ji. "Graph u-nets." In international conference on machine learning, pp. 2083-2092. PMLR, 2019.

[6] Morris, Christopher, Nils M. Kriege, Franka Bause, Kristian Kersting, Petra Mutzel, and Marion Neumann. "Tudataset: A collection of benchmark datasets for learning with graphs." arXiv preprint arXiv:2007.08663 (2020).

[7] Vosoughi, Soroush, Deb Roy, and Sinan Aral. "The spread of true and false news online." Science 359, no. 6380 (2018): 1146-1151.

Acknowledgements

The repository builds, directly or indirectly, on multiple open-source code bases available online. The authors would like to express their gratitude to the maintainers of the following repositories:

  1. https://github.com/Hanjun-Dai/graph_adversarial_attack
  2. https://github.com/DSE-MSU/DeepRobust
  3. https://github.com/HongyangGao/Graph-U-Nets
  4. https://github.com/xingchenwan/nasbowl
  5. The Deep Graph Library (DGL) team
  6. The GraKeL team (https://ysig.github.io/GraKeL/0.1a8/)