DeepStruc is a Conditional Variational Autoencoder which can predict mono-metallic nanoparticle structures from a Pair Distribution Function.

Overview

ChemRxiv | [Paper] XXX

DeepStruc

Welcome to DeepStruc, a Deep Generative Model (DGM) that learns the relation between PDF and atomic structure and thereby solves a structure from a PDF!

  1. DeepStruc
  2. Getting started (with Colab)
  3. Getting started (own computer)
    1. Install requirements
    2. Simulate data
    3. Train model
    4. Predict
  4. Authors
  5. Cite
  6. Acknowledgments
  7. License

We here apply DeepStruc to the structural analysis of a model system of mono-metallic nanoparticles (MMNPs) with seven different structure types and demonstrate the method for both simulated and experimental PDFs. DeepStruc can reconstruct simulated data with an average mean absolute error (MAE) of the atom xyz-coordinates of 0.093 ± 0.058 Å after fitting a contraction/expansion factor, an ADP and a scale parameter. We demonstrate the generative capability of DeepStruc on a dataset of face-centered cubic (fcc), hexagonal close-packed (hcp) and stacking-faulted structures, where DeepStruc recognizes the stacking-faulted structures as an interpolation between fcc and hcp and constructs new structural models based on a PDF. The MAE in this example is 0.030 ± 0.019 Å.

The MMNPs are provided as a graph-based input to the encoder of DeepStruc. We compare DeepStruc with a similar DGM without the graph-based encoder. DeepStruc is able to reconstruct the structures using a smaller latent-space dimension and therefore has a better generative capability. We also compare DeepStruc with a brute-force modelling approach and a tree-based classification algorithm. The ML models are significantly faster than the brute-force approach, and DeepStruc furthermore creates a latent space from which synthetic structures can be sampled, which the tree-based method cannot. The baseline models can be found in other repositories: brute-force, MetalFinder and CVAE.
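To give an idea of what a graph-based input looks like, the sketch below turns a set of atomic xyz-coordinates into a graph: each atom becomes a node and two atoms are connected by an edge if their distance is below a cutoff, with the distance stored as the edge feature. The function name, the cutoff value and the NumPy-only representation are illustrative assumptions, not DeepStruc's actual preprocessing code.

    # Illustrative sketch (hypothetical helper, not DeepStruc's actual code):
    # nodes are atoms, edges connect atoms closer than a cutoff distance.
    import numpy as np

    def structure_to_graph(xyz, cutoff=3.0):
        xyz = np.asarray(xyz, dtype=float)                  # (n_atoms, 3) node features
        dists = np.linalg.norm(xyz[:, None, :] - xyz[None, :, :], axis=-1)
        src, dst = np.nonzero((dists < cutoff) & (dists > 0.0))
        edge_index = np.stack([src, dst])                   # (2, n_edges) connectivity
        edge_attr = dists[src, dst]                         # interatomic distances as edge features
        return xyz, edge_index, edge_attr

    # Four atoms of a tiny fcc-like fragment (coordinates in Å).
    nodes, edges, distances = structure_to_graph(
        [[0.0, 0.0, 0.0], [0.0, 2.0, 2.0], [2.0, 0.0, 2.0], [2.0, 2.0, 0.0]]
    )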

Getting started (with Colab)

Using DeepStruc on your own PDFs is straightforward and does not require installing or downloading anything to your computer. Follow the instructions in our Colab notebook and play around with it.

Getting started (own computer)

Follow these steps if you want to train DeepStruc and predict with it locally on your own computer.

Install requirements

See the install folder.

Simulate data

See the data folder.

Train model

To train your own DeepStruc model, simply run:

python train.py

A list of possible arguments is given in the table below; you can also run with the '--help' argument for additional information. An example invocation is shown after the table.
If you are interested in changing the architecture of the model, go to train.py and change the model_arch dictionary.

Arg | Description | Type | Example
--- | --- | --- | ---
-h or --help | Prints help message. | |
-d or --data_dir | Directory containing graph training, validation and test data. | str | -d ./data/graphs
-s or --save_dir | Directory where models will be saved. This is also used for loading a learner. | str | -s bst_model
-r or --resume_model | If 'True' the save_dir model is loaded and training is continued. | bool | -r True
-e or --epochs | Number of maximum epochs. | int | -e 100
-b or --batch_size | Number of graphs in each batch. | int | -b 20
-l or --learning_rate | Learning rate. | float | -l 1e-4
-B or --beta | Initial beta value for scaling KLD. | float | -B 0.1
-i or --beta_increase | Increments of beta when the threshold is met. | float | -i 0.1
-x or --beta_max | Highest value beta can increase to. | float | -x 5
-t or --reconstruction_th | Reconstruction threshold required before beta is increased. | float | -t 0.001
-n or --num_files | Total number of files loaded. Files will be split 60/20/20. If 'None' then all files are loaded. | int | -n 500
-c or --compute | Train model on CPU or GPU. Choices: 'cpu', 'gpu16', 'gpu32' and 'gpu64'. | str | -c gpu32
-L or --latent_dim | Number of latent space dimensions. | int | -L 3
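For example, a training run that only uses flags documented in the table above could look like this (paths and values are illustrative):

python train.py --data_dir ./data/graphs --save_dir bst_model --epochs 100 --batch_size 20 --learning_rate 1e-4 --latent_dim 3 --compute gpu32

The four beta-related flags describe a warm-up schedule for the KLD term of the loss. Below is a minimal sketch of how they plausibly interact, assuming beta is only raised once the reconstruction loss is good enough; this is an assumed reading of the flag descriptions, not code taken from train.py.

    # Assumed KLD-weight (beta) schedule implied by -B, -i, -x and -t.
    beta, beta_increase, beta_max, reconstruction_th = 0.1, 0.1, 5.0, 0.001

    def update_beta(beta, reconstruction_loss):
        # Raise beta in small increments once the reconstruction loss is below
        # the threshold, and never let it exceed beta_max.
        if reconstruction_loss < reconstruction_th:
            beta = min(beta + beta_increase, beta_max)
        return beta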

Predict

To predict an MMNP from a PDF using DeepStruc or your own model:

python predict.py

A list of possible arguments is given in the table below; you can also run with the '--help' argument for additional information. An example invocation is shown after the table.

Arg | Description | Type | Example
--- | --- | --- | ---
-h or --help | Prints help message. | |
-d or --data | Path to data or data directory. If pointing to a data directory, all datasets must have the same format. | str | -d data/experimental_PDFs/JQ_S1.gr
-m or --model | Path to model. If 'None' a GUI will open. | str | -m ./models/DeepStruc
-n or --num_samples | Number of samples/structures generated for each unique PDF. | int | -n 10
-s or --sigma | Sample up to '-s' sigma in the normal distribution. | float | -s 7
-p or --plot_sampling | Plots sampled structures on top of the DeepStruc training data. Model must be DeepStruc. | bool | -p True
-g or --save_path | Path to directory where predictions will be saved. | str | -g ./best_preds
-i or --index_plot | Highlights a specific reconstruction in the latent space. '--data' must be a specific file (not a directory) and '--plot True'. | int | -i 4
-P or --plot_data | If 'True' the first loaded PDF is plotted and shown after normalization. | bool | -P True
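For example, to generate ten candidate structures from an experimental PDF with the pretrained DeepStruc model (paths and values are illustrative and taken from the example column above; adjust them to your own files):

python predict.py --data data/experimental_PDFs/JQ_S1.gr --model ./models/DeepStruc --num_samples 10 --sigma 3 --save_path ./best_preds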

Authors

Andy S. Anker (1)
Emil T. S. Kjær (1)
Marcus N. Weng (1)
Simon J. L. Billinge (2,3)
Raghavendra Selvan (4,5)
Kirsten M. Ø. Jensen (1)

(1) Department of Chemistry and Nano-Science Center, University of Copenhagen, 2100 Copenhagen Ø, Denmark.
(2) Department of Applied Physics and Applied Mathematics, Columbia University, New York, NY 10027, USA.
(3) Condensed Matter Physics and Materials Science Department, Brookhaven National Laboratory, Upton, NY 11973, USA.
(4) Department of Computer Science, University of Copenhagen, 2100 Copenhagen Ø, Denmark.
(5) Department of Neuroscience, University of Copenhagen, 2200 Copenhagen N, Denmark.

Should you have any questions, suggested improvements or bug reports, please contact us on GitHub or through email: [email protected] or [email protected].

Cite

If you use our code or our results, please consider citing our papers. Thanks in advance!

@article{kjær2022DeepStruc,
title={DeepStruc: Towards structure solution from pair distribution function data using deep generative models},
author={Kjær, Emil T. S. and Anker, Andy S. and Weng, Marcus N. and Billinge, Simon J. L. and Selvan, Raghavendra and Jensen, Kirsten M. Ø.},
year={2022}}
@article{anker2020characterising,
title={Characterising the atomic structure of mono-metallic nanoparticles from x-ray scattering data using conditional generative models},
author={Anker, Andy Sode and Kjær, Emil TS and Dam, Erik B and Billinge, Simon JL and Jensen, Kirsten MØ and Selvan, Raghavendra},
year={2020}}

Acknowledgments

Our code is developed based on the following publication:

@article{anker2020characterising,
title={Characterising the atomic structure of mono-metallic nanoparticles from x-ray scattering data using conditional generative models},
author={Anker, Andy Sode and Kjær, Emil TS and Dam, Erik B and Billinge, Simon JL and Jensen, Kirsten MØ and Selvan, Raghavendra},
year={2020}}

License

This project is licensed under the Apache License Version 2.0, January 2004 - see the LICENSE file for details.
