An atmospheric growth and evolution model based on the EVo degassing model and FastChem 2.0


EVolve

Linking planetary mantles to atmospheric chemistry through volcanism using EVo and FastChem.

Overview

EVolve is a linked mantle degassing and atmospheric growth code, which models the growth of a rocky planet's secondary atmosphere under the influence of volcanism.

Installation

EVolve is written in Python 3 and is incompatible with Python 2.7. Two useful tools for setting up Python environments are:
Pip - the package installer for Python
Anaconda - a virtual environment manager

  1. Clone the repository with submodules and enter the directory

    git clone --recurse-submodules git@github.com:pipliggins/evolve.git
    cd evolve


    Note: If you don't clone with submodules you won't get the two submodules needed to run EVolve: the EVo volcanic degassing model and the FastChem equilibrium chemistry code.
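
    If you have already cloned without --recurse-submodules, you can fetch the submodules afterwards by running, from the top-level evolve directory:

    git submodule update --init --recursive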

  2. Compile FastChem:

    cd fastchem
    git submodule update --init --recursive
    mkdir build && cd build
    cmake -DUSE_PYTHON=ON ..
    make

    This will pull the pybind11 module required for the Python bindings, and compile both the C++ code and the Python bindings that EVolve uses to connect to FastChem.

    Note: FastChem is an external C++ module used to compute atmospheric equilibrium chemistry. To run on Windows, I therefore recommend using WSL (Windows Subsystem for Linux) to make compiling the C++ code easier. If you encounter installation issues relating to the cmake version, I found the accepted answer here worked for me; a list of the suggested terminal commands can also be found at the bottom of this README file.
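
    Once FastChem has compiled, a quick way to check that the Python bindings work is to import them and run a single temperature-pressure point, following the FastChem 2.0 Python example. This is only a minimal sketch; the data file paths are assumptions and depend on where you run it from:

    # Quick smoke test for the compiled FastChem Python bindings (pyfastchem).
    # The input file paths are assumptions; point them at your FastChem checkout.
    import pyfastchem

    fastchem = pyfastchem.FastChem(
        'input/element_abundances_solar.dat',  # element abundances shipped with FastChem
        'input/logK.dat',                      # equilibrium constant data
        1)                                     # verbosity level

    input_data = pyfastchem.FastChemInput()
    output_data = pyfastchem.FastChemOutput()

    input_data.temperature = [1000.0]  # K
    input_data.pressure = [1.0]        # bar

    flag = fastchem.calcDensities(input_data, output_data)
    print('FastChem flag:', flag)  # 0 indicates the calculation converged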

  3. Install the dependencies using either pip or Anaconda; see requirements.txt for the full list. If using pip, install all dependencies from the main directory of EVolve using

    pip3 install -r requirements.txt
    

    Troubleshooting: The gmpy2 module requires several libraries (GMP, MPFR and MPC) which are not pre-installed on some operating systems, particularly Windows. If gmpy2 does not install, or you have other installation issues, try

    pip3 install wheel
    sudo apt install libgmp-dev libmpfr-dev libmpc-dev
    pip3 install -r requirements.txt
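
    Alternatively, if you prefer Anaconda, one option (the environment name and Python version below are only suggestions) is to create a fresh environment and then install the requirements into it:

    conda create -n evolve python=3.9
    conda activate evolve
    pip3 install -r requirements.txt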
    

Running EVolve

EVolve can be run with or without FastChem equilibrium chemistry in the atmosphere. To run EVolve with FastChem, from the main directory of EVolve run

python evolve.py inputs.yaml --fastchem

The available tags are:

  • --fastchem : Uses FastChem to run equilibrium chemistry in the atmosphere, producing more chemical species than the magma degassing model tracks and allowing the atmospheric equilibrium temperature to be lower than the magmatic temperature.

  • --nocrust : Stops a crustal reservoir from being formed out of the degassed melt which has been erupted; instead, the degassed melt and any volatiles remaining in it are re-incorporated back into the mantle. If this tag is NOT used, the mantle mass will gradually decrease, as no mechanism for returning crustal material to the mantle is implemented here.
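
For example, a run that uses FastChem for the atmospheric chemistry while recycling degassed melt back into the mantle (rather than building a crust) combines both tags:

python evolve.py inputs.yaml --fastchem --nocrust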

All the input files for EVolve and for the submodules EVo and FastChem are stored in the 'inputs' folder:

| Filename | Relevant module | Properties |
|---|---|---|
| **atm.yaml** | EVolve main | Sets the pre-existing atmospheric chemistry and the surface pressure and temperature for the planet |
| **mantle.yaml** | EVolve main | Sets the initial planetary mantle/rocky body properties, including temperature, mass, fO2, the mantle volatile concentrations and the volcanic intrusive:extrusive ratio |
| **planet.yaml** | EVolve main | Sets generic planetary properties and important run settings, including planetary mass, radius, the amount of mantle melting occurring at each timestep and the size & number of timesteps the model will run |
| chem.yaml | EVo | Contains the major oxide composition of the magma being input to EVo |
| env.yaml | EVo | Contains the majority of the run settings and volatile contents for the EVo run |
| output.yaml | EVo | Stops any graphical output from EVo (relative to its default settings) |
| config.input | FastChem | Sets the names and locations of input and output files for FastChem, and output settings |
| parameters.dat | FastChem | Locations of the elemental abundance files, and configuration parameters |

Files highlighted in bold should be edited by the user; all others are optimized for EVolve and/or will be edited by the code as it is running. Explanations for each parameter setting in the EVolve files can be found at the bottom of this README file.
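
If you want to script parameter changes rather than editing the YAML inputs by hand, a small Python helper can rewrite a file before each run. This is only a sketch; the key name used below ('mass') is purely illustrative, so check the parameter explanations at the bottom of this README for the real names in planet.yaml:

# Hypothetical sketch: edit an EVolve input file programmatically before a run.
# The key name 'mass' is illustrative only; use the real parameter names
# documented at the bottom of this README.
import yaml

with open('inputs/planet.yaml') as f:
    planet = yaml.safe_load(f)

planet['mass'] = 5.972e24  # hypothetical key: planetary mass in kg

with open('inputs/planet.yaml', 'w') as f:
    yaml.safe_dump(planet, f)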

As EVolve runs, it creates and updates files in the outputs folder as follows:

| Filename | Data |
|---|---|
| atmosphere_out.csv | Planetary surface pressure and atmospheric composition for tracked molecules, in units of volume mixing ratio (strictly, mole fraction), calculated after each timestep |
| mantle_out.csv | Mantle volatile budget and fO2 after each timestep |
| volc_out.csv | The final pressure iteration from the EVo output file in each timestep (storing melt volatile contents, atomic volatile contents, gas speciation in mol & wt fractions, etc.) |
| fc_input.csv | Generated if fastchem is selected: the input to FastChem after atmospheric mixing, and hydrogen escape if that is occurring, for each timestep |
| fc_out.csv | Generated if fastchem is selected: the results from FastChem after each timestep |
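
To inspect the results after a run, the CSV outputs can be read with pandas. The species column name used below ('H2O') is an assumption; check the header of your own atmosphere_out.csv for the exact names:

# Sketch: plot how one species' volume mixing ratio evolves over an EVolve run.
# Column names are assumptions; inspect df.columns for the real headers.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv('outputs/atmosphere_out.csv')
print(df.columns)  # check which pressure and species columns are present

plt.plot(df.index, df['H2O'], label='H2O')  # 'H2O' is an assumed column name
plt.yscale('log')
plt.xlabel('Timestep')
plt.ylabel('Volume mixing ratio')
plt.legend()
plt.show()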

Installation help for WSL

If you see an error saying that the installed version of cmake is too low to build FastChem, try the following commands. Please note this is just a suggestion based on what worked for me; try these workarounds at your own risk!

sudo apt-get update
sudo apt-get install apt-transport-https ca-certificates gnupg software-properties-common wget

wget -O - https://apt.kitware.com/keys/kitware-archive-latest.asc 2>/dev/null | sudo apt-key add -

sudo apt-add-repository 'deb https://apt.kitware.com/ubuntu/ bionic main'
sudo apt-get update

sudo apt-get install cmake