PSML: A Multi-scale Time-series Dataset for Machine Learning in Decarbonized Energy Grids

Overview

The electric grid is a key enabling infrastructure for the ambitious transition towards carbon neutrality as we grapple with climate change. With deepening penetration of renewable energy resources and electrified transportation, the reliable and secure operation of the electric grid becomes increasingly challenging. In this paper, we present PSML, a first-of-its-kind open-access multi-scale time-series dataset, to aid in the development of data-driven machine learning (ML) based approaches towards reliable operation of future electric grids. The dataset is generated through a novel transmission + distribution (T+D) co-simulation designed to capture the increasingly important interactions and uncertainties of the grid dynamics, containing electric load, renewable generation, weather, voltage and current measurements at multiple spatio-temporal scales. Using PSML, we provide state-of-the-art ML baselines on three challenging use cases of critical importance to achieve: (i) early detection, accurate classification and localization of dynamic disturbance events; (ii) robust hierarchical forecasting of load and renewable energy with the presence of uncertainties and extreme events; and (iii) realistic synthetic generation of physical-law-constrained measurement time series. We envision that this dataset will enable advances for ML in dynamic systems, while simultaneously allowing ML researchers to contribute towards carbon-neutral electricity and mobility.

Dataset Navigation

We host the full dataset on Zenodo. Please download and unzip it, and keep it somewhere accessible; it is needed later for reproducing the benchmark results, loading data, and evaluating proposed methods.

wget https://zenodo.org/record/5130612/files/PSML.zip?download=1
7z x 'PSML.zip?download=1' -o./

Minute-level Load and Renewable

  • File Name
    • ISO_zone_#.csv: For example, CAISO_zone_1.csv contains minute-level load, renewable and weather data from 2018 to 2020 for zone 1 of CAISO.
  • Field Description
    • Field time: Time of minute resolution.
    • Field load_power: Normalized load power.
    • Field wind_power: Normalized wind turbine power.
    • Field solar_power: Normalized solar PV power.
    • Field DHI: Diffuse horizontal irradiance.
    • Field DNI: Direct normal irradiance.
    • Field GHI: Global horizontal irradiance.
    • Field Dew Point: Dew point in degrees Celsius.
    • Field Solar Zeinth Angle: The angle between the sun's rays and the vertical direction, in degrees.
    • Field Wind Speed: Wind speed (m/s).
    • Field Relative Humidity: Relative humidity (%).
    • Field Temperature: Temperature in degrees Celsius.
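
For a quick look at one zone file, a minimal loading sketch with pandas is shown below; the folder layout under ./PSML is an assumption, so adjust the path to wherever CAISO_zone_1.csv was unzipped.

import pandas as pd

# Illustrative path; point it at the unzipped CAISO_zone_1.csv.
df = pd.read_csv('./PSML/CAISO_zone_1.csv', parse_dates=['time'])
print(df[['time', 'load_power', 'wind_power', 'solar_power']].head())
print(df[['GHI', 'DNI', 'DHI', 'Temperature']].describe())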

Minute-level PMU Measurements

  • File Name
    • case #: For example, the folder case 0 contains all data of scenario setting #0.
      • pf_input_#.txt: Selected load, renewable and solar generation used as input to the simulation.
      • pf_result_#.csv: Voltage at nodes and power on branches in the transmission system from the T+D co-simulation.
  • Field Description
    • Field time: Time of minute resolution.
    • Field Vm_###: Voltage magnitude (p.u.) at bus ### in the simulated model.
    • Field Va_###: Voltage angle (rad) at bus ### in the simulated model.
    • Field P_#_#_#: For example, P_3_4_1 is the active power transferred on branch #1 from bus 3 to bus 4.
    • Field Q_#_#_#: For example, Q_5_20_1 is the reactive power transferred on branch #1 from bus 5 to bus 20.
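
The prefix conventions above make it easy to split a result file into voltage and branch-flow blocks; below is a minimal sketch assuming pandas and an illustrative case folder path.

import pandas as pd

# Illustrative path; adjust to the unzipped case folder.
df = pd.read_csv('./PSML/case 0/pf_result_0.csv', parse_dates=['time'])
vm = df.filter(regex=r'^Vm_')  # per-unit voltage magnitudes at buses
va = df.filter(regex=r'^Va_')  # voltage angles (rad) at buses
p = df.filter(regex=r'^P_')    # active power flows on branches
q = df.filter(regex=r'^Q_')    # reactive power flows on branches
print(vm.shape, va.shape, p.shape, q.shape)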

Millisecond-level PMU Measurements

  • File Name
    • Forced Oscillation: The folder contains all forced oscillation cases.
      • row_#: The folder contains all data of disturbance scenario #.
        • dist.csv: Three-phase voltage at nodes in the distribution system from the T+D co-simulation.
        • info.csv: This file contains the start time, end time, location and type of the disturbance.
        • trans.csv: Voltage at nodes and power on branches in the transmission system from the T+D co-simulation.
    • Natural Oscillation: The folder contains all natural oscillation cases.
      • row_#: The folder contains all data of disturbance scenario #.
        • dist.csv: Three-phase voltage at nodes in the distribution system from the T+D co-simulation.
        • info.csv: This file contains the start time, end time, location and type of the disturbance.
        • trans.csv: Voltage at nodes and power on branches in the transmission system from the T+D co-simulation.
  • Field Description

    trans.csv

    • Field Time(s): Time of millisecond resolution.
    • Field VOLT ###: Voltage magnitude (p.u.) at bus ### in the transmission model.
    • Field POWR ### TO ### CKT #: For example, POWR 151 TO 152 CKT '1 ' is the active power transferred on branch #1 from bus 151 to bus 152.
    • Field VARS ### TO ### CKT #: For example, VARS 151 TO 152 CKT '1 ' is the reactive power transferred on branch #1 from bus 151 to bus 152.

    dist.csv

    • Field Time(s): Time of millisecond resolution.
    • Field ####.###.#: For example, 3005.633.1 is the per-unit voltage magnitude of phase A at bus 633 of the distribution grid that connects to bus 3005 in the transmission system.
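
A minimal sketch for reading one disturbance case and grouping columns by the conventions above (pandas assumed; the row_1 path is illustrative):

import pandas as pd

case_dir = './PSML/Forced Oscillation/row_1'  # illustrative path to one disturbance case
info = pd.read_csv(f'{case_dir}/info.csv')    # disturbance start/end time, location and type
trans = pd.read_csv(f'{case_dir}/trans.csv')  # transmission-side measurements
dist = pd.read_csv(f'{case_dir}/dist.csv')    # three-phase distribution voltages
bus_volt = trans.filter(regex=r'^VOLT ')      # voltage magnitudes at transmission buses
line_p = trans.filter(regex=r'^POWR ')        # active power on transmission branches
print(info)
print(bus_volt.shape, line_p.shape, dist.shape)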

Installation

  • Install PSML from source.
git clone https://github.com/tamu-engineering-research/Open-source-power-dataset.git
  • Create and activate anaconda virtual environment
conda create -n PSML python=3.7.10
conda activate PSML
  • Install required packages
pip install -r ./Code/requirements.txt

Package Usage

We've prepared standard interfaces of data loaders and evaluators for all three time-series tasks:

(1) Data loaders

We provide the following PyTorch data loaders, with both data processing and splitting included. You can load data for different tasks with a few lines by simply changing the task parameter.

from Code.dataloader import TimeSeriesLoader

loader = TimeSeriesLoader(task='forecasting', root='./PSML') # suppose the raw dataset is downloaded and unzipped under Open-source-power-dataset
train_loader, test_loader = loader.load(batch_size=32, shuffle=True)
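
As a short usage sketch, you can inspect the first batch to confirm the exact tensor layout for each task; the batch structure is whatever the loader yields, so printing it once is the safest check.

# Peek at the first training batch; its structure depends on the chosen task.
first_batch = next(iter(train_loader))
print(first_batch)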

(2) Evaluators

We also provide evaluators to support fair comparison among different approaches. The evaluator receives a dictionary input_dict (the expected key and value format for each task is given in evaluator.expected_input_format) and returns another dictionary storing the performance measured by task-specific metrics (the key and value format is explained in evaluator.expected_output_format).

from Code.evaluator import TimeSeriesEvaluator
evaluator = TimeSeriesEvaluator(task='classification', root='./PSML') # suppose the raw dataset is downloaded and unzipped under Open-source-power-dataset
# learn the appropriate format of input_dict
print(evaluator.expected_input_format) # expected input_dict format
print(evaluator.expected_output_format) # expected output dict format
# prepare input_dict
input_dict = {
    'classification': classification,  # predictions for each task, in the format
    'localization': localization,      # described by evaluator.expected_input_format
    'detection': detection,
}
result_dict = evaluator.eval(input_dict)
# sample output: {'#samples': 110, 'classification': 0.6248447204968943, 'localization': 0.08633372048006195, 'detection': 42.59349593495935}

Code Navigation

Please see the detailed explanations and comments in each subfolder.

  • BenchmarkModel
    • EventClassification: baseline models for event detection, classification and localization
    • LoadForecasting: baseline models for hierarchical load and renewable point forecasting and prediction intervals
    • Synthetic Data Generation: baseline models for synthetic generation of physical-law-constrained PMU measurement time series
  • Joint Simulation: Python code for joint steady-state and transient co-simulation of transmission and distribution systems
  • Data Processing: Python code for collecting the real-world load and weather data

License

The PSML dataset is published under the CC BY-NC 4.0 license, meaning everyone may use it for non-commercial research purposes.

Suggested Citation

  • Please cite the following paper when you use this data hub:
    X. Zheng, N. Xu, L. Trinh, D. Wu, T. Huang, S. Sivaranjani, Y. Liu, and L. Xie, "PSML: A Multi-scale Time-series Dataset for Machine Learning in Decarbonized Energy Grids." (2021).

Contact

Please contact us if you need further technical support or are seeking collaboration. Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
Email contact:   Le Xie,   Yan Liu,   Xiangtian Zheng,   Nan Xu,   Dongqi Wu,   Loc Trinh,   Tong Huang,   S. Sivaranjani.
