Projecting interval uncertainty through the discrete Fourier transform

Overview

This repository provides a method that propagates interval uncertainty through the discrete Fourier transform while yielding the exact bounds on the Fourier amplitude and on the Power Spectral Density function. The algorithm applies to real sequences of intervals. The method allows technical analysts to project interval uncertainty present in time signals to their Fourier amplitude without making assumptions about the error distribution at each time step. Thus, system responses can be calculated and analysed in the frequency domain without conducting extensive Monte Carlo simulations in the time domain. The practical applicability of the method is demonstrated by a technical application.
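
Stated loosely, the problem the method solves is the following (a generic formulation given here for orientation; the notation is not taken from the repository's code): for an interval signal $[\underline{x}_k,\overline{x}_k]$, $k=0,\dots,N-1$, find the exact range of the Fourier amplitude over every real sequence compatible with the intervals,

$$X_j=\sum_{k=0}^{N-1} x_k\,e^{-2\pi i jk/N},\qquad \underline{A}_j=\min_{x_k\in[\underline{x}_k,\overline{x}_k]}\lvert X_j\rvert,\qquad \overline{A}_j=\max_{x_k\in[\underline{x}_k,\overline{x}_k]}\lvert X_j\rvert,$$

with the corresponding bounds on the Power Spectral Density following from these amplitude bounds.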

Disclaimer: This code was developed for illustration purposes and as a proof of concept. It is therefore not optimized for large-scale applications. An optimized version of the code is currently under development.

References

De Angelis, M.; Behrendt, M.; Comerford, L.; Zhang, Y.; Beer, M. (2021): Forward interval propagation through the discrete Fourier transform, The 9th international workshop on Reliable Engineering Computing, arXiv:2012.09778.

Installation

Clone the git repository to your machine, cd into the repository, open a Python3 interpreter and import the interval Fourier transform and other useful packages

from fourier.transform import transform as intervalDFT
from fourier.application import application as app
from fourier.number import number as int_num
import numpy
from numpy import (arange, cos, exp, linspace, mean, pi,  sin, zeros) 
from matplotlib import pyplot, cm

Signal generation and interval DFT

First, the time and frequency parameters and an analytical PSD function are needed to model the stochastic process.

Define parameters

wu = 2.2975 # upper cut-off frequency
T = 350 # total time length

dt = 2*pi /(2*wu) # timestep size
dw = 2*pi / T # frequency step size

t = numpy.arange(0,T,dt) # time vector
w = numpy.arange(0,wu,dw) # frequency vector
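
These definitions correspond to the usual sampling relations; in particular, this choice of time step makes the cut-off frequency coincide with the Nyquist frequency of the sampled signal:

$$\Delta t=\frac{2\pi}{2\omega_u}=\frac{\pi}{\omega_u},\qquad \Delta\omega=\frac{2\pi}{T},\qquad \omega_{\text{Nyq}}=\frac{\pi}{\Delta t}=\omega_u .$$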

JONSWAP power spectrum

The JONSWAP power spectrum is utilised to generate stochastic processes. The required parameters are:

alpha = 0.0081 # spectral energy parameter
w_p = 0.7 # peak frequency
gamma = 3.3 # peak enhancement factor
sigma1 = 0.07 # spectral width parameter for w <= w_p
sigma2 = 0.09 # spectral width parameter for w > w_p
spectrum = app.jonswap_spectrum(w,alpha,w_p,gamma,sigma1,sigma2)
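
For context, here is a minimal sketch of the standard JONSWAP form that app.jonswap_spectrum is assumed to follow; the gravitational acceleration g and the exact normalisation are assumptions here and may differ from the repository's implementation.

def jonswap_sketch(w, alpha, w_p, gamma, sigma1, sigma2, g=9.81):
    # standard JONSWAP form (assumed): S(w) = alpha*g^2/w^5 * exp(-1.25*(w_p/w)^4) * gamma^r
    w = numpy.asarray(w, dtype=float)
    sigma = numpy.where(w <= w_p, sigma1, sigma2)            # spectral width parameter
    r = numpy.exp(-((w - w_p)**2) / (2 * sigma**2 * w_p**2)) # peak enhancement exponent
    with numpy.errstate(divide='ignore', invalid='ignore'):
        S = alpha * g**2 / w**5 * numpy.exp(-1.25 * (w_p / w)**4) * gamma**r
    return numpy.nan_to_num(S)                               # S(0) is set to 0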

Plot the JONSWAP power spectrum

ax = app.plot_line(w,spectrum,figsize=(18,6),xlabel=r'#$x$',ylabel='$x$',color=None,lw=1,title='JONSWAP power spectrum',ax=None,label=None)
ax.set_xlabel('Frequency [rad/s]',fontsize=20)
ax.set_ylabel('Power Spectral Density [m$^2$s]',fontsize=20)

[Figure: JONSWAP power spectrum]

Generate time signal and intervalize it

To generate a stochastic process, the spectral representation method is utilised. The resulting signal is then intervalized with an interval uncertainty of ±0.1. Both signals are plotted.

sea_waves = app.stochastic_process(spectrum,w,t) 
pm = 0.1
sea_waves_interval = intervalDFT.intervalize(sea_waves, pm)

ax = app.plot_line(t,sea_waves,figsize=(18,6),xlabel='Time [s]',ylabel='Wave height [m]',color='rebeccapurple',lw=1,title='Signal from stationary power spectrum',ax=None,label=None)
sea_waves_interval.plot(xlabel='Time [s]',ylabel='Wave height [m]',title=r'Signal with $\pm$ '+str(pm)+' information gaps (intervals)')

[Figures: signal from the stationary power spectrum, and the intervalized signal with ±0.1 information gaps]
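
For readers unfamiliar with the spectral representation method, the sketch below shows the usual cosine series with independent uniform random phases that app.stochastic_process is assumed to follow; the intervalization is assumed to attach ±pm to every sample. Both are assumptions, not the repository's code.

def spectral_representation_sketch(spectrum, w, t, seed=None):
    # standard spectral representation: x(t) = sum_k sqrt(2*S(w_k)*dw) * cos(w_k*t + phi_k)
    rng = numpy.random.default_rng(seed)
    dw = w[1] - w[0]                                  # frequency step
    phi = rng.uniform(0, 2*pi, size=len(w))           # independent random phases
    A = numpy.sqrt(2 * spectrum * dw)                 # component amplitudes
    return numpy.array([numpy.sum(A * numpy.cos(w * ti + phi)) for ti in t])

# intervalize(x, pm) is assumed to return the interval signal [x_k - pm, x_k + pm] at every time step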

Compute the Fourier transforms

Compute the Fourier transform of the crisp signal and the interval Fourier transform of the interval signal with both the selective method and the interval method. Also compute the periodogram of the respective (bounded) Fourier amplitudes.

FA = intervalDFT.Fourier_amplitude(sea_waves)
BI,BS = intervalDFT.compute_amplitude_bounds(sea_waves_interval)
BI.insert(0,int_num.Interval(0,0))
BS.insert(0,int_num.Interval(0,0))

FA = app.periodogram(FA, t, dt)
BI = app.periodogram(BI, t, dt)
BS = app.periodogram(BS, t, dt)
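
The periodogram converts the (bounded) Fourier amplitudes into a PSD estimate. Below is a sketch under one common normalisation; the exact scaling used by app.periodogram may differ. Because the Fourier amplitude is non-negative, squaring its lower and upper bounds maps them directly to PSD bounds.

def periodogram_sketch(amplitude, t, dt):
    # one common convention: S(w_j) = dt^2 / (2*pi*T) * |X_j|^2  (crisp amplitudes only)
    T = len(t) * dt                                   # total duration of the signal
    return (dt**2 / (2 * pi * T)) * numpy.asarray(amplitude, dtype=float)**2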

Plot the interval Fourier transform

The Fourier amplitude of the crisp signal and both bounded Fourier amplitudes are plotted.

ax = app.plot_line(w,FA,figsize=(18,6),xlabel=r'#$x$',ylabel=r'$x$',color=None,lw=1,title=None,ax=None,label='Interval uncertainty: $\pm$ '+str(pm)+'')
app.plot_bounds(x=w,bounds=BI,color='cornflowerblue',alpha=0.4,ax=ax)
app.plot_bounds(x=w,bounds=BS,color='orangered',alpha=0.6,ax=ax)
ax.set_xlabel('Frequency [rad/s]',fontsize=20)
ax.set_ylabel('Power Spectral Density [m$^2$s]',fontsize=20)
ax.tick_params(direction='out', length=6, width=2, labelsize=14)

[Figure: Fourier amplitude of the crisp signal with the interval and selective bounds]

Application to a SDOF system

The system under investigation is an offshore wind turbine simplified to an SDOF system. The parameters are set to

R = 3 # outer radius
r = 2.8 # inner radius
h_pile = 60 # height
rho_steel = 7800 # density of steel
c = 1e5 # damping coefficient
k = 1e6 # stiffness

Get the natural frequency w0 and the damping ratio xi

w0,xi = app.wind_turbine(R,r,h_pile,rho_steel,c,k)
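
A plausible reading of this step, assuming the pile is modelled as a hollow steel cylinder whose mass is lumped at a single degree of freedom (an assumption; app.wind_turbine may use a different mass model):

def sdof_parameters_sketch(R, r, h_pile, rho_steel, c, k):
    m = rho_steel * pi * (R**2 - r**2) * h_pile       # mass of the hollow cylindrical pile
    w0 = (k / m)**0.5                                 # natural frequency [rad/s]
    xi = c / (2 * (k * m)**0.5)                       # damping ratio [-]
    return w0, xi

With the parameters above, this sketch gives a natural frequency of roughly 0.77 rad/s and a damping ratio of about 4%, i.e. a resonance close to the peak of the JONSWAP spectrum.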

The response can be obtained by pushing the (intervalised) signal through the frequency response function

freq_response_precise = app.frequency_response(w,FA,w0,xi)
freq_response_BI_low,freq_response_BI_high = app.frequency_response_interval(w,BI,w0,xi)
freq_response_BS_low,freq_response_BS_high = app.frequency_response_interval(w,BS,w0,xi)
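
The underlying relation is the standard SDOF input-output relation for stationary excitation, S_out(ω) = |H(ω)|² S_in(ω). A minimal sketch with an assumed unit-mass normalisation (the scaling in app.frequency_response may differ):

def sdof_psd_response_sketch(w, S_in, w0, xi):
    # squared magnitude of the SDOF frequency response (unit-mass normalisation assumed)
    H2 = 1.0 / ((w0**2 - w**2)**2 + (2 * xi * w0 * w)**2)
    return H2 * numpy.asarray(S_in, dtype=float)

Since |H(ω)|² is non-negative, multiplying the lower and upper PSD bounds by it preserves the bounds, which is why the interval version simply returns a lower and an upper envelope.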

Those responses can be plotted

ax = app.plot_line(w,freq_response_precise,figsize=(18,6),xlabel=r'#$x$',ylabel=r'$x$',color=None,lw=1,title=None,ax=None,label=None)
ax.fill_between(x=w,y1=freq_response_BI_low,y2=freq_response_BI_high, alpha=0.4, label='Interval', edgecolor='blue', lw=2, color='cornflowerblue')
ax.fill_between(x=w,y1=freq_response_BS_low,y2=freq_response_BS_high, alpha=0.6, label='Selective', edgecolor='red', lw=2, color='orangered')

ax.set_xlabel('Frequency [rad/s]',fontsize=20)
ax.set_ylabel('Power Spectral Density [m$^2$s]',fontsize=20)
ax.set_title(r'Interval uncertainty: $\pm$ '+str(pm)+'', fontsize=20)

ax.tick_params(direction='out', length=6, width=2, labelsize=14)
_=ax.set_xlim([0.5, 1.1])

[Figure: frequency response of the SDOF system with the interval and selective bounds]

Comparison with Monte Carlo

This section illustrates how severely interval uncertainty is underestimated by Monte Carlo simulation. To show this, a signal with interval uncertainty ±0.5 is used and plotted.

pm = 0.5
sea_waves_interval_05 = intervalDFT.intervalize(sea_waves, pm)
sea_waves_interval_05.plot(xlabel='Time [s]',ylabel='Wave height [m]',title=r'Signal with $\pm$ '+str(pm)+' information gaps (intervals)')

[Figure: intervalized signal with ±0.5 information gaps]

Generate some random signals between the bounds. Any signal that lies within or on the bounds is possible.

RAND_SIGNALS = sea_waves_interval_05.rand(N=20) # this picks out N (inner) random signals within the bounds

fig,ax = intervalDFT.subplots(figsize=(16,8))
for rs in RAND_SIGNALS:
    intervalDFT.plot_signal(rs,ax=ax)
sea_waves_interval_05.plot(ax=ax)
ax.grid()
_=ax.set_xlim(0,55) # underscore here is used to suppress the output of this line
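
The rand method is assumed to draw each time step independently and uniformly within its interval; a minimal sketch of that idea (the actual sampling scheme in the repository may differ):

def sample_inner_signal(lo, hi, seed=None):
    # lo, hi: arrays holding the lower and upper bounds of the interval signal
    rng = numpy.random.default_rng(seed)
    u = rng.uniform(0.0, 1.0, size=len(lo))
    return lo + (hi - lo) * u                         # one signal lying within the bounds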

[Figure: random inner signals plotted within the interval bounds]

Computing the Fourier amplitude bounds and the periodogram of the interval signal

BI,BS = intervalDFT.compute_amplitude_bounds(sea_waves_interval_05)
BI.insert(0,int_num.Interval(0,0))
BS.insert(0,int_num.Interval(0,0))

BI = app.periodogram(BI, t, dt)
BS = app.periodogram(BS, t, dt) 

Plotting the bounds of the Fourier amplitude in comparison to the resulting bounds obtained by Monte Carlo

BI_low=[ai.lo() for ai in BI]
BI_high=[ai.hi() for ai in BI]
BS_low=[ai.lo() for ai in BS]
BS_high=[ai.hi() for ai in BS]

fig = pyplot.figure(figsize=(18,6))
ax = fig.subplots()
ax.grid()
ax.fill_between(x=w,y1=BI_low,y2=BI_high, alpha=0.4, label='Interval', edgecolor='blue', lw=2, color='cornflowerblue')
ax.fill_between(x=w,y1=BS_low,y2=BS_high, alpha=0.6, label='Selective', edgecolor='red', lw=2, color='orangered')

n_MC = 10
for x in range(n_MC):
    FX = intervalDFT.Fourier_amplitude(sea_waves_interval_05.rand())
    FX = app.periodogram(FX, t, dt)
    #intervalDFT.plot_line(w,FX,figsize=None,xlabel=r'#$x$',ylabel=r'$x$',color='palegreen',lw=1,title=None,ax=ax,label=None) 
    app.plot_line(w,FX,figsize=None,xlabel=r'#$x$',ylabel=r'$x$',color='#d7f4d7',lw=1,title=None,ax=ax,label=None) 

ax.set_xlabel('Frequency [rad/s]',fontsize=20)
ax.set_ylabel('Power Spectral Density [m$^2$s]',fontsize=20)
ax.set_title(r'Interval uncertainty: $\pm$ '+str(pm)+'', fontsize=20)

ax.tick_params(direction='out', length=6, width=2, labelsize=14)  

[Figure: interval and selective bounds compared with 10 Monte Carlo samples]

With increasing sample size, the range within the bounds of the interval signal is better covered. However, even a very high sample size is insufficient to get close to the bounds obtained by the interval DFT.

fig = pyplot.figure(figsize=(18,6))
ax = fig.subplots()
ax.grid()

n_MC = 1000
for x in range(n_MC):
    FX = intervalDFT.Fourier_amplitude(sea_waves_interval_05.rand())
    FX = app.periodogram(FX, t, dt)
    app.plot_line(w,FX,figsize=None,xlabel=r'#$x$',ylabel=r'$x$',color='#7cc47c',lw=1,title=None,ax=ax,label=None) 
    
n_MC = 100
for x in range(n_MC):
    FX = intervalDFT.Fourier_amplitude(sea_waves_interval_05.rand())
    FX = app.periodogram(FX, t, dt)
    app.plot_line(w,FX,figsize=None,xlabel=r'#$x$',ylabel=r'$x$',color='#a7d9a7',lw=1,title=None,ax=ax,label=None) 
    
n_MC = 10
for x in range(n_MC):
    FX = intervalDFT.Fourier_amplitude(sea_waves_interval_05.rand())
    FX = app.periodogram(FX, t, dt)
    app.plot_line(w,FX,figsize=None,xlabel=r'#$x$',ylabel=r'$x$',color='#d7f4d7',lw=1,title=None,ax=ax,label=None) 
    
ax.set_xlabel('Frequency [rad/s]',fontsize=20)
ax.set_ylabel('Power Spectral Density [m$^2$s]',fontsize=20)
_=ax.set_title('Bounds estimated by MC', fontsize=20) 

[Figure: bounds estimated by Monte Carlo with increasing sample size]
