TensorGraphicalModels

Julia package for multiway (inverse) covariance estimation.

Overview

TensorGraphicalModels.jl is a suite of Julia tools for estimating high-dimensional multiway (tensor-variate) covariance and inverse covariance matrices.

Installation

] add https://github.com/ywa136/TensorGraphicalModels.jl
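The same installation can also be done programmatically through the Pkg API (a minimal equivalent of the REPL command above):

using Pkg
Pkg.add(url = "https://github.com/ywa136/TensorGraphicalModels.jl")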

Examples

Please check out the Julia Colab notebook created to illustrate some of the package's functionality. Here are some basic examples as well:

Example code for fitting a KP inverse covariance model:

using TensorGraphicalModels

model_type = "kp"
sub_model_type = "sb" #this defines the structure of the Kronecker factors, sb = star-block
K = 3
N = 1000
d_list = [5, 10, 15]

X = gen_kronecker_data(model_type, sub_model_type, K, N, d_list) #multi-dimensional array (tensor) of dimension d_1 × … × d_K × N
Ψ_hat_list = kglasso(X)
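Since the KP model factorizes the precision matrix as a Kronecker product of the per-mode factors, the estimates in Ψ_hat_list can be assembled into the full precision matrix. A minimal sketch (the ordering of the factors in the product is an assumption and may need to be reversed):

using LinearAlgebra

# assemble the full prod(d_list) × prod(d_list) precision matrix from the estimated factors
Ω_hat = reduce(kron, Ψ_hat_list)
size(Ω_hat) # (750, 750) for d_list = [5, 10, 15]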

Example code for fitting a KS inverse covariance model:

using TensorGraphicalModels

model_type = "ks"
sub_model_type = "sb" #this defines the structure of the Kronecker factors, sb = star-block
K = 3
N = 1000
d_list = [5, 10, 15]

X = gen_kronecker_data(model_type, sub_model_type, K, N, d_list, tensorize_out = false) #matrix of dimension d × N

# compute the mode-k Gram matrices (the sufficient statistics for TeraLasso)
X_kGram = [zeros(d_list[k], d_list[k]) for k = 1:K]
Xk = [zeros(d_list[k], Int(prod(d_list) / d_list[k])) for k = 1:K]
for k = 1:K
    for i = 1:N
        # mode-k matricization of the i-th sample tensor
        copy!(Xk[k], tenmat(reshape(view(X, :, i), d_list...), k))
        # accumulate (1 / N) * Xk * Xk' into the mode-k Gram matrix
        mul!(X_kGram[k], Xk[k], copy(transpose(Xk[k])), 1.0 / N, 1.0)
    end
end

Ψ_hat_list, _ = teralasso(X_kGram)
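teralasso returns one factor per mode; under the Kronecker-sum parameterization these can be combined into the full precision matrix. A minimal sketch assuming the usual Ψ_1 ⊕ ⋯ ⊕ Ψ_K convention (the factor ordering is an assumption):

using LinearAlgebra

# Kronecker-sum term for mode k: the factor Ψ_k with identities on every other mode
kron_sum_term(k) = reduce(kron, [j == k ? Ψ_hat_list[k] : Matrix(1.0I, d_list[j], d_list[j]) for j = 1:K])
Ω_hat = sum(kron_sum_term(k) for k = 1:K) # prod(d_list) × prod(d_list) precision estimate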

Example code for fitting a Sylvester inverse covariance model:

using TensorGraphicalModels

model_type = "sylvester"
sub_model_type = "sb" #this defines the structure of the Kronecker factors, sb = star-block
K = 3
N = 1000
d_list = [5, 10, 15]

X = gen_kronecker_data(model_type, sub_model_type, K, N, d_list, tensorize_out = false) #matrix of dimension d × N

# compute the mode-k Gram matrices (the same sufficient statistics, here used by the Sylvester estimator)
X_kGram = [zeros(d_list[k], d_list[k]) for k = 1:K]
Xk = [zeros(d_list[k], Int(prod(d_list) / d_list[k])) for k = 1:K]
for k = 1:K
    for i = 1:N
        # mode-k matricization of the i-th sample tensor
        copy!(Xk[k], tenmat(reshape(view(X, :, i), d_list...), k))
        # accumulate (1 / N) * Xk * Xk' into the mode-k Gram matrix
        mul!(X_kGram[k], Xk[k], copy(transpose(Xk[k])), 1.0 / N, 1.0)
    end
end

using LinearAlgebra, SparseArrays # for the identity I and sparse

Psi0 = [sparse(1.0I, d_list[k], d_list[k]) for k = 1:K] # identity initialization for each factor
fun = (iter, Psi) -> [1, time()] # trivial callback (no per-iteration diagnostics)
lambda = [sqrt(d_list[k] * log(prod(d_list)) / N) for k = 1:K]

Ψ_hat_list, _ = syglasso_palm(X, X_kGram, lambda, Psi0, fun = fun)
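A small follow-up sketch for inspecting the output, assuming Ψ_hat_list again holds one d_k × d_k factor per mode:

# count the nonzero entries recovered in each mode-k factor
for k = 1:K
    println("mode ", k, ": ", count(!iszero, Ψ_hat_list[k]), " of ", d_list[k]^2, " entries nonzero")
end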

Example code for fitting a KPCA covariance model:

using TensorGraphicalModels
using Distributions, PDMats, Statistics # for MatrixNormal, ScalMat, and cov (may already be re-exported)

px = py = 25 # the robust Kronecker PCA routine supports K = 2 modes only
N = 100
X = zeros((px * py, N))

for i=1:N
    X[:, i] .= vec(rand(MatrixNormal(zeros((px, py)), ScalMat(px, 2.0), ScalMat(py, 4.0))))
end

S = cov(copy(X')) #sample covariance matrix
lambdaL = 20 * (px^2 + py^2 + log(max(px, py, N))) / N
lambdaS = 20 * sqrt(log(px * py)/N)

# robust Kronecker PCA methods using singular value thresholding
Sigma_hat = robust_kron_pca(S, px, py, lambdaL, lambdaS, "SVT"; tau = 0.5, r = 5)
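The simulated samples come from a matrix normal with row covariance 2I and column covariance 4I, so the true covariance of the vectorized data is kron(4I, 2I) = 8I. A sketch of a sanity check on the fit, assuming robust_kron_pca returns the estimated covariance as a plain matrix:

using LinearAlgebra

Sigma_true = 8.0 * Matrix(1.0I, px * py, px * py) # true covariance of the simulated vectorized data
rel_err = norm(Sigma_hat - Sigma_true) / norm(Sigma_true) # relative Frobenius error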
Owner

Wayne Wang, Ph.D. candidate in statistics