Analyses of individual electric field magnitudes with ROAST.

Overview

Aloi Davide - PhD Student (UoB)

Analysis of electric field magnitudes (WP2A dataset only at the moment) and correlation analysis with Dynamic Causal Modelling (DCM) results.

The goal of these analyses is to establish whether there is a relationship between single-subject electric field (E-field) magnitudes generated with the ROAST pipeline (Huang et al., 2019) and changes in effective connectivity within the motor network, derived using DCM and Parametric Empirical Bayes (PEB).

The two analyses are:

  1. Correlation analysis between E-field magnitude (medians and max values, or possibly the current density?) in the motor cortex (M1) and thalamus (Th) and the self- and between-region connectivities (M1 and Th only?) derived from the DCM, e.g. Indahlastari et al. (2021). At the moment I am correlating E-field measures only with DCM measures derived from the contrast pre vs post Day-1 anodal. However, I should also correlate those E-field measures with DCM measures derived from the contrast pre vs post Day-1 sham. I expect to find correlations between E-field measures and DCM measures for the anodal condition but not for sham.
  2. Pattern-recognition analysis using a support vector machine (SVM) on the MRI-derived tDCS current models to classify tDCS treatment response (as reflected by increased M1->Th or Th->M1 connectivity, or whatever other measure we decide), e.g. Albizu et al. (2020). The question here is: can we classify people who showed an increase in thalamo-cortical connectivity using features from the MRI current models? (A minimal sketch of this analysis follows this list.)
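
The sketch below only illustrates the classification question in analysis 2. It assumes a feature matrix X (one row per participant, e.g. voxel-wise E-field magnitudes within a mask) and a binary label vector y (whether thalamo-cortical connectivity increased); these names are placeholders, and this is not the Albizu et al. (2020) pipeline.

```matlab
% Minimal sketch: classify tDCS "responders" from current-model features.
% X: n_subjects x n_features (e.g. voxel-wise E-field magnitudes within a mask)
% y: n_subjects x 1 logical (true if the chosen connectivity measure increased)
rng(1);                                                 % reproducibility
svm = fitcsvm(X, y, 'KernelFunction', 'linear', ...
              'Standardize', true);

% Leave-one-out cross-validation to estimate classification accuracy
cv  = crossval(svm, 'Leaveout', 'on');
acc = 1 - kfoldLoss(cv);
fprintf('Leave-one-out accuracy: %.2f\n', acc);
```

With only 22 participants, a linear kernel and leave-one-out cross-validation seem a reasonable starting point; some feature selection or dimensionality reduction would probably be needed given the number of voxels per current model.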

The two analyses require similar preprocessing steps. Here's the list of the steps I've done and the respective scripts.

WP2A: I start from a dataset containing 22 folders (one per participant), each containing a T1 and a T2 scan (except for subject 16 who has only a T1).

  1. Renaming of anatomical scans: this renames the anatomical scans of each participant (e.g. sub-01_T1.nii).
  2. ROAST simulations: this script runs the ROAST simulations. In brief, ROAST uses SPM routines for tissue segmentation and outputs the following scans for each subject: voltage ("subjName_simulationTag_v.nii", unit mV), E-field ("subjName_simulationTag_e.nii", unit V/m) and E-field magnitude ("subjName_simulationTag_emag.nii", unit V/m). The settings I have used for the simulation are: (t1, {'C3',1.0,'Fp2',-1.0}, 'T2', t2, 'electype', 'pad', 'elecsize', [50 50 3], 'capType', '1020'). A sketch of the corresponding call is given after this list.
  3. Post-ROAST preprocessing: ROAST outputs are in the ROAST model space. This script moves the results back to the MRI space, then coregisters and normalises the electric field maps generated by ROAST. The script also normalises the T1 scan and all the masks.
  4. Ep values extraction from the PEB results (Day-1 only): this script, starting from this .mat structure containing 66 PEBs (one per participant and polarity), extracts the Ep values for each participant. The resulting file contains 66 matrices (participant 1 anodal, cathodal and sham; participant 2 ... 22).
  5. Estimation of the posterior probability associated with each PEB extracted above. The script runs Bayesian model averaging for each PEB using the SPM function spm_dcm_peb_bmc. Results are saved in this .mat structure and used later in the analyses to exclude connections with a posterior probability lower than 75%.
  6. WP2A E-field magnitude measures estimation and correlation analysis (a sketch of the main per-subject sub-steps is given after this list). Steps:
    1. Load MNI template and M1/Th ROIs.
    2. Load the .mat structure with Ep values and the .mat structure with Pp values (N.B. Pp values are no longer used);
    3. For each subject:
      1. Load the normalised scan containing the E-field magnitude (wsub-T1_emag.nii) and the normalised grey matter, white matter and CSF maps (wc1/wc2/wc3sub*.nii).
      2. Save the DCM values related to the connections M1-M1, Th-Th, M1->Th and Th->M1;
      3. Smooth the E-field magnitude map with a 4 mm FWHM Gaussian kernel;
      4. Mask the E-field magnitude map with the MNI template to exclude values outside the brain (redundant if I then mask with the CSF/WM/GM maps or with the M1/Th ROIs);
      5. Mask E-field magnitude map with M1 and Th ROIs and estimate means, medians and max electric-field values within the two ROIs;
      6. Save electric-field magnitude derived measures;
      7. Plot smoothed E-field magnitude map;
      8. Run 16 correlations: the 4 DCM measures × the 4 E-field measures (median and max for each ROI).
      9. Plot correlations.
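
For reference, the ROAST call in step 2 corresponds to something like the sketch below; the recipe and options are the settings listed above, while the subject loop and file paths are illustrative assumptions.

```matlab
% Sketch of the ROAST call used in step 2 (paths and loop are assumptions).
subjects = 1:22;
for s = subjects
    t1 = sprintf('sub-%02d/sub-%02d_T1.nii', s, s);
    t2 = sprintf('sub-%02d/sub-%02d_T2.nii', s, s);    % subject 16 has no T2: pass [] instead
    roast(t1, {'C3', 1.0, 'Fp2', -1.0}, ...
          'T2', t2, ...
          'electype', 'pad', ...
          'elecsize', [50 50 3], ...
          'capType', '1020');
end
```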
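
The smoothing, masking, extraction and correlation sub-steps of step 6 can be sketched as follows; the file names, ROI masks and the dcm_vals / efield_all matrices are assumptions, and the actual script may differ.

```matlab
% Per-subject part (sub-steps 3-5 of "For each subject"); names are placeholders.

% Smooth the normalised E-field magnitude map with a 4 mm FWHM Gaussian kernel
spm_smooth('wsub-01_T1_emag.nii', 'swsub-01_T1_emag.nii', [4 4 4]);

% Mask the smoothed map with the M1 and Th ROIs (assumed binary or thresholded)
emag = spm_read_vols(spm_vol('swsub-01_T1_emag.nii'));
m1   = spm_read_vols(spm_vol('M1_roi.nii')) > 0.5;
th   = spm_read_vols(spm_vol('Th_roi.nii')) > 0.5;

% Median and max E-field magnitude within each ROI (4 measures per subject)
efield = [median(emag(m1), 'omitnan'), max(emag(m1)), ...
          median(emag(th), 'omitnan'), max(emag(th))];

% Group-level part (sub-step 8): correlate the 4 DCM measures with the
% 4 E-field measures; dcm_vals and efield_all are assumed to be
% n_subjects x 4 matrices accumulated across the subject loop.
[r, p] = corr(dcm_vals, efield_all);    % r and p are 4 x 4 (16 correlations)
```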

Questions:

  1. Electric field magnitudes or current densities?
  2. If so, how to deal with probabilistic masks?
  3. Should I threshold WM masks and apply binary erosion to remove the overlap between WM and GM?
  4. How to deal with Ep values whose corresponding Pp is lower than our threshold (75%)?
  5. Should I mask out CSF tissue? Should I use a binary map containing only WM and GM?
  6. Hypotheses? Ideas?

Plots: Sticky note mind map

References:

  1. Huang, Y., Datta, A., Bikson, M., & Parra, L. C. (2019). Realistic volumetric-approach to simulate transcranial electric stimulation—ROAST—a fully automated open-source pipeline. Journal of Neural Engineering, 16(5), 056006. https://doi.org/10.1088/1741-2552/ab208d
  2. Indahlastari, A., Albizu, A., Kraft, J. N., O’Shea, A., Nissim, N. R., Dunn, A. L., Carballo, D., Gordon, M. P., Taank, S., Kahn, A. T., Hernandez, C., Zucker, W. M., & Woods, A. J. (2021). Individualized tDCS modeling predicts functional connectivity changes within the working memory network in older adults. Brain Stimulation, 14(5), 1205–1215. https://doi.org/10.1016/j.brs.2021.08.003
  3. Albizu, A., Fang, R., Indahlastari, A., O’Shea, A., Stolte, S. E., See, K. B., Boutzoukas, E. M., Kraft, J. N., Nissim, N. R., & Woods, A. J. (2020). Machine learning and individual variability in electric field characteristics predict tDCS treatment response. Brain Stimulation, 13(6), 1753–1764. https://doi.org/10.1016/j.brs.2020.10.001