🧮 KSE 801 Final Report with Code

This is the final report with code for KAIST course KSE 801.

Authors: Chuanbo Hua, Federico Berto.

💡 Introduction to OSC

Orthogonal collocation is a method for the numerical solution of partial differential equations. It uses collocation at the zeros of some orthogonal polynomials to transform the partial differential equation (PDE) into a set of ordinary differential equations (ODEs). The ODEs can then be solved by any method. It has been shown that it is usually advantageous to choose the collocation points as the zeros of the corresponding Jacobi polynomial (independent of the PDE system) [1].

The orthogonal collocation method became popular in the 1970s, mainly developed by B. A. Finlayson [2]. It is a powerful collocation tool for solving partial differential equations and ordinary differential equations.

The orthogonal collocation method works for more than one variable, but here we only consider the one-variable case, since it is simpler to understand and the most widely used.

💡 Introduction to the GNN

You can find more details in the Jupyter notebooks within the gnn-notebook folder. The folder covers dataset initialization, model training, and testing.

Reminder: for the dataset, we provide a separate repository with a dataset generator. Please refer to https://github.com/DiffEqML/pde-dataset-generator.

🏷 Features

  • Tutorials. We provide several examples, including linear and nonlinear problems, to help you understand how to use the method and how well this model performs.
  • Algorithm Explanation. We provide a document that explains in detail, with an example, how this algorithm works, which we think makes it easier to grasp. For more detail, please refer to the Algorithm section.

⚙️ Requirements

Python version: 3.6 or later
Python packages: numpy, matplotlib, jupyter-notebook/jupyter-lab, dgl, torch

🔧 Structure

  • src: source code of the OSC algorithm.
  • fig: algorithm output figures for the readme.
  • osc-notebook: tutorial Jupyter notebooks for our OSC method.
  • gnn-notebook: tutorial Jupyter notebooks for the graph neural network.
  • script: training and testing scripts for the graph neural network.

🔦 How to use

Step 1. Download or Clone this repository.

Step 2. Refer to osc-notebook/example.ipynb; it introduces in detail, with examples, how to use this model. The main process is (see the sketch after this list):

  1. collocation1d(): generate collocation points.
  2. generator1d(): generate algebraic equations from the PDEs to be solved.
  3. numpy.linalg.solve(): solve the algebraic equations to get the polynomial result.
  4. polynomial1d(): generate simulated values to check the loss.
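
For orientation, below is a minimal, self-contained sketch of 1D orthogonal collocation using only numpy. It is not the repository's API (the helpers listed above have their own signatures in src); it solves a toy boundary value problem u''(x) = -π² sin(πx), u(0) = u(1) = 0, whose exact solution is sin(πx), purely to illustrate the collocation → algebraic system → solve → evaluate pipeline.

```python
import numpy as np

# Toy sketch of 1D orthogonal collocation (not the repository's API).
# Solve u''(x) = -pi^2 * sin(pi*x) on [0, 1] with u(0) = u(1) = 0;
# the exact solution is u(x) = sin(pi*x).

degree = 8                      # trial polynomial u(x) = sum_j c_j x^j
n_interior = degree - 1         # unknowns = degree + 1, minus two boundary conditions

# Step 1: collocation points -- zeros of a Legendre polynomial mapped to (0, 1)
nodes, _ = np.polynomial.legendre.leggauss(n_interior)
x_col = 0.5 * (nodes + 1.0)

# Step 2: assemble the algebraic system A c = b
A = np.zeros((degree + 1, degree + 1))
b = np.zeros(degree + 1)
A[0, 0] = 1.0                   # u(0) = c_0 = 0
A[1, :] = 1.0                   # u(1) = sum_j c_j = 0
f = lambda x: -np.pi**2 * np.sin(np.pi * x)
for i, x in enumerate(x_col):
    for j in range(2, degree + 1):
        A[2 + i, j] = j * (j - 1) * x**(j - 2)   # d^2/dx^2 of x^j
    b[2 + i] = f(x)

# Step 3: solve for the polynomial coefficients
c = np.linalg.solve(A, b)

# Step 4: evaluate the polynomial and check the error against the exact solution
x_test = np.linspace(0.0, 1.0, 101)
u_approx = np.polynomial.polynomial.polyval(x_test, c)
print("max abs error:", np.max(np.abs(u_approx - np.sin(np.pi * x_test))))
```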

Step 3. Refer to the notebooks under gnn-notebook to get an idea of how the graph model is trained; a minimal toy sketch is shown below.
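
The following is a hypothetical toy sketch using dgl and torch from the requirements list; the graph, node features, and model here are made up for illustration and are not the repository's dataset or architecture.

```python
import dgl
import torch
import torch.nn as nn
from dgl.nn import GraphConv

# Two-layer graph convolutional network for node-level regression.
class SimpleGCN(nn.Module):
    def __init__(self, in_feats, hidden, out_feats):
        super().__init__()
        self.conv1 = GraphConv(in_feats, hidden)
        self.conv2 = GraphConv(hidden, out_feats)

    def forward(self, g, x):
        h = torch.relu(self.conv1(g, x))
        return self.conv2(g, h)

# Toy graph: a directed 10-node cycle with random features and targets.
src = torch.arange(10)
dst = (src + 1) % 10
g = dgl.add_self_loop(dgl.graph((src, dst)))   # self-loops avoid zero in-degree nodes
feat = torch.randn(10, 4)
target = torch.randn(10, 1)

model = SimpleGCN(4, 16, 1)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()
for epoch in range(100):
    pred = model(g, feat)
    loss = loss_fn(pred, target)
    opt.zero_grad()
    loss.backward()
    opt.step()
print("final loss:", loss.item())
```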

📈 Examples

  • One variable, linear, 3rd order: loss < 1e-4
  • One variable, linear, 4th order: loss 2.2586
  • One variable, nonlinear: loss 0.0447
  • 2D PDE simulation
  • Dam breaking simulation

📜 Algorithm

Here we briefly introduce how 1D OSC works, using an example. For the original PDF, please refer to Introduction.pdf in this repository.
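
As a compact outline of the standard 1D formulation (a hedged summary consistent with [1]; see Introduction.pdf for the full treatment), the method approximates the solution by a polynomial trial function and forces the residual to vanish at the collocation points:

```latex
% Trial solution: a degree-N polynomial with unknown coefficients c_j
u(x) \approx \sum_{j=0}^{N} c_j x^j
% Residual of the differential equation L[u] = f
R(x; c) = \mathcal{L}[u](x) - f(x)
% Collocation: the residual vanishes at the zeros x_1, ..., x_{N-1}
% of the corresponding Jacobi polynomial, plus the boundary conditions
R(x_k; c) = 0, \quad k = 1, \dots, N-1
% For linear \mathcal{L} this yields a linear algebraic system A c = b,
% which numpy.linalg.solve handles directly.
```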

📚 References

[1] Orthogonal collocation. (2018, January 30). In Wikipedia. https://en.wikipedia.org/wiki/Orthogonal_collocation.

[2] Carey, G. F., and Bruce A. Finlayson. "Orthogonal collocation on finite elements." Chemical Engineering Science 30.5-6 (1975): 587-596.
