Create and implement a deep learning library from scratch.

Overview

ARA

In this project, we create and implement a deep learning library from scratch.

About The Project

Deep learning is a subset of machine learning in which models learn and improve on their own from data, using artificial neural networks composed of many layers. This project, a deep learning library created from scratch, can serve as the basis for a wide range of deep learning applications, including but not limited to image, natural language, and speech processing.

Aim

To implement a deep learning library from scratch.

Tech Stack

Technologies used in the project:

  • Python, with NumPy, pandas, and Matplotlib
  • Google Colab

File Structure

.
├── code
|   └── main.py                                   #contains the main code for the library
├── resources                                     #Notes 
|   ├── ImprovingDeepNeuralNetworks
|   |   ├── images
|   |   |   ├── BatchvsMiniBatch.png
|   |   |   ├── Bias.png
|   |   |   └── EWG.png
|   |   └── notes.md
|   ├── Course1.md                               
|   ├── accuracy.jpg
|   ├── error.jpg
|   └── grad_des_graph.jpg
├── LICENSE.txt
├── ProjectReport.pdf                            #Project Report
└── README.md                                    #Readme

Approach

As stated before, the goal of this project is to create a deep learning library. The approach was to implement the various algorithms that drive a deep neural network and combine them into a library that is modular and driven by user input, so that it can be applied to different deep learning tasks, and then to train and test a model with it.
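
As an illustration of what such a modular, layer-based design can look like, below is a minimal NumPy sketch; the class and method names (Dense, forward, backward) are hypothetical and are not taken from ARA's actual API.

import numpy as np

# Hypothetical modular layer: forward() computes the layer output and
# backward() propagates the gradient and updates the parameters.
class Dense:
    def __init__(self, n_in, n_out):
        self.W = np.random.randn(n_out, n_in) * 0.01   # small random weights
        self.b = np.zeros((n_out, 1))                  # zero-initialised biases

    def forward(self, x):
        self.x = x                       # cache the input for the backward pass
        return self.W @ x + self.b

    def backward(self, grad_out, lr=0.01):
        grad_W = grad_out @ self.x.T     # gradient of the cost w.r.t. W
        grad_x = self.W.T @ grad_out     # gradient passed to the previous layer
        self.W -= lr * grad_W            # gradient descent parameter update
        self.b -= lr * grad_out
        return grad_x

A network is then just an ordered list of such layers whose forward and backward calls are chained together.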

Theory

A neural network is a network or circuit of neurons, or in a modern sense, an artificial neural network, composed of artificial neurons or nodes.

There are different types of Neural Networks; a short forward-pass sketch for a single layer follows this list:

  • Standard Neural Networks
  • Convolutional Neural Networks
  • Recurrent Neural Networks
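
To make the idea of a layer of artificial neurons concrete, here is a minimal NumPy sketch of the forward pass of one fully connected layer, a = g(Wx + b), where g is a non-linear activation such as the sigmoid. This is only an illustration, not code from the library.

import numpy as np

def sigmoid(z):
    # Sigmoid activation: squashes any real number into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 1))    # input vector with 4 features
W = rng.normal(size=(3, 4))    # weight matrix: 3 neurons, 4 inputs each
b = np.zeros((3, 1))           # bias vector
a = sigmoid(W @ x + b)         # layer activations, shape (3, 1)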

Loss Function:

The loss function measures how good the output ŷ is compared to the true label y for a single training example.
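
For example, for a binary classification problem with prediction ŷ ∈ (0, 1) and label y ∈ {0, 1}, a common choice is the binary cross-entropy loss. A small NumPy sketch (the function name is illustrative, not part of ARA):

import numpy as np

def binary_cross_entropy(y_hat, y):
    # Loss for a single example: small when y_hat is close to the label y.
    eps = 1e-12                            # avoid log(0)
    y_hat = np.clip(y_hat, eps, 1 - eps)
    return -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

print(binary_cross_entropy(0.9, 1))   # small loss: confident and correct
print(binary_cross_entropy(0.9, 0))   # large loss: confident and wrong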

Cost Function:

The cost function quantifies the error between predicted and expected values across the whole training set; it is the per-example loss averaged over all training examples.
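
A minimal NumPy sketch of this averaging, using the same cross-entropy loss as above (again illustrative names, not ARA's API):

import numpy as np

def cost(y_hat, y):
    # Cost: mean of the per-example cross-entropy losses over all examples.
    eps = 1e-12
    y_hat = np.clip(y_hat, eps, 1 - eps)
    return np.mean(-(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat)))

y_hat = np.array([0.9, 0.2, 0.8])   # predictions for 3 training examples
y     = np.array([1.0, 0.0, 1.0])   # corresponding labels
print(cost(y_hat, y))               # average loss over the 3 examples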

Gradient Descent:

Gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function.
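
A minimal sketch of the idea, assuming for illustration the one-parameter function f(w) = (w - 3)^2, whose gradient is 2(w - 3):

def gradient_descent(grad, w, learning_rate=0.1, steps=100):
    # Repeatedly step against the gradient to move toward a local minimum.
    for _ in range(steps):
        w = w - learning_rate * grad(w)
    return w

w_min = gradient_descent(lambda w: 2 * (w - 3), w=0.0)
print(w_min)   # approaches 3.0, the minimiser of (w - 3)^2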


Getting Started

Prerequisites

  • Object oriented programming in Python

  • Linear Algebra

  • Basic knowledge of Neural Networks

  • Python 3.6 and above

    You can visit the Python Download Guide for the installation steps.

  • Install numpy next

pip install numpy

Installation

  1. Clone the repo
git clone https://github.com/Ris-Bali/ARA.git

Results

Training

We trained a model on the Iris dataset using ARA; here's a video of the training run:

ARA.mp4

As you may have observed, we achieved an accuracy of nearly 100% while training the model.

Result

Results obtained during training:

  • Error plot: the Y-axis represents the value of the cost function and the X-axis the number of iterations.
  • Accuracy plot: the Y-axis represents the accuracy of the predictions with respect to the labels and the X-axis the number of iterations.

Future Work

  • Short term
    • Adding classes for normalization and regularization
  • Near Future
    • Addition of support for linear regression
    • Addition of classes for LSTM and GRU blocks
  • Future goal
    • Addition of algorithms to support CNN models.
    • Addition of more Machine Learning algorithms
    • Include algorithms to facilitate Image Recognition, Machine Translation and Natural Language Processing

Troubleshooting

  • The NumPy library was not working in the local environment, so we shifted the workspace to Google Colab

Contributors

Acknowledgements

Resources

License

See LICENSE.txt for the license that applies to this project.

Owner
Rishabh Bali