Machine learning, in numpy

Overview

numpy-ml

Ever wish you had an inefficient but somewhat legible collection of machine learning algorithms implemented exclusively in NumPy? No?

Installation

For rapid experimentation

To use this code as a starting point for ML prototyping / experimentation, just clone the repository, create a new virtualenv, and start hacking:

$ git clone https://github.com/ddbourgin/numpy-ml.git
$ cd numpy-ml && virtualenv npml && source npml/bin/activate
$ pip3 install -r requirements-dev.txt

As a package

If you don't plan to modify the source, you can also install numpy-ml as a Python package: pip3 install -U numpy_ml.

The reinforcement learning agents train on environments defined in the OpenAI gym. To install these alongside numpy-ml, you can use pip3 install -U 'numpy_ml[rl]'.
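
As a quick smoke test after installing the package, you can fit one of the bundled models on toy data. The snippet below is a minimal sketch: it assumes LinearRegression is exposed under numpy_ml.linear_models with scikit-learn-style fit/predict methods, so check the project documentation for the exact import paths and signatures.

# Hedged smoke test: the import path and fit/predict API are assumed,
# not guaranteed -- see the project documentation for the real interface.
import numpy as np
from numpy_ml.linear_models import LinearRegression

rng = np.random.RandomState(0)
X = rng.randn(100, 3)                        # 100 samples, 3 features
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.randn(100)

model = LinearRegression()
model.fit(X, y)
print(np.mean((model.predict(X) - y) ** 2))  # training MSE, should be near zero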

Documentation

For more details on the available models, see the project documentation.

Available models

  1. Gaussian mixture model

    • EM training
  2. Hidden Markov model

    • Viterbi decoding
    • Likelihood computation
    • MLE parameter estimation via Baum-Welch/forward-backward algorithm
  3. Latent Dirichlet allocation (topic model)

    • Standard model with MLE parameter estimation via variational EM
    • Smoothed model with MAP parameter estimation via MCMC
  4. Neural networks

    • Layers / Layer-wise ops
      • Add
      • Flatten
      • Multiply
      • Softmax
      • Fully-connected/Dense
      • Sparse evolutionary connections
      • LSTM
      • Elman-style RNN
      • Max + average pooling
      • Dot-product attention
      • Embedding layer
      • Restricted Boltzmann machine (w. CD-n training)
      • 2D deconvolution (w. padding and stride)
      • 2D convolution (w. padding, dilation, and stride)
      • 1D convolution (w. padding, dilation, stride, and causality)
    • Modules
      • Bidirectional LSTM
      • ResNet-style residual blocks (identity and convolution)
      • WaveNet-style residual blocks with dilated causal convolutions
      • Transformer-style multi-headed scaled dot-product attention
    • Regularizers
      • Dropout
    • Normalization
      • Batch normalization (spatial and temporal)
      • Layer normalization (spatial and temporal)
    • Optimizers
      • SGD w/ momentum
      • AdaGrad
      • RMSProp
      • Adam
    • Learning Rate Schedulers
      • Constant
      • Exponential
      • Noam/Transformer
      • Dlib scheduler
    • Weight Initializers
      • Glorot/Xavier uniform and normal
      • He/Kaiming uniform and normal
      • Standard and truncated normal
    • Losses
      • Cross entropy
      • Squared error
      • Bernoulli VAE loss
      • Wasserstein loss with gradient penalty
      • Noise contrastive estimation loss
    • Activations
      • ReLU
      • Tanh
      • Affine
      • Sigmoid
      • Leaky ReLU
      • ELU
      • SELU
      • Exponential
      • Hard Sigmoid
      • Softplus
    • Models
      • Bernoulli variational autoencoder
      • Wasserstein GAN with gradient penalty
      • word2vec encoder with skip-gram and CBOW architectures
    • Utilities
      • col2im (MATLAB port)
      • im2col (MATLAB port)
      • conv1D
      • conv2D
      • deconv2D
      • minibatch
  5. Tree-based models

    • Decision trees (CART)
    • [Bagging] Random forests
    • [Boosting] Gradient-boosted decision trees
  6. Linear models

    • Ridge regression (see the pure-NumPy sketch after this list)
    • Logistic regression
    • Ordinary least squares
    • Bayesian linear regression w/ conjugate priors
      • Unknown mean, known variance (Gaussian prior)
      • Unknown mean, unknown variance (Normal-Gamma / Normal-Inverse-Wishart prior)
  7. n-Gram sequence models

    • Maximum likelihood scores
    • Additive/Lidstone smoothing
    • Simple Good-Turing smoothing
  8. Multi-armed bandit models

    • UCB1
    • LinUCB
    • Epsilon-greedy
    • Thompson sampling w/ conjugate priors
      • Beta-Bernoulli sampler
  9. Reinforcement learning models

    • Cross-entropy method agent
    • First visit on-policy Monte Carlo agent
    • Weighted incremental importance sampling Monte Carlo agent
    • Expected SARSA agent
    • TD(0) Q-learning agent
    • Dyna-Q / Dyna-Q+ with prioritized sweeping
  10. Nonparametric models

    • Nadaraya-Watson kernel regression
    • k-Nearest neighbors classification and regression
    • Gaussian process regression
  11. Matrix factorization

    • Regularized alternating least-squares
    • Non-negative matrix factorization
  12. Preprocessing

    • Discrete Fourier transform (1D signals)
    • Discrete cosine transform (type-II) (1D signals)
    • Bilinear interpolation (2D signals)
    • Nearest neighbor interpolation (1D and 2D signals)
    • Autocorrelation (1D signals)
    • Signal windowing
    • Text tokenization
    • Feature hashing
    • Feature standardization
    • One-hot encoding / decoding
    • Huffman coding / decoding
    • Term frequency-inverse document frequency (TF-IDF) encoding
    • MFCC encoding
  13. Utilities

    • Similarity kernels
    • Distance metrics
    • Priority queue
    • Ball tree
    • Discrete sampler
    • Graph processing and generators
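
To give a flavor of the implementations above, here is a compact pure-NumPy version of one model from the list: ridge regression via its closed-form solution w = (X^T X + alpha * I)^{-1} X^T y. This is an illustrative sketch in the spirit of the repo, not the repository's own code; the helper names ridge_fit / ridge_predict are hypothetical.

import numpy as np

def ridge_fit(X, y, alpha=1.0):
    """Closed-form ridge regression: w = (X^T X + alpha * I)^{-1} X^T y."""
    A = X.T @ X + alpha * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ y)  # solve() is more stable than inverting A

def ridge_predict(X, w):
    return X @ w

# Toy usage: recover known weights from noisy data
rng = np.random.RandomState(0)
X = rng.randn(200, 3)
y = X @ np.array([2.0, -1.0, 0.5]) + 0.05 * rng.randn(200)
print(ridge_fit(X, y, alpha=0.1))  # approximately [2.0, -1.0, 0.5]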

Contributing

Am I missing your favorite model? Is there something that could be cleaner / less confusing? Did I mess something up? Submit a PR! The only requirement is that your models are written with just the Python standard library and NumPy. The SciPy library is also permitted under special circumstances ;)

See full contributing guidelines here.
