Regression Metrics Calculation Made Easy for TensorFlow 2 and scikit-learn

Overview

regressionmetrics is a Python package that makes common regression metrics easy to compute, with implementations for both scikit-learn-style (NumPy) workflows and TensorFlow 2 / Keras models.

Regression Metrics

The package implements the following metrics, each available both in scikit-learn style (NumPy) and as a Keras metric unless noted otherwise; a short sketch of two of the less common definitions follows the list.

  • Mean Absolute Error (MAE) - sklearn, keras
  • Mean Squared Error (MSE) - sklearn, keras
  • Root Mean Squared Error (RMSE) - sklearn, keras
  • Root Mean Squared Logarithmic Error (RMSLE) - sklearn, keras
  • Root Mean Squared Logarithmic Error with negative value handling - sklearn
  • R2 Score - sklearn, keras
  • Adjusted R2 Score - sklearn, keras
  • Mean Absolute Percentage Error (MAPE) - sklearn, keras
  • Mean Squared Logarithmic Error (MSLE) - sklearn, keras
  • Symmetric Mean Absolute Percentage Error (SMAPE) - sklearn, keras
  • Normalized Root Mean Squared Error (NRMSE) - sklearn, keras
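
For reference, here is a minimal sketch of two of the less standard definitions in the list above, using widely used textbook formulas. This is illustrative only and is not taken from the package source, so the package's exact implementation (for example its normalization choices, or how it chooses the predictor count p for adjusted R2) may differ:

import numpy as np
from sklearn.metrics import r2_score

def adjusted_r2(y_true, y_pred, p=1):
    # Common definition: penalize R2 by the number of predictors p over n samples
    n = len(y_true)
    return 1 - (1 - r2_score(y_true, y_pred)) * (n - 1) / (n - p - 1)

def smape(y_true, y_pred):
    # Common SMAPE definition, reported as a percentage (variants exist)
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return 100.0 * np.mean(2.0 * np.abs(y_pred - y_true) / (np.abs(y_true) + np.abs(y_pred)))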

Installation

To install the package from the PyPI repository, run:

pip install regressionmetrics

If you prefer, you can install from source. Use the following commands to clone the repository from GitHub and install it together with its dependencies:

git clone https://github.com/ashishpatel26/regressionmetrics.git
cd regressionmetrics
pip install .
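
Either way, you can confirm the installation afterwards (an optional check using pip itself):

pip show regressionmetrics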

Usage

Usage with scikit-learn:

from regressionmetrics.metrics import *

y_true = [3, 0.5, 2, 7]     # ground-truth values
y_pred = [2.5, 0.0, 2, -8]  # predictions (note the negative value)

print("R2 Score:", r2(y_true, y_pred))
print("Adjusted R2 Score:", adj_r2(y_true, y_pred))
print("RMSE:", rmse(y_true, y_pred))
print("MAE:", mae(y_true, y_pred))
print("RMSLE with negative values:", rmsle_with_negval(y_true, y_pred))
print("MSE:", mse(y_true, y_pred))
print("MAPE:", mape(y_true, y_pred))

Usage with TensorFlow Keras:

from regressionmetrics.keras import *

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# Load the Boston Housing regression dataset
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.boston_housing.load_data(path="boston_housing.npz", test_split=0.2, seed=113)

# A small fully connected regression model
model = keras.Sequential([
    layers.Dense(64, activation='relu', input_shape=(x_train.shape[1],)),
    layers.Dense(64, activation='relu'),
    layers.Dense(1)
])

# Track the package's regression metrics during training
model.compile(optimizer='rmsprop', loss='mse', metrics=[r2, mae, mse, rmse, mape, rmsle, nrmse])
model.fit(x_train, y_train, epochs=10, batch_size=32, validation_data=(x_test, y_test))
Epoch 1/10
13/13 [==============================] - 1s 15ms/step - loss: 270.0653 - r2: 0.9472 - mae: 11.5427 - mse: 270.0653 - rmse: 11.5427 - mape: 57.3519 - rmsle: 0.6445 - nrmse: 0.5735 - val_loss: 88.6351 - val_r2: 0.9727 - val_mae: 6.6028 - val_mse: 88.6351 - val_rmse: 6.6028 - val_mape: 29.6502 - val_rmsle: 0.3161 - val_nrmse: 0.2965
Epoch 2/10
13/13 [==============================] - 0s 3ms/step - loss: 87.1876 - r2: 0.9856 - mae: 6.9466 - mse: 87.1876 - rmse: 6.9466 - mape: 33.4256 - rmsle: 0.3057 - nrmse: 0.3343 - val_loss: 81.7884 - val_r2: 0.9712 - val_mae: 6.6424 - val_mse: 81.7884 - val_rmse: 6.6424 - val_mape: 28.8687 - val_rmsle: 0.3334 - val_nrmse: 0.2887
Epoch 3/10
13/13 [==============================] - 0s 3ms/step - loss: 103.6462 - r2: 0.9825 - mae: 7.1041 - mse: 103.6462 - rmse: 7.1041 - mape: 34.6278 - rmsle: 0.3231 - nrmse: 0.3463 - val_loss: 71.7539 - val_r2: 0.9769 - val_mae: 6.1455 - val_mse: 71.7539 - val_rmse: 6.1455 - val_mape: 27.5078 - val_rmsle: 0.2893 - val_nrmse: 0.2751
Epoch 4/10
13/13 [==============================] - 0s 3ms/step - loss: 88.1601 - r2: 0.9823 - mae: 6.8479 - mse: 88.1601 - rmse: 6.8479 - mape: 32.5867 - rmsle: 0.3080 - nrmse: 0.3259 - val_loss: 63.3707 - val_r2: 0.9829 - val_mae: 6.0845 - val_mse: 63.3707 - val_rmse: 6.0845 - val_mape: 33.1628 - val_rmsle: 0.2747 - val_nrmse: 0.3316
Epoch 5/10
13/13 [==============================] - 0s 3ms/step - loss: 82.3233 - r2: 0.9860 - mae: 6.5795 - mse: 82.3233 - rmse: 6.5795 - mape: 32.5198 - rmsle: 0.3105 - nrmse: 0.3252 - val_loss: 74.4783 - val_r2: 0.9813 - val_mae: 6.8936 - val_mse: 74.4783 - val_rmse: 6.8936 - val_mape: 41.9492 - val_rmsle: 0.3067 - val_nrmse: 0.4195
Epoch 7/10
13/13 [==============================] - 0s 3ms/step - loss: 76.0740 - r2: 0.9856 - mae: 6.4234 - mse: 76.0740 - rmse: 6.4234 - mape: 31.8728 - rmsle: 0.2828 - nrmse: 0.3187 - val_loss: 104.1779 - val_r2: 0.9679 - val_mae: 7.5539 - val_mse: 104.1779 - val_rmse: 7.5539 - val_mape: 30.9401 - val_rmsle: 0.3692 - val_nrmse: 0.3094
Epoch 8/10
13/13 [==============================] - 0s 4ms/step - loss: 68.4268 - r2: 0.9892 - mae: 5.9540 - mse: 68.4268 - rmse: 5.9540 - mape: 29.7586 - rmsle: 0.2623 - nrmse: 0.2976 - val_loss: 171.7968 - val_r2: 0.9412 - val_mae: 10.5855 - val_mse: 171.7968 - val_rmse: 10.5855 - val_mape: 47.9010 - val_rmsle: 0.7561 - val_nrmse: 0.4790
Epoch 9/10
13/13 [==============================] - 0s 3ms/step - loss: 92.3889 - r2: 0.9796 - mae: 6.8932 - mse: 92.3889 - rmse: 6.8932 - mape: 33.2856 - rmsle: 0.3333 - nrmse: 0.3329 - val_loss: 67.2208 - val_r2: 0.9808 - val_mae: 5.8498 - val_mse: 67.2208 - val_rmse: 5.8498 - val_mape: 26.4504 - val_rmsle: 0.2680 - val_nrmse: 0.2645
Epoch 10/10
13/13 [==============================] - 0s 3ms/step - loss: 78.3823 - r2: 0.9856 - mae: 6.5958 - mse: 78.3823 - rmse: 6.5958 - mape: 32.8136 - rmsle: 0.3025 - nrmse: 0.3281 - val_loss: 69.5314 - val_r2: 0.9787 - val_mae: 6.8302 - val_mse: 69.5314 - val_rmse: 6.8302 - val_mape: 37.3933 - val_rmsle: 0.2974 - val_nrmse: 0.3739
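
After training, the same metrics can be read back with the standard Keras evaluation API (a minimal sketch reusing the model, x_test and y_test defined above; only standard Keras calls are used here):

# Evaluate on the held-out split; values are returned in the order of compile(metrics=...)
results = model.evaluate(x_test, y_test, verbose=0)
for name, value in zip(model.metrics_names, results):
    print(f"{name}: {value:.4f}")

# Generate predictions for a few test samples
print(model.predict(x_test[:5]).ravel())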

😃 Thanks for reading and forking.


Comments
  • Very nice toolkit

    This isn't really an issue. I wanted to thank you for sharing such a nice toolkit for regression tasks with TensorFlow.

    Do you have a similar toolkit for classification?

    opened by happypanda5 0
Releases (v1.4.0)
  • v1.4.0 (Oct 30, 2021)

    • Changelog for v1.4.0 (2022-01-13)
      • Name clashes with Keras metric names resolved

    • Changelog for v1.3.0 (2021-11-18)
      • New regression metrics added, with detailed explanations

    • Changelog for v1.2.0 (2021-10-31)
      • Adjusted R2 score error fixed

    • Changelog for v1.1.0 (2021-10-31)
      • Some errors fixed

    • Changelog for v1.0.0 (2021-10-31)
      • First release (1.0.0) of the regressionmetrics package

    Source code (tar.gz)
    Source code (zip)
Owner
Ashish Patel
AI Researcher & Senior Data Scientist at Softweb Solutions, Avnet Solutions (Fortune 500) | Rank 3 Kaggle Kernel Master