Simplified interface for TensorFlow (mimicking Scikit Learn) for Deep Learning

Overview

SkFlow has been moved to TensorFlow.

SkFlow has been moved into the contrib folder of TensorFlow (http://github.com/tensorflow/tensorflow), and development will continue there. Please submit any issues and pull requests to the TensorFlow repository instead.

This repository will ramp down: after the next TensorFlow release, the code here will be wound down. Please see the TensorFlow repository for the most recent installation instructions.

Comments
  • How do I do multilabel image classification?

    Do I have to make changes in the multioutput file? Ideally I want to train any model, like Inception, on my training data, which has multiple labels per image. How do I do that? (A generic sketch follows this item.)

    help wanted examples 
    opened by unography 21
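
    For the multilabel question above, one common approach (independent of skflow) is to replace the softmax over mutually exclusive classes with one sigmoid output per label and a sigmoid cross-entropy loss. The snippet below is a minimal, hypothetical sketch in plain TensorFlow, not skflow's built-in API; the function names are illustrative.

    import tensorflow as tf

    def multilabel_loss(logits, labels):
        # labels is a float {0, 1} matrix of shape [batch, n_labels];
        # each label is scored by its own sigmoid, so several can be 1 at once.
        per_label = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits)
        return tf.reduce_mean(per_label)

    def predict_labels(logits, threshold=0.5):
        # any label whose sigmoid probability clears the threshold is predicted.
        return tf.cast(tf.sigmoid(logits) > threshold, tf.int32)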
  • Add early stopping and reporting based on validation data

    This PR allows a user to specify a validation dataset that is used for early stopping (and reporting). The PR was created to address issue 85. (A generic sketch of the early-stopping loop follows this item.)

    I made changes in 3 places.

    1. The trainer now takes a dictionary containing the validation data (in the same format as the output of the data feeder's get_dict_fn).
    2. The fit method now takes arguments for val_X and val_y. It converts these into the correct format for the trainer.
    3. The example file digits.py now uses early stopping, by supplying val_X and val_y.

    I can add early stopping to other examples if this approach looks good, though their behavior should not otherwise be affected by the current PR.

    cla: yes 
    opened by dansbecker 14
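
    A generic sketch of the validation-based early stopping this PR describes (the estimator interface and the metric below are assumptions, not skflow's exact internals):

    import numpy as np

    def fit_with_early_stopping(estimator, X, y, val_X, val_y,
                                max_epochs=100, patience=5):
        # stop once the validation loss has not improved for `patience` epochs.
        best_loss, best_epoch = np.inf, 0
        for epoch in range(max_epochs):
            estimator.partial_fit(X, y)  # assumed: one more pass over the training data
            val_loss = np.mean((estimator.predict(val_X) - val_y) ** 2)  # any validation metric
            if val_loss < best_loss:
                best_loss, best_epoch = val_loss, epoch
            elif epoch - best_epoch >= patience:
                break
        return estimator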
  • Class weight support

    Hi,

    I am using skflow.ops.dnn to classify a two-class dataset (True and False). The percentage of True examples is very small, so I have an imbalanced dataset.

    It seems to me that one way to resolve the issue is to use weighted classes. However, when I look at the implementation of skflow.ops.dnn, I do not see how I could use weighted classes with the DNN.

    Is it possible to do that with skflow, or is there another technique for dealing with the imbalanced-dataset problem in skflow? (A sketch of a weighted loss follows this item.)

    Thanks

    enhancement 
    opened by vinhqdang 13
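
    A minimal sketch of one way to handle this in plain TensorFlow: up-weight the rare positive class inside a sigmoid cross-entropy loss and plug that loss into a custom model function. The weighting scheme and names here are illustrative, not skflow's API.

    import tensorflow as tf

    def weighted_binary_loss(logits, labels, pos_weight):
        # labels is a float {0, 1} tensor; pos_weight might be n_negative / n_positive.
        per_example = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits)
        weights = 1.0 + (pos_weight - 1.0) * labels  # pos_weight where label == 1, else 1.0
        return tf.reduce_mean(weights * per_example)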
  • Added verbose option

    I added an option to control the verbosity. For this, I added a "verbose" parameter to the __init__ method in the __init__.py file and to the train function in trainer.py. I also passed this argument through to the self._trainer.train() call in the __init__ file and added a condition guarding the prints in trainer.py.

    cla: no 
    opened by ivallesp 12
  • Predict batch size default

    This changes the default batch size for prediction to be the same as for training, enabling efficient grid search. Previously GridSearchCV would try to make predictions in a single batch, which could take a lot of memory.

    This also adds a simple example of using skflow with GridSearchCV.

    cla: no 
    opened by mheilman 11
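
    A sketch of what such a grid search might look like, assuming the historical skflow constructor names (TensorFlowDNNClassifier, learning_rate, batch_size); scikit-learn's GridSearchCV works with any estimator that follows its fit/predict/get_params interface:

    from sklearn import datasets
    from sklearn.model_selection import GridSearchCV
    import skflow  # the historical skflow package

    iris = datasets.load_iris()
    classifier = skflow.TensorFlowDNNClassifier(hidden_units=[10, 10], n_classes=3, steps=200)
    grid = GridSearchCV(classifier,
                        param_grid={'learning_rate': [0.01, 0.1],
                                    'batch_size': [32, 128]},
                        cv=3)
    grid.fit(iris.data, iris.target)
    print(grid.best_params_)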
  • Add example accessing of weights

    It wasn't clear how to access weights using the classifier.get_tensor_value('foo') syntax. This adds some examples for the CNN model. They were figured out by logging the training run as if for TensorBoard, and then running strings on the log file to look for the right namespace.

    Is there a better way to access these weights? Or to learn their names? The logging must walk through the graph and record these names. Maybe if there were a way to quickly list all the names, that'd be enough for advanced users to figure it out.

    cla: yes 
    opened by dvbuntu 10
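
    One quick way to list candidate names in graph-mode TensorFlow, assuming the model was built in (or its graph can be set as) the default graph; nothing here is a documented skflow helper:

    import tensorflow as tf

    # print every operation name in the current default graph; the weight
    # variables usually stand out by name and can then be fetched with
    # classifier.get_tensor_value(<name>).
    for op in tf.get_default_graph().get_operations():
        print(op.name)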
  • Plotting neural network built by skflow

    Hi,

    Sorry for asking so much.

    I think plotting the network is always a nice feature. Is it possible right now with skflow (or can we do it through TensorFlow directly)?

    opened by vinhqdang 10
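
    skflow itself does not draw plots, but TensorBoard can render the computation graph. A TensorFlow 1.x-style sketch (the log directory is arbitrary; skflow's own logdir argument, mentioned in the next item, serves the same purpose):

    import tensorflow as tf

    # write the current graph so TensorBoard can visualize it, then run:
    #   tensorboard --logdir=/tmp/skflow_logs
    writer = tf.summary.FileWriter('/tmp/skflow_logs', graph=tf.get_default_graph())
    writer.flush()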
  • move monitor and logdir arguments to init

    opened by mheilman 8
  • Exception when running language model example

    Hi,

    Thanks for making this tool. It will definitely make things easier for NN newcomers.

    I just tried running your language model example and got the following exception:

    Traceback (most recent call last):
      File "test.py", line 84, in <module>
        estimator.fit(X, y)
      File "/Users/aleksandar/tensorflow/lib/python3.5/site-packages/skflow/estimators/base.py", line 243, in fit
        feed_params_fn=self._data_feeder.get_feed_params)
      File "/Users/aleksandar/tensorflow/lib/python3.5/site-packages/skflow/trainer.py", line 114, in train
        feed_dict = feed_dict_fn()
      File "/Users/aleksandar/tensorflow/lib/python3.5/site-packages/skflow/io/data_feeder.py", line 307, in _feed_dict_fn
        inp[i, :] = six.next(self.X)
    StopIteration
    

    I made sure that my Python distribution has the correct version of six. I tried running it both in a virtual environment and in a plain Python 3 install. Any ideas what might be causing this?

    opened by savkov 7
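
    A hypothetical workaround sketch: the traceback shows the data feeder calling six.next() on an exhausted iterator, so supplying an input that can be consumed indefinitely (for example via itertools.cycle over an in-memory corpus) avoids the StopIteration. Whether that matches the example's intended data pipeline is an assumption.

    import itertools

    corpus = ["first document ...", "second document ..."]  # placeholder data
    X = itertools.cycle(corpus)  # restarts from the beginning instead of raising StopIteration
    print(next(X), next(X), next(X))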
  • another ValidationMonitor with validation(+early stopping) per epoch

    From what I understand, the existing ValidationMonitor performs validation every [print_steps] steps and checks the stop condition every [early_stopping_rounds] steps. I'd like to add another ValidationMonitor that performs validation and checks the stopping condition once per epoch. Is this the recommended practice in machine learning for validation and early stopping? In other words, I'd like a fit process along these lines:

    def fit(self, x_train, y_train, x_validate, y_validate):
        previous_validation_loss = float('inf')
        while True:
            self.train_one_more_epoch(x_train, y_train)  # hypothetical per-epoch training step
            current_validation_loss = some_error(y_validate, self.predict(x_validate))
            if current_validation_loss >= previous_validation_loss:
                break  # validation loss stopped improving
            previous_validation_loss = current_validation_loss
    
    enhancement help wanted 
    opened by alanyuchenhou 7
  • Example of language model

    Add an example of a language model (RNN), for example a character-level model trained on a Shakespeare text (similar to https://github.com/sherjilozair/char-rnn-tensorflow).

    examples 
    opened by ilblackdragon 7
  • .travis.yml: The 'sudo' tag is now deprecated in Travis CI

    opened by cclauss 1
  • Why hasn't this repo been archived yet?

    New versions of TensorFlow have been released since the last commit to this repo. As far as I understand from the README of this project, you intended to close this repo, so why hasn't that been done yet?

    opened by nbro 0
Releases (v0.1)
  • v0.1 (Feb 14, 2016)
