The Ladder Network is a deep learning algorithm that combines supervised and unsupervised learning.

Overview

This repository contains the source code for the experiments in the paper Semi-Supervised Learning with Ladder Networks by A. Rasmus, H. Valpola, M. Honkala, M. Berglund, and T. Raiko.

Required libraries

Install Theano, Blocks stable 0.2, and Fuel stable 0.2.

Refer to the Blocks installation instructions for details, but use the tag v0.2 instead. Something along the lines of:

pip install git+git://github.com/mila-udem/blocks.git@v0.2
pip install git+git://github.com/mila-udem/fuel.git@v0.2

Fuel comes with Blocks, but you still need to download and convert the datasets; refer to the Fuel documentation. You might need to rename the converted files afterwards.

fuel-download mnist
fuel-convert mnist --dtype float32
fuel-download cifar10
fuel-convert cifar10
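
The commands above typically write the converted .hdf5 files into the current working directory, and Fuel then locates datasets through its data path. A minimal sketch of one way to wire this up (assuming Fuel's standard FUEL_DATA_PATH mechanism; the exact filenames the training code expects may differ, hence the renaming note above):

# Run the download/convert commands above inside a data directory, then:
export FUEL_DATA_PATH=$PWD
ls "$FUEL_DATA_PATH"/*.hdf5   # check the produced filenames before training
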
Alternatively, you can use the environment.yml file provided in this repo to create a conda environment.
  1. First install Anaconda from https://www.continuum.io/downloads. Then:
  2. conda env create -f environment.yml
  3. source activate ladder
  4. The environment should be good to go!
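
A quick, optional sanity check that the main libraries import correctly (a minimal sketch, assuming the ladder environment is active):

python -c "import theano, blocks, fuel; print(theano.__version__)"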

Models in the paper

The following commands train the models with seed 1. The numbers reported in the paper are averages over several random seeds. These commands use all the training samples for training (--unlabeled-samples 60000) and none for validation, which results in a lot of NaNs being printed during training, since validation statistics are not available. If you want to observe the validation error and costs during training, use --unlabeled-samples 50000 instead. Note that --denoising-cost-x takes one denoising cost weight per layer, from the input layer up to the top, which is why its list has one more entry than --encoder-layers.
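
For example, a validation-enabled variant of the MNIST 100-label full model below, with only --unlabeled-samples changed to 50000 (the trailing mnist_100_full_valid is just an illustrative results name):

run.py train --encoder-layers 1000-500-250-250-250-10 --decoder-spec gauss --denoising-cost-x 1000,10,0.1,0.1,0.1,0.1,0.1 --labeled-samples 100 --unlabeled-samples 50000 --seed 1 -- mnist_100_full_valid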

MNIST all labels
# Full
run.py train --encoder-layers 1000-500-250-250-250-10 --decoder-spec gauss --denoising-cost-x 1000,1,0.01,0.01,0.01,0.01,0.01 --labeled-samples 60000 --unlabeled-samples 60000 --seed 1 -- mnist_all_full
# Bottom
run.py train --encoder-layers 1000-500-250-250-250-10 --decoder-spec gauss --denoising-cost-x 2000,0,0,0,0,0,0 --labeled-samples 60000 --unlabeled-samples 60000 --seed 1 -- mnist_all_bottom
# Gamma model
run.py train --encoder-layers 1000-500-250-250-250-10 --decoder-spec 0-0-0-0-0-0-gauss --denoising-cost-x 0,0,0,0,0,0,2 --labeled-samples 60000 --unlabeled-samples 60000 --seed 1 -- mnist_all_gamma
# Supervised baseline
run.py train --encoder-layers 1000-500-250-250-250-10 --decoder-spec 0-0-0-0-0-0-0 --denoising-cost-x 0,0,0,0,0,0,0 --labeled-samples 60000 --unlabeled-samples 60000 --f-local-noise-std 0.5 --seed 1 -- mnist_all_baseline

MNIST 100 labels
# Full
run.py train --encoder-layers 1000-500-250-250-250-10 --decoder-spec gauss --denoising-cost-x 1000,10,0.1,0.1,0.1,0.1,0.1 --labeled-samples 100 --unlabeled-samples 60000 --seed 1 -- mnist_100_full
# Bottom-only
run.py train --encoder-layers 1000-500-250-250-250-10 --decoder-spec gauss --denoising-cost-x 5000,0,0,0,0,0,0 --labeled-samples 100 --unlabeled-samples 60000 --seed 1 -- mnist_100_bottom
# Gamma
run.py train --encoder-layers 1000-500-250-250-250-10 --decoder-spec 0-0-0-0-0-0-gauss --denoising-cost-x 0,0,0,0,0,0,0.5 --labeled-samples 100 --unlabeled-samples 60000 --seed 1 -- mnist_100_gamma
# Supervised baseline
run.py train --encoder-layers 1000-500-250-250-250-10 --decoder-spec 0-0-0-0-0-0-0 --denoising-cost-x 0,0,0,0,0,0,0 --labeled-samples 100 --unlabeled-samples 60000 --f-local-noise-std 0.5 --seed 1 -- mnist_100_baseline

MNIST 1000 labels
# Full
run.py train --encoder-layers 1000-500-250-250-250-10 --decoder-spec gauss --denoising-cost-x 2000,20,0.1,0.1,0.1,0.1,0.1 --f-local-noise-std 0.2 --labeled-samples 1000 --unlabeled-samples 60000 --seed 1 -- mnist_1000_full
# Bottom-only
run.py train --encoder-layers 1000-500-250-250-250-10 --decoder-spec gauss --denoising-cost-x 2000,0,0,0,0,0,0 --labeled-samples 1000 --unlabeled-samples 60000 --seed 1 -- mnist_1000_bottom
# Gamma model
run.py train --encoder-layers 1000-500-250-250-250-10 --decoder-spec 0-0-0-0-0-0-gauss --denoising-cost-x 0,0,0,0,0,0,10 --labeled-samples 1000 --unlabeled-samples 60000 --seed 1 -- mnist_1000_gamma
# Supervised baseline
run.py train --encoder-layers 1000-500-250-250-250-10 --decoder-spec 0-0-0-0-0-0-0 --denoising-cost-x 0,0,0,0,0,0,0 --labeled-samples 1000 --unlabeled-samples 60000 --f-local-noise-std 0.5 --seed 1 -- mnist_1000_baseline

MNIST 50 labels
# Full model
run.py train --encoder-layers 1000-500-250-250-250-10 --decoder-spec gauss --denoising-cost-x 2000,20,0.1,0.1,0.1,0.1,0.1 --labeled-samples 50 --unlabeled-samples 60000 --seed 1 -- mnist_50_full

MNIST convolutional models
# Conv-FC
run.py train --encoder-layers convv:1000:26:1:1-convv:500:1:1:1-convv:250:1:1:1-convv:250:1:1:1-convv:250:1:1:1-convv:10:1:1:1-globalmeanpool:0 --decoder-spec gauss --denoising-cost-x 1000,10,0.1,0.1,0.1,0.1,0.1,0.1 --labeled-samples 100 --unlabeled-samples 60000 --seed 1 -- mnist_100_conv_fc
# Conv-Small, Gamma
run.py train --encoder-layers convf:32:5:1:1-maxpool:2:2-convv:64:3:1:1-convf:64:3:1:1-maxpool:2:2-convv:128:3:1:1-convv:10:1:1:1-globalmeanpool:6:6-fc:10 --decoder-spec 0-0-0-0-0-0-0-0-0-gauss --denoising-cost-x 0,0,0,0,0,0,0,0,0,1 --labeled-samples 100 --unlabeled-samples 60000 --seed 1  -- mnist_100_conv_gamma
# Conv-Small, supervised baseline. Overfits easily, so keep training short.
run.py train --encoder-layers convf:32:5:1:1-maxpool:2:2-convv:64:3:1:1-convf:64:3:1:1-maxpool:2:2-convv:128:3:1:1-convv:10:1:1:1-globalmeanpool:6:6-fc:10 --decoder-spec 0-0-0-0-0-0-0-0-0-0 --denoising-cost-x 0,0,0,0,0,0,0,0,0,0 --num-epochs 20 --lrate-decay 0.5 --f-local-noise-std 0.45 --labeled-samples 100 --unlabeled-samples 60000 --seed 1 -- mnist_100_conv_baseline

CIFAR models
# Conv-Large, Gamma
./run.py train --encoder-layers convv:96:3:1:1-convf:96:3:1:1-convf:96:3:1:1-maxpool:2:2-convv:192:3:1:1-convf:192:3:1:1-convv:192:3:1:1-maxpool:2:2-convv:192:3:1:1-convv:192:1:1:1-convv:10:1:1:1-globalmeanpool:0 --decoder-spec 0-0-0-0-0-0-0-0-0-0-0-0-gauss --dataset cifar10 --act leakyrelu --denoising-cost-x 0,0,0,0,0,0,0,0,0,0,0,0,4.0 --num-epochs 70 --lrate-decay 0.86 --seed 1 --whiten-zca 3072 --contrast-norm 55 --top-c False --labeled-samples 4000 --unlabeled-samples 50000 -- cifar_4k_gamma
# Conv-Large, supervised baseline. Overfits easily, so keep training short.
./run.py train --encoder-layers convv:96:3:1:1-convf:96:3:1:1-convf:96:3:1:1-maxpool:2:2-convv:192:3:1:1-convf:192:3:1:1-convv:192:3:1:1-maxpool:2:2-convv:192:3:1:1-convv:192:1:1:1-convv:10:1:1:1-globalmeanpool:0 --decoder-spec 0-0-0-0-0-0-0-0-0-0-0-0-0 --dataset cifar10 --act leakyrelu --denoising-cost-x 0,0,0,0,0,0,0,0,0,0,0,0,0 --num-epochs 20 --lrate-decay 0.5 --seed 1 --whiten-zca 3072 --contrast-norm 55 --top-c False --labeled-samples 4000 --unlabeled-samples 50000 -- cifar_4k_baseline

Evaluating models with the test set

After training a model, you can compute its results on the test set with the evaluate command. Example usage after training a model:

./run.py evaluate results/mnist_all_bottom0
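
Note the trailing 0 in the path: judging from the training commands above, the name passed after -- gets a run index appended, so a model trained as mnist_all_bottom ends up under results/mnist_all_bottom0.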