RASP

An interpreter for RASP as described in the ICML 2021 paper "Thinking Like Transformers".

Setup

Mac or Linux

Run ./setup.sh. It will create a python3 virtual environment and install the dependencies for RASP. It will also try to install graphviz (the non-python part) and rlwrap on your machine. If these fail, you will still be able to use RASP, but the interface will not be as pleasant without rlwrap, and drawing s-op computation flows will not be possible without graphviz. After setup, you can run ./rasp.sh to start the RASP read-eval-print loop (REPL).

Windows

Follow the instructions given in windows instructions.txt

The REPL

After setup, if you are on Mac or Linux, you can run ./rasp.sh to start the RASP REPL. Otherwise, run python3 RASP_support/REPL.py. Use Ctrl+C to discard a partially entered command, and Ctrl+D to exit the REPL.

Initial Environment

RASP starts with the base s-ops: tokens, indices, and length. It also has the base functions select, aggregate, and selector_width as described in the paper; a selector full_s, created through select(1,1,==), that makes a "full" attention pattern (every position attends to every position); and several other library functions (check out RASP_support/rasplib.rasp to see them).
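For instance, here is a quick check of two of the base s-ops in a fresh REPL, shown on the default base example "hello" (the printout below is illustrative of what the REPL shows):

>> length;
     s-op: length
 	 Example: length("hello") = [5]*5 (ints)
>> indices;
     s-op: indices
 	 Example: indices("hello") = [0, 1, 2, 3, 4] (ints)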

Additionally, the REPL begins with a base example, "hello", on which it shows the output for each created s-op or selector. This example can be changed, and toggled on and off, through commands to the REPL.

All RASP commands end with a semicolon. Commands to the REPL -- such as changing the base example -- do not.

Start by following along with the examples -- they are kept at the bottom of this readme.

Note on input types:

RASP expects inputs in one of four forms: strings, integers, floats, or booleans, handled respectively by tokens_str, tokens_int, tokens_float, and tokens_bool. Initially, RASP loads with tokens set to tokens_str; this can be changed by assignment, e.g.: tokens=tokens_int;. When changing the input type, you will also want to change the base example, e.g.: set example [0,1,2].
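For example, a short session switching to integer inputs might look roughly as follows (a sketch; the exact printout may differ slightly):

>> tokens=tokens_int;
     s-op: tokens
>> set example [0,1,2]
>> tokens+1;
     s-op: out
 	 Example: out([0, 1, 2]) = [1, 2, 3] (ints)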

Note that assignments do not retroactively change the computation trees of existing s-ops!
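To illustrate (a sketch, with output lines abbreviated): copied_tokens below is built while tokens still refers to tokens_str, so it continues to read string tokens; only s-ops created after the reassignment will use tokens_int.

>> copied_tokens=tokens;
     s-op: copied_tokens
>> tokens=tokens_int;
     s-op: tokens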

Writing and Loading RASP files

To keep and load RASP code from files, save them with the .rasp extension, and use the 'load' command without the extension. For example, you can load the examples file paper_examples.rasp in this repository into the REPL as follows:

>> load "paper_examples";

This will make (almost) all values in the file available in the loading environment (whether the REPL, or a different .rasp file): values whose names begin with an underscore remain private to the file they are written in. Loading files in the REPL will also print a list of all loaded values.
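For instance, a small hypothetical file my_helpers.rasp might look like the sketch below. After load "my_helpers";, quadrupled_indices would be available in the loading environment, but _doubled_indices would stay private to the file:

# my_helpers.rasp (hypothetical)
_doubled_indices = indices + indices;                       # leading underscore: private to this file
quadrupled_indices = _doubled_indices + _doubled_indices;   # visible to whoever loads the file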

Syntax Highlighting

For the Sublime Text editor, you can get syntax highlighting for .rasp files as follows:

  1. Install Package Control for Sublime (you might already have it: look in the menu [Sublime Text]->[Preferences] and see if it's there. If not, follow the instructions at https://packagecontrol.io/installation).
  2. Install the 'PackageDev' package through Package Control ([Sublime Text]->[Preferences]->[Package Control], then type [install package], then [packagedev]).
  3. After installing PackageDev, create a new syntax definition file through [Tools]->[Packages]->[Package Development]->[New Syntax Definition].
  4. Copy the contents of RASP_support/RASP.sublime-syntax into the new syntax definition file, and save it as RASP.sublime-syntax.

[The above basically follows the instructions at http://ilkinulas.github.io/programming/2016/02/05/sublime-text-syntax-highlighting.html, then copies in the contents of the provided RASP.sublime-syntax file.]

Examples

Play around in the REPL!

Try simple elementwise manipulations of s-ops:

>> threexindices = 3 * indices;
     s-op: threexindices
 	 Example: threexindices("hello") = [0, 3, 6, 9, 12] (ints)
>> indices+indices;
     s-op: out
 	 Example: out("hello") = [0, 2, 4, 6, 8] (ints)

Change the base example, and create a selector that focuses each position on all positions before it:

>> set example "hey"
>> prevs=select(indices,indices,<);
     selector: prevs
 	 Example:
 			     h e y
 			 h |      
 			 e | 1    
 			 y | 1 1  

Check the output of an s-op on your new base example:

>> threexindices;
     s-op: threexindices
 	 Example: threexindices("hey") = [0, 3, 6] (ints)

Or on specific inputs:

>> threexindices(["hi","there"]);
	 =  [0, 3] (ints)
>> threexindices("hiya");
	 =  [0, 3, 6, 9] (ints)

Aggregate with the full selection pattern (loaded automatically with the REPL) to compute the proportion of a letter in your input:

>> full_s;
     selector: full_s
 	 Example:
 			     h e y
 			 h | 1 1 1
 			 e | 1 1 1
 			 y | 1 1 1
>> my_frac=aggregate(full_s,indicator(tokens=="e"));
     s-op: my_frac
 	 Example: my_frac("hey") = [0.333]*3 (floats)

Note: when an s-op's output is identical in all positions, RASP simply prints the output of one position, followed by " * X" (where X is the sequence length) to mark the repetition.

Check if a letter is in your input at all:

>> "e" in tokens;
     s-op: out
 	 Example: out("hey") = [T]*3 (bools)

Alternatively, check elementwise whether each of your input tokens belongs to some group:

>> vowels = ["a","e","i","o","u"];
     list: vowels = ['a', 'e', 'i', 'o', 'u']
>> tokens in vowels;
     s-op: out
 	 Example: out("hey") = [F, T, F] (bools)

Draw the computation flow for an s-op you have created, on an input of your choice (this will create a pdf in a comp_flows subfolder of the current directory):

>> draw(my_frac,"abcdeeee");
	 =  [0.5]*8 (floats)

Or simply on the base example:

>> draw(my_frac);
	 =  [0.333]*3 (floats)

If they bother you, turn the examples off, and bring them back when you need them:

>> examples off
>> indices;
     s-op: indices
>> full_s;
     selector: full_s
>> examples on
>> indices;
     s-op: indices
 	 Example: indices("hey") = [0, 1, 2] (ints)

You can also do this selectively, turning only selector or s-op examples on and off, e.g.: selector examples off.
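For instance (illustrative, reusing the prevs selector from above):

>> selector examples off
>> prevs;
     selector: prevs
>> selector examples on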

Create a selector that focuses each position on all other positions containing the same token. But first, set the base example to "hello" for a better idea of what's happening:

>> set example "hello"
>> same_token=select(tokens,tokens,==);
     selector: same_token
 	 Example:
 			     h e l l o
 			 h | 1        
 			 e |   1      
 			 l |     1 1  
 			 l |     1 1  
 			 o |         1

Then, use selector_width to compute, for each position, how many other positions the selector same_token focuses it on. This effectively computes an in-place histogram over the input:

>> histogram=selector_width(same_token);
     s-op: histogram
 	 Example: histogram("hello") = [1, 1, 2, 2, 1] (ints)

For more complicated examples, check out paper_examples.rasp!

Experiments on Transformers

The transformers in the paper were trained, and their attention heatmaps visualised, using the code in this repository: https://github.com/tech-srl/RASP-exps
