

Spectrum

Spectrum is an AI that uses deep learning to generate rap song lyrics.

View Demo
Report Bug
Request Feature
Open In Colab

About The Project

Spectrum is an AI that uses deep learning to generate rap song lyrics.

Built With

This project is built using Python, TensorFlow, and Flask.

Getting Started

Installation

# clone the repo
git clone https://github.com/YigitGunduc/Spectrum.git

# install requirements
pip install -r requirements.txt

Training

# navigate to the Spectrum/AI folder 
cd Spectrum/AI

# run train.py, optionally passing training arguments
python3 train.py [-h] [--epochs EPOCHS] [--save_at SAVE_AT] [--verbose VERBOSE]
                 [--rnn_neurons RNN_NEURONS] [--embed_dim EMBED_DIM]
                 [--dropout DROPOUT] [--num_layers NUM_LAYERS] [--learning_rate LEARNING_RATE]

All arguments are optional; if you leave them empty, the model will construct itself with the default parameters.
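For example, assuming the flags behave as shown in the usage line above, a run matching the defaults listed under Hyperparameters below would look like this:

python3 train.py --epochs 30 --rnn_neurons 256 --embed_dim 64 --dropout 0.3 --num_layers 2 --learning_rate 0.0001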

Generating Text from Trained Model

Call eval.py from the command line, passing the seed text as an argument:

python3 eval.py --seed SEEDTEXT
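For example, with one of the seeds used in the Results section:

python3 eval.py --seed today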

or

from model import Generator

# build the generator and load the pretrained weights
model = Generator()
model.load_weights('../models/model-5-epochs-256-neurons.h5')

# generate 1000 characters of lyrics starting from the seed text
generatedText = model.predict(start_seed=SEED, gen_size=1000)
print(generatedText)
  • If you tweaked the model's parameters while training, initialize the model with the same parameters you trained with; see the sketch below.
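A minimal sketch, assuming Generator accepts the same keyword arguments that model.hyperparams() reports (rnn_neurons, embed_dim, learning_rate, dropout, num_layers); the values and weights path here are hypothetical:

from model import Generator

# hypothetical custom parameters; use whatever you passed to train.py
model = Generator(rnn_neurons=512, embed_dim=128, dropout=0.3, num_layers=2)
model.load_weights('../models/your-custom-model.h5')  # hypothetical path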

Running the Web-App Locally

# navigate to the Spectrum folder 
cd Spectrum

# run app.py
python3 app.py

# check out http://0.0.0.0:8080

API

Spectrum has a free web API; you can send requests to it as shown below:

import requests

response = requests.get("https://spectrumapp.herokuapp.com/api/generate/SEEDTEXT")

# raw JSON response
print(response.json())

# cleaned-up response: just the generated lyrics
print(response.json()["lyrics"])
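The same endpoint also works from the command line (replace SEEDTEXT with your own seed):

curl https://spectrumapp.herokuapp.com/api/generate/SEEDTEXT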

Hyperparameters

epochs = 30
batch size = 128
number of layers = 2 (hidden) + 1 (output)
number of RNN units = 256
dropout probability = 0.3
embedding dimensions = 64
optimizer = Adam
loss = sparse categorical crossentropy

These hyperparameters are the best I have found, but be careful when tuning them: this model can overfit or underfit quite easily. In my experiments, GRUs perform better than LSTMs.
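For reference, here is a minimal TensorFlow sketch that reproduces the layer shapes and parameter counts shown in the summary below; the vocabulary size of 100 and the batch size of 1 are read off that summary, and the real Generator internals may differ:

import tensorflow as tf

VOCAB_SIZE = 100  # inferred from the Dense output shape in the summary

model = tf.keras.Sequential([
    # character embedding: 100 * 64 = 6,400 params
    tf.keras.layers.Embedding(VOCAB_SIZE, 64, batch_input_shape=[1, None]),
    # two stateful GRU layers with dropout, matching the hyperparameters above
    tf.keras.layers.GRU(256, return_sequences=True, stateful=True, dropout=0.3),
    tf.keras.layers.GRU(256, return_sequences=True, stateful=True, dropout=0.3),
    # logits over the character vocabulary: 256 * 100 + 100 = 25,700 params
    tf.keras.layers.Dense(VOCAB_SIZE),
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)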

Info About the Model

>>> from model import Generator
>>> model = Generator()
>>> model.load_weights('../models/model-5-epochs-256-neurons.h5')
>>> model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding (Embedding)        (1, None, 64)             6400      
_________________________________________________________________
gru (GRU)                    (1, None, 256)            247296    
_________________________________________________________________
gru_1 (GRU)                  (1, None, 256)            394752    
_________________________________________________________________
dense (Dense)                (1, None, 100)            25700     
=================================================================
Total params: 674,148
Trainable params: 674,148
Non-trainable params: 0
_________________________________________________________________

>>> model.hyperparams()
Hyper Parameters
+--------------------------+
|rnn_neurons   |        256|
|embed_dim     |         64|
|learning_rate |     0.0001|
|dropout       |        0.3|
|num_layers    |          2|
+--------------------------+
>>>

Roadmap

See the open issues for a list of proposed features (and known issues).

Results

WARNING: There is some offensive language ahead; please stop reading here if you are sensitive to such content. The texts below were generated by Spectrum.

Seed : today

Prediction : 

If that don't, yeah
Weint off the music
It's like a fired-enother foar fool straight for an exactly
Nigga why I id my Door Merican muthafucka

Ng answered by need for blazy hard
The family wish fans dishes rolled up
How better just wanna die
Match all about the moment in I glory
Fire is that attention is the flop and pipe those peokin' distriors
Bitch I been hard and I'm like the Scales me and we're going to school like all-off of the allegit to get the bitches
Yeah kinda too legit back into highin'
A year have it would plobably want

And we all bustin' the conscious in the cusfuckers won't ha
Quite warkie and it's blow, and what? I cannot love him,
Alugal Superman, and the revolution likes migh
I ain't still not I uest the neighborhoo
Powers all too bad show, you crite your bac
When I say way too fathom
If you wanna revell, money, where your face we'll blin
Pulf me very, yo, they pull out for taught nothin' off
I pass a with a nigga hang some, pleas
Fuck me now, it's a

======================================================================
Seed : hello

Prediction : 

hellow motherfucker
You wanna talk on the pockets on Harlotom
I'm legit some more than Volumon
Ridicalab knowledge is blessin' some of your honierby man
We just bust the Flud joke with shoulders on the Statue
Lecock it on everybody want your dices to speak
While she speak cents look back to Pops
He was a nigga when I got behind pictures any Lil Sanvanas
Used to in her lady yaught they never had a bitch
He'll break the jird little rappers kill your children is

I'm prayin' back to ready for that bitch just finished And mised to the gamr
Every eyes on and about that getting common
I'm going to attractived with its
I just went by the crowd get the promise to buy the money-a star big down
Can one sall 'em in me tryna get them days that's how I can break the top
Well, that's hug her hands he screaming like a fucking hip-hop but put a Blidze like rhymin'
Yeah I slack like a Job let your cops got a generres
These West of it today flamping this
Black Kuttle crib, said "Ju Conlie, hold up, fuck the

======================================================================
Seed : bestfriend

Prediction : 

bestfriend
Too much time we tonight
The way I know is a please have no self-back when I be for the fucking weed and a game
What the fuck we wanna be working on the streets make it like a stay down the world is from the head of the real brain
Chain don't come back to the grass
My dick is the one to tell you I'm the fuck
So see me we gon' be fans when you had to hear the window you come to the dick when a little cooleng and I was calling what the fuck is it good as the crown
And I'm representing you finally waitin' in your girl
This is the corner with my brother
I'm just a damn door and the real motherfuckers come got the point my shit is the money on the world

I get it then the conscious that's why I cripp
I might take my own shit so let me have a bad bitch
I'm just had and make the fuck is in the single of the window
I think I ain't got the world is all my gone be mine
They ain't like the half the best between my words
And I'm changing with the heads of the speech
Fuck a bunch of best of a fuck

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

License

Distributed under the MIT License. See LICENSE for more information.
