🚗 INGI Dakar 2K21 - Be the first one on the finish line! 🚗

Overview


This year's first semester Club Info challenge will put you at the head of a car racing team. You will participate in the world's most famous racing contest, the INGI Dakar. Your goal is to build the best car and to beat your opponents by reaching the furthest distance from the starting line.

Challenge statement

Each group will be put in charge of a car racing team. Ultimately, your goal is to reach the furthest distance from the starting line with any of your cars. For this, you will have 6 generations of 20 cars, each generation being produced from the previous one. Your job is thus to implement the algorithm that takes the previous generation of cars as an argument and produces the next generation. Such an algorithm is called a genetic algorithm; the theoretical background is given hereafter.

Genetic algorithms

Genetic algorithms (GA) are inspired by the process of natural selection and are used to solve complex problems. They rely on operators such as mutation, crossover and selection. The GA process is split into generations. Each generation is composed of a finite number of individuals, which are built from the best individuals of the previous generation using one or several operators. The first generation is generally created randomly.

Genetic algorithms are used for a large variety of problems, from optimizing antenna shapes to minimizing the weight of structures carried on Mars rovers.

A genetic algorithm is based on three operators:

  • Mutation: a random modification of one parameter of an individual in the generation;
  • Crossover: the creation of an individual from the parameter values of several members of the previous generation;
  • Selection: keeping only the best individuals of a generation to construct the next one.


The mutation operator ensures that the selection does not get trapped in a local optimum and thereby fail to reach the global optimum for each parameter.
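As a rough illustration of how these three operators fit together (not tied to the challenge API; the individuals here are plain lists of numbers and the fitness function is supplied by the caller), a single generation step could look like this sketch:

import random

def evolve(population, fitness, mutation_rate=0.1):
    """Illustrative generation step: selection, crossover, mutation.
    `population` is a list of parameter lists; `fitness` scores one individual."""
    # Selection: keep the best half of the previous generation as parents
    ranked = sorted(population, key=fitness, reverse=True)
    parents = ranked[: len(population) // 2]

    children = []
    while len(children) < len(population):
        # Crossover: each parameter of the child comes from one of two parents
        a, b = random.sample(parents, 2)
        child = [random.choice(pair) for pair in zip(a, b)]
        # Mutation: randomly perturb some parameters to escape local optima
        child = [g + random.gauss(0, 0.1) if random.random() < mutation_rate else g
                 for g in child]
        children.append(child)
    return children

For example, starting from a random population of 20 individuals with 4 parameters each, repeatedly calling evolve(pop, fitness=sum) drives the population towards individuals with larger parameter sums.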

Some useful links:

Program specifications

The program for the INGI Dakar 2K21 is composed of 7 Python modules:

  • Car.py: Defines the class Car that represents a car of the game. A Car is composed of two Wheels and a Chassis, where the Wheels are located on two of the four Chassis vertices.
  • Chassis.py: Defines the class Chassis that represents a car chassis. A Chassis is represented by four vertices connected with each other in a quadrilateral shape.
  • CustomFormatter.py: Used for logging purposes.
  • Game.py: Defines the class Game that represents a game of INGI Dakar 2K21, i.e. the simulation of the 6 generations of 20 cars.
  • main.py: Entry point of INGI Dakar 2K21, which launches the simulations and computes the score.
  • Terrain.py: Defines the class Terrain that represents the terrain on which the cars are driving.
  • Wheel.py: Defines the class Wheel that represents a car's wheel. A Wheel is defined by its radius and by whether or not it is a motor wheel.

To participate in the challenge, you only have to modify the function next_generation in the module main.py. It takes a representation of the game world (a b2World object) and the previous generation of cars (a list of Car objects) as arguments, and returns the next generation of cars (also a list of Car objects). The car features that you can update for the next generation are given below.
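As a hedged starting point, the skeleton below shows one possible shape for that function: select the best cars, copy them, and mutate the copies. Everything that touches the Car internals (car_distance, mutate_car, copying via deepcopy) is an assumption and has to be adapted to the real classes in Car.py, Wheel.py and Chassis.py:

import copy
import random

# Purely a sketch, not the official solution. The helpers below are
# hypothetical: how a car reports its distance and how its features are
# mutated depends on the real Car, Wheel and Chassis classes.

def car_distance(car):
    """Hypothetical accessor for the distance the car reached in the last run."""
    return getattr(car, "distance", 0.0)

def mutate_car(car):
    """Hypothetical mutation hook: tweak wheel radii, chassis vertices, etc."""
    pass

def next_generation(world, old_generation):
    """Keep the best cars of the previous generation and mutate copies of them."""
    ranked = sorted(old_generation, key=car_distance, reverse=True)
    best = ranked[: max(2, len(ranked) // 4)]       # selection step
    children = []
    while len(children) < len(old_generation):
        child = copy.deepcopy(random.choice(best))  # reproduction; the real Car
                                                    # constructor may be needed instead
        mutate_car(child)                           # mutation step
        children.append(child)
    return children

A concrete (still hypothetical) example of what mutate_car could change is sketched in the Car features section below.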

Car features

A car is composed of the following (the numbers in bold cannot be changed):

  • TWO wheels, one of which is a motor wheel
  • A chassis, composed of FOUR vertices, linked together to form a polygon shape.

The car features that you can modify to reach the maximum distance are the following:

  • Radius of the two wheels, separately.
  • Which wheel is the motor wheel.
  • Position of the four vertices of the chassis.
  • To which of the chassis' vertices the two wheels are attached.

Please consult the corresponding classes to understand how those features are expressed and used in the program.
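Purely to illustrate which knobs such an update could turn, here is a hedged sketch of a feature mutation. Every attribute name used below (wheels, radius, is_motor, chassis.vertices, vertex_index) is invented for the example and must be replaced by whatever Car.py, Wheel.py and Chassis.py actually expose:

import random

def mutate_car_features(car, scale=0.1):
    """Hypothetical example of tweaking the four modifiable features.
    All attribute names here are guesses; check the real classes."""
    # 1. Wheel radii (assumed attribute: wheel.radius)
    for wheel in car.wheels:
        wheel.radius *= 1 + random.uniform(-scale, scale)
    # 2. Which wheel is the motor wheel (assumed attribute: wheel.is_motor)
    if random.random() < 0.2:
        car.wheels[0].is_motor, car.wheels[1].is_motor = \
            car.wheels[1].is_motor, car.wheels[0].is_motor
    # 3. Chassis vertex positions (assumed: chassis.vertices is a list of (x, y))
    car.chassis.vertices = [
        (x + random.uniform(-scale, scale), y + random.uniform(-scale, scale))
        for (x, y) in car.chassis.vertices
    ]
    # 4. Which chassis vertices the wheels are attached to (assumed: wheel.vertex_index)
    if random.random() < 0.2:
        for wheel in car.wheels:
            wheel.vertex_index = random.randrange(4)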

Score computation

To start the simulation of the challenge, first activate the Python virtual environment with source penv/bin/activate, then run the main.py Python module with python3 main.py.

The execution of the challenge, and computation of your final score, is as follows:

  • Each generation contains 20 cars. The maximum distance reached by any of the cars is recorded as the score of this generation.
  • A game is composed of 6 generations. The score of a game is the maximum score among all the generations.
  • To smooth the results, 5 games are launched one after the other. Your final score is the average of the scores you obtained in those games.

At the end of the 5 games, a plot summarizing your results for each game will be shown.
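Concretely, the scoring boils down to the arithmetic in this small sketch (the variable names are illustrative):

# distances[g][c]: distance reached by car c in generation g of one game
def game_score(distances):
    """Score of one game: the best generation score, where a generation's
    score is the best distance reached by any of its 20 cars."""
    return max(max(generation) for generation in distances)

def final_score(games):
    """Final score: the average of the 5 game scores."""
    scores = [game_score(g) for g in games]
    return sum(scores) / len(scores)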

Installation and execution

Installation

To install the project, first clone the repository with the following command:

git clone https://github.com/ClubINFO-INGI-UCLouvain/INGI-Dakar-2K21-Challenge.git

Then, install the needed libraries by running the install.sh script, inside the project directory:

python3 -m venv penv;
source  penv/bin/activate;
chmod +x install.sh;
./install.sh;

Execution

To run the challenge simulation, you can simply run the main.py Python module in the src directory, with the following command:

cd src/
python3 main.py [--seed_terrain SEED] [--seed_car SEED] [--no_UI] [--no_plot]

The command line arguments, all optional, are the following:

  • --seed_terrain SEED (with SEED an integer): sets the seed for the random generation of the game terrain to SEED, for reproducibility of the simulations
  • --seed_car SEED (with SEED an integer): sets the seed for the random generation of the first generation of cars to SEED, for reproducibility of the simulations
  • --no_UI: does not show the graphical interface of the game, which drastically speeds up the simulations
  • --no_plot: does not show the plot of the games' result at the end of all the games

Note that, for the contest, the seeds will be fixed to ensure fairness among the groups.

There is also a hidden argument; maybe you can try to find it 😉

Owner
ClubINFO INGI (UCLouvain)