Isaac Gym Environments for Legged Robots

Overview

This repository provides the environment used to train ANYmal (and other robots) to walk on rough terrain using NVIDIA's Isaac Gym. It includes all components needed for sim-to-real transfer: actuator network, friction & mass randomization, noisy observations and random pushes during training.
Maintainer: Nikita Rudin
Affiliation: Robotic Systems Lab, ETH Zurich
Contact: [email protected]

Useful Links

Project website: https://leggedrobotics.github.io/legged_gym/
Paper: https://arxiv.org/abs/2109.11978

Installation

  1. Create a new Python virtual env with Python 3.6, 3.7 or 3.8 (3.8 recommended)
  2. Install PyTorch 1.10 with CUDA 11.3:
    • pip3 install torch==1.10.0+cu113 torchvision==0.11.1+cu113 torchaudio==0.10.0+cu113 -f https://download.pytorch.org/whl/cu113/torch_stable.html
  3. Install Isaac Gym
    • Download and install Isaac Gym Preview 3 (Preview 2 will not work!) from https://developer.nvidia.com/isaac-gym
    • cd isaacgym_lib/python && pip install -e .
    • Try running an example: python examples/1080_balls_of_solitude.py
    • For troubleshooting check the docs: isaacgym/docs/index.html
  4. Install rsl_rl (PPO implementation)
    • Clone https://github.com/leggedrobotics/rsl_rl
    • cd rsl_rl && pip install -e .
  5. Install legged_gym (a quick import check is sketched after this list)
    • Clone this repository
    • cd legged_gym && git checkout develop && pip install -e .
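
After installation, a quick import check can catch most setup problems. This is a minimal sketch (it only assumes the package names used by the installs above); note that isaacgym must be imported before torch, otherwise Isaac Gym aborts with an import-order error.

    import isaacgym   # must be imported before torch
    import torch
    import rsl_rl
    import legged_gym

    print("torch", torch.__version__, "| CUDA available:", torch.cuda.is_available())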

Code Structure

  1. Each environment is defined by an env file (legged_robot.py) and a config file (legged_robot_config.py). The config file contains two classes: one containing all the environment parameters (LeggedRobotCfg) and one for the training parameters (LeggedRobotCfgPPO).
  2. Both env and config classes use inheritance.
  3. Each non-zero reward scale specified in cfg will add a function with a corresponding name to the list of elements which will be summed to get the total reward (see the sketch after this list).
  4. Tasks must be registered using task_registry.register(name, EnvClass, EnvConfig, TrainConfig). This is done in envs/__init__.py, but can also be done from outside of this repository.
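
As a sketch of point 3 (the subclass names below are placeholders; the import paths assume the default repo layout): a non-zero scale such as feet_air_time in cfg.rewards.scales is matched by name to a method _reward_feet_air_time() on the environment class, whose return value is multiplied by the scale and added to the total reward.

    from legged_gym.envs.base.legged_robot import LeggedRobot
    from legged_gym.envs.base.legged_robot_config import LeggedRobotCfg

    class MyRobotCfg(LeggedRobotCfg):                      # placeholder name
        class rewards(LeggedRobotCfg.rewards):
            class scales(LeggedRobotCfg.rewards.scales):
                feet_air_time = 1.0     # non-zero: _reward_feet_air_time() is summed into the total reward
                torques = -0.00001      # negative scale: acts as a penalty
                lin_vel_z = 0.0         # zero: this term is skipped entirely

    class MyRobot(LeggedRobot):                            # placeholder name
        def _reward_feet_air_time(self):
            # return a tensor of shape (num_envs,); the framework multiplies it by the scale
            ...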

Usage

  1. Train:
    python isaacgym_anymal/scripts/train.py --task=anymal_c_flat
    • To run on CPU add the following arguments: --sim_device=cpu, --rl_device=cpu (sim on CPU and rl on GPU is possible).
    • To run headless (no rendering) add --headless.
    • Important: To improve performance, once the training starts press 'v' to stop the rendering. You can later enable it again to check the progress.
    • The trained policy is saved in isaacgym_anymal/logs/<experiment_name>/<date_time>_<run_name>/model_<iteration>.pt, where <experiment_name> and <run_name> are defined in the train config.
    • The following command line arguments override the values set in the config files (see the example after this list):
    • --task TASK: Task name.
    • --resume: Resume training from a checkpoint
    • --experiment_name EXPERIMENT_NAME: Name of the experiment to run or load.
    • --run_name RUN_NAME: Name of the run.
    • --load_run LOAD_RUN: Name of the run to load when resume=True. If -1: will load the last run.
    • --checkpoint CHECKPOINT: Saved model checkpoint number. If -1: will load the last checkpoint.
    • --num_envs NUM_ENVS: Number of environments to create.
    • --seed SEED: Random seed.
    • --max_iterations MAX_ITERATIONS: Maximum number of training iterations.
  2. Play a trained policy:
    python isaacgym_anymal/scripts/play.py --task=anymal_c_flat
    • By default the loaded policy is the last model of the last run of the experiment folder.
    • Other runs/model iterations can be selected by setting load_run and checkpoint in the train config.
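
For example, to resume training of the most recent run of an experiment without rendering, the documented flags can be combined as follows (a hedged example using only the arguments listed above):

    python isaacgym_anymal/scripts/train.py --task=anymal_c_flat --resume --load_run -1 --checkpoint -1 --headless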

Adding a new environment

The base environment legged_robot implements a rough terrain locomotion task. The corresponding cfg does not specify a robot asset (URDF/MJCF) and has no reward scales.

  1. Add a new folder to envs/ with <your_env>_config.py, which inherits from an existing environment cfg.
  2. If adding a new robot:
    • Add the corresponding assets to resources/.
    • In cfg set the asset path, define body names, default_joint_positions and PD gains. Specify the desired train_cfg and the name of the environment (python class).
    • In train_cfg set experiment_name and run_name
  3. (If needed) implement your environment in <your_env>.py, inherit from an existing environment, overwrite the desired functions and/or add your reward functions.
  4. Register your env in isaacgym_anymal/envs/__init__.py (see the sketch after this list).
  5. Modify/tune other parameters in your cfg and cfg_train as needed. To remove a reward, set its scale to zero. Do not modify parameters of other envs!
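
Putting steps 1-4 together, a minimal sketch (my_robot, MyRobotCfg, MyRobotCfgPPO, the joint names and the asset path are placeholders; the import paths assume the default repo layout and may differ in your version):

    # envs/my_robot/my_robot_config.py
    from legged_gym.envs.base.legged_robot_config import LeggedRobotCfg, LeggedRobotCfgPPO

    class MyRobotCfg(LeggedRobotCfg):
        class asset(LeggedRobotCfg.asset):
            file = '{LEGGED_GYM_ROOT_DIR}/resources/robots/my_robot/urdf/my_robot.urdf'  # placeholder path
            foot_name = 'FOOT'                                       # body name used for feet contacts

        class init_state(LeggedRobotCfg.init_state):
            default_joint_angles = {'joint_a': 0.0, 'joint_b': 0.7}  # placeholder joint names/angles

        class control(LeggedRobotCfg.control):
            stiffness = {'joint': 80.0}                              # PD gains (placeholder values)
            damping = {'joint': 2.0}

    class MyRobotCfgPPO(LeggedRobotCfgPPO):
        class runner(LeggedRobotCfgPPO.runner):
            experiment_name = 'my_robot'
            run_name = ''

    # envs/__init__.py: register the task so train.py / play.py can find it by name
    from legged_gym.envs.base.legged_robot import LeggedRobot
    from legged_gym.utils.task_registry import task_registry
    from .my_robot.my_robot_config import MyRobotCfg, MyRobotCfgPPO

    task_registry.register('my_robot', LeggedRobot, MyRobotCfg(), MyRobotCfgPPO())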

Troubleshooting

  1. If you get the following error: ImportError: libpython3.8m.so.1.0: cannot open shared object file: No such file or directory, do: sudo apt install libpython3.8

Known Issues

  1. The contact forces reported by net_contact_force_tensor are unreliable when simulating on GPU with a triangle mesh terrain. A workaround is to use force sensors, but the forces are propagated through the sensors of consecutive bodies, resulting in undesirable behaviour. However, for a legged robot it is possible to add sensors to the feet/end effector only and get the expected results. When using the force sensors make sure to exclude gravity from the reported forces with sensor_options.enable_forward_dynamics_forces. Example:
    sensor_pose = gymapi.Transform()
    for name in feet_names:
        sensor_options = gymapi.ForceSensorProperties()
        sensor_options.enable_forward_dynamics_forces = False # for example gravity
        sensor_options.enable_constraint_solver_forces = True # for example contacts
        sensor_options.use_world_frame = True # report forces in world frame (easier to get vertical components)
        index = self.gym.find_asset_rigid_body_index(robot_asset, name)  # rigid body index of this foot in the asset
        self.gym.create_asset_force_sensor(robot_asset, index, sensor_pose, sensor_options)  # attach a force sensor to that body
    (...)

    sensor_tensor = self.gym.acquire_force_sensor_tensor(self.sim)
    self.gym.refresh_force_sensor_tensor(self.sim)
    force_sensor_readings = gymtorch.wrap_tensor(sensor_tensor)
    self.sensor_forces = force_sensor_readings.view(self.num_envs, 4, 6)[..., :3]  # 4 feet sensors, keep the force part of the 6D wrench
    (...)

    self.gym.refresh_force_sensor_tensor(self.sim)
    contact = self.sensor_forces[:, :, 2] > 1.  # contact if the vertical force exceeds 1 N