Isaac Gym Environments for Legged Robots

Overview

This repository provides the environment used to train ANYmal (and other robots) to walk on rough terrain using NVIDIA's Isaac Gym. It includes all components needed for sim-to-real transfer: actuator network, friction & mass randomization, noisy observations and random pushes during training.
Maintainer: Nikita Rudin
Affiliation: Robotic Systems Lab, ETH Zurich
Contact: [email protected]

Useful Links

Project website: https://leggedrobotics.github.io/legged_gym/
Paper: https://arxiv.org/abs/2109.11978

Installation

  1. Create a new Python virtual environment with Python 3.6, 3.7 or 3.8 (3.8 recommended)
  2. Install PyTorch 1.10 with CUDA 11.3:
    • pip3 install torch==1.10.0+cu113 torchvision==0.11.1+cu113 torchaudio==0.10.0+cu113 -f https://download.pytorch.org/whl/cu113/torch_stable.html
  3. Install Isaac Gym
    • Download and install Isaac Gym Preview 3 (Preview 2 will not work!) from https://developer.nvidia.com/isaac-gym
    • cd isaacgym/python && pip install -e .
    • Try running an example: python examples/1080_balls_of_solitude.py
    • For troubleshooting, check the docs at isaacgym/docs/index.html
  4. Install rsl_rl (PPO implementation)
  5. Install legged_gym
    • Clone this repository
    • cd legged_gym && git checkout develop && pip install -e .

Code Structure

  1. Each environment is defined by an env file (legged_robot.py) and a config file (legged_robot_config.py). The config file contains two classes: one containing all the environment parameters (LeggedRobotCfg) and one with the training parameters (LeggedRobotCfgPPO).
  2. Both env and config classes use inheritance.
  3. Each non-zero reward scale specified in cfg will add a function with a corresponding name to the list of elements which will be summed to get the total reward.
  4. Tasks must be registered using task_registry.register(name, EnvClass, EnvConfig, TrainConfig). This is done in envs/__init__.py, but can also be done from outside of this repository.
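
A minimal sketch of how points 3 and 4 fit together. Only the task_registry.register() signature and the reward-scale mechanism are stated above; the import paths are assumptions about the package layout, and the robot name, the foo reward and all MyRobot* names are hypothetical:

    # Hypothetical example: everything except task_registry.register() and the
    # reward-scale -> _reward_<name> convention is illustrative.
    import torch
    from legged_gym.envs.base.legged_robot import LeggedRobot
    from legged_gym.envs.base.legged_robot_config import LeggedRobotCfg, LeggedRobotCfgPPO
    from legged_gym.utils.task_registry import task_registry

    class MyRobotCfg(LeggedRobotCfg):
        class rewards(LeggedRobotCfg.rewards):
            class scales(LeggedRobotCfg.rewards.scales):
                foo = 0.25  # non-zero scale -> _reward_foo() is summed into the total reward

    class MyRobotCfgPPO(LeggedRobotCfgPPO):
        pass

    class MyRobot(LeggedRobot):
        def _reward_foo(self):
            # one value per environment; multiplied by the scale above before summation
            return torch.zeros(self.num_envs, device=self.device)

    task_registry.register("my_robot", MyRobot, MyRobotCfg(), MyRobotCfgPPO())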

Usage

  1. Train:
    python legged_gym/scripts/train.py --task=anymal_c_flat
    • To run on CPU add the following arguments: --sim_device=cpu, --rl_device=cpu (sim on CPU and rl on GPU is possible).
    • To run headless (no rendering) add --headless.
    • Important: To improve performance, once the training starts press v to stop the rendering. You can then enable it later to check the progress.
    • The trained policy is saved in legged_gym/logs/<experiment_name>/<date_time>_<run_name>/model_<iteration>.pt, where <experiment_name> and <run_name> are defined in the train config.
    • The following command line arguments override the values set in the config files:
    • --task TASK: Task name.
    • --resume: Resume training from a checkpoint
    • --experiment_name EXPERIMENT_NAME: Name of the experiment to run or load.
    • --run_name RUN_NAME: Name of the run.
    • --load_run LOAD_RUN: Name of the run to load when resume=True. If -1: will load the last run.
    • --checkpoint CHECKPOINT: Saved model checkpoint number. If -1: will load the last checkpoint.
    • --num_envs NUM_ENVS: Number of environments to create.
    • --seed SEED: Random seed.
    • --max_iterations MAX_ITERATIONS: Maximum number of training iterations.
  2. Play a trained policy:
    python legged_gym/scripts/play.py --task=anymal_c_flat
    • By default the loaded policy is the last model of the last run of the experiment folder.
    • Other runs/model iterations can be selected by setting load_run and checkpoint in the train config.
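
For example, the command-line overrides can be combined into a single training invocation; the flag values below are illustrative and all flags are described above:

    python legged_gym/scripts/train.py --task=anymal_c_flat --headless --num_envs=4096 --max_iterations=1500 --seed=1 --run_name=baseline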

Adding a new environment

The base environment legged_robot implements a rough terrain locomotion task. The corresponding cfg does not specify a robot asset (URDF/MJCF) and has no reward scales.

  1. Add a new folder to envs/ with <your_env>_config.py, which inherits from an existing environment config.
  2. If adding a new robot:
    • Add the corresponding assets to resources/.
    • In cfg set the asset path, define body names, default_joint_positions and PD gains. Specify the desired train_cfg and the name of the environment (python class).
    • In train_cfg set experiment_name and run_name
  3. (If needed) implement your environment in <your_env>.py, inherit from an existing environment, overwrite the desired functions and/or add your reward functions.
  4. Register your env in legged_gym/envs/__init__.py.
  5. Modify/Tune other parameters in your cfg, cfg_train as needed. To remove a reward set its scale to zero. Do not modify parameters of other envs!
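
A condensed sketch of steps 1, 2 and 4. The attribute names follow the base config as I recall it (check legged_robot_config.py); the robot name, file paths, joint names and gain values are placeholders:

    # Hypothetical config for a new robot; names and values are illustrative only.
    from legged_gym.envs.base.legged_robot_config import LeggedRobotCfg, LeggedRobotCfgPPO

    class MyRobotRoughCfg(LeggedRobotCfg):
        class asset(LeggedRobotCfg.asset):
            file = '{LEGGED_GYM_ROOT_DIR}/resources/robots/my_robot/urdf/my_robot.urdf'
            foot_name = "FOOT"                      # body-name substring used to find the feet
            terminate_after_contacts_on = ["base"]  # bodies whose contacts end the episode

        class init_state(LeggedRobotCfg.init_state):
            default_joint_angles = {"HAA": 0.0, "HFE": 0.4, "KFE": -0.8}  # default joint positions

        class control(LeggedRobotCfg.control):
            stiffness = {"HAA": 80.0, "HFE": 80.0, "KFE": 80.0}  # P gains
            damping = {"HAA": 2.0, "HFE": 2.0, "KFE": 2.0}       # D gains

    class MyRobotRoughCfgPPO(LeggedRobotCfgPPO):
        class runner(LeggedRobotCfgPPO.runner):
            experiment_name = "my_robot_rough"
            run_name = "first_try"

    # then register the task in legged_gym/envs/__init__.py, e.g.:
    # task_registry.register("my_robot_rough", LeggedRobot, MyRobotRoughCfg(), MyRobotRoughCfgPPO())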

Troubleshooting

  1. If you get the following error: ImportError: libpython3.8m.so.1.0: cannot open shared object file: No such file or directory, do: sudo apt install libpython3.8

Known Issues

  1. The contact forces reported by net_contact_force_tensor are unreliable when simulating on GPU with a triangle mesh terrain. A workaround is to use force sensors, but the forces are propagated through the sensors of consecutive bodies, resulting in undesirable behaviour. However, for a legged robot it is possible to add sensors to the feet/end effector only and get the expected results. When using the force sensors, make sure to exclude gravity from the reported forces with sensor_options.enable_forward_dynamics_forces. Example:
    sensor_pose = gymapi.Transform()
    for name in feet_names:
        sensor_options = gymapi.ForceSensorProperties()
        sensor_options.enable_forward_dynamics_forces = False # for example gravity
        sensor_options.enable_constraint_solver_forces = True # for example contacts
        sensor_options.use_world_frame = True # report forces in world frame (easier to get vertical components)
        index = self.gym.find_asset_rigid_body_index(robot_asset, name)
        self.gym.create_asset_force_sensor(robot_asset, index, sensor_pose, sensor_options)
    (...)

    sensor_tensor = self.gym.acquire_force_sensor_tensor(self.sim)
    self.gym.refresh_force_sensor_tensor(self.sim)
    force_sensor_readings = gymtorch.wrap_tensor(sensor_tensor)
    self.sensor_forces = force_sensor_readings.view(self.num_envs, 4, 6)[..., :3]  # 4 foot sensors, keep only the force part of each 6D wrench
    (...)

    self.gym.refresh_force_sensor_tensor(self.sim)
    contact = self.sensor_forces[:, :, 2] > 1.  # vertical force above 1 N counts as contact