
RoboKeeper!

Jonny Bosnich, Joshua Cho, Lio Liang, Marco Morales, Cody Nichoson


[GIF: RoboKeeper being a boss]

Demonstration Videos

Equipment

Hardware:
  • HDT Global Adroit Manipulator Arm
  • Intel RealSense Camera
Software:
  • Robot Operating System (ROS)
  • MoveIt!
  • OpenCV
  • AprilTag

Quickstart Guide

  1. Install ROS Noetic on Ubuntu 20.04
  2. Create catkin workspace
    $ source /opt/ros/noetic/setup.bash
    $ mkdir -p ~/catkin_ws/src
    $ cd ~/catkin_ws/
    $ catkin_make
    
  3. Copy this repository into src folder
    $ cd ~/catkin_ws/src
    $ git clone [email protected]:ME495-EmbeddedSystems/final-project-robokeeper.git
    
  4. Install required packages and build
    $ cd ~/catkin_ws
    $ rosdep install --from-paths src --ignore-src -r -y
    $ catkin_make
    $ source devel/setup.bash
    

Running the package

  1. First, run the main launchfile. To operate the real robot, run the command below.

    roslaunch robokeeper robokeeper_go.launch
    
  2. If using a simulation, add the sim:=true argument when running the main launchfile.

    roslaunch robokeeper robokeeper_go.launch sim:=true
    
  3. The robot now needs to pick up the paddle, which is done with two services. First, call the above_paddle service.

    rosservice call /above_paddle
    
  4. Next, call the retrieve_paddle service.

    rosservice call /retrieve_paddle
    
  5. Call the reset service to move the robot in front of the goal.

    rosservice call /reset
    
  6. Call start_keeping to enable the goal keeping component of the project.

    rosservice call /start_keeping
    
  7. When finished, call the stop_keeping service.

    rosservice call /stop_keeping 
    

Launchfiles

robokeeper_go.launch

This is the main launchfile used to operate RoboKeeper. It starts by launching robokeeper_moveit.launch, which loads the necessary URDF file and hardware configuration as well as the main MoveIt! executable. It then launches intel_cam.launch, which starts the Intel RealSense camera. It also starts a transforms node that handles the transformations between the various frames in the world. Finally, the launchfile starts a motion_control node that publishes appropriate joint state messages to actuate the arm.

robokeeper_moveit.launch

This launchfile loads the robot description for the Adroit 6-DOF manipulator arm, as well as its hardware and controller configuration, from the hdt_6dof_a24_pincer_description package. It also includes move_group.launch from the hdt_6dof_a24_pincer_moveit package, which starts the move group that MoveIt! uses to plan the motion of the arm.

intel_cam.launch

This launchfile starts the Intel Realsense camera by launching rs_camera.launch from the realsense2_camera package. It then launches AprilTag_detection.launch for AprilTag integration.

AprilTag_detection.launch

This launchfile loads parameters necessary for integrating AprilTag detection, which is crucial for detecting the position of the robot relative to the camera. It starts apriltag_ros_continuous_node from the apriltag_ros package.
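For reference, the snippet below is a minimal sketch of how a node can read these detections. The /tag_detections topic and the AprilTagDetectionArray message are apriltag_ros defaults; the node name is made up for illustration, and this snippet is not part of the RoboKeeper source.

    #!/usr/bin/env python
    # Minimal sketch: listen to apriltag_ros detections (not part of this package).
    import rospy
    from apriltag_ros.msg import AprilTagDetectionArray

    def tag_cb(msg):
        for det in msg.detections:
            # det.pose is a PoseWithCovarianceStamped expressed in the camera frame
            p = det.pose.pose.pose.position
            rospy.loginfo("tag %s at (%.3f, %.3f, %.3f)", det.id, p.x, p.y, p.z)

    rospy.init_node("tag_listener")
    rospy.Subscriber("/tag_detections", AprilTagDetectionArray, tag_cb)
    rospy.spin()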

Nodes

perception

The perception node handles the data collected from the Intel RealSense camera, which is used to identify and locate the objects the robot is tasked with blocking. It contains a CV bridge to enable OpenCV integration with ROS, subscribes to the RealSense's camera data, and ultimately publishes the 3-dimensional coordinates of the centroid of the object of interest (a red ball for our purposes).

To identify the ball, video frames are iteratively thresholded for a range of HSV values that closely match those of the ball. Once the area of interest is located, a contour is created around its edges and the centroid of the contour is computed. This centroid can then be treated as the location of the ball in the camera frame and published appropriately.
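The snippet below is a rough sketch of that thresholding pipeline, written against OpenCV 4 as shipped with ROS Noetic. The HSV bounds, the image topic, and the node name are placeholders rather than the values used by the actual perception node, and it stops at the 2D pixel centroid (the real node goes on to produce 3D coordinates).

    #!/usr/bin/env python
    # Sketch of HSV thresholding + contour centroid (illustrative values only).
    import rospy
    import cv2
    import numpy as np
    from cv_bridge import CvBridge
    from sensor_msgs.msg import Image

    bridge = CvBridge()

    def image_cb(msg):
        frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        # Keep only pixels whose HSV values fall in a red-ish range (example bounds)
        mask = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255]))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return
        m = cv2.moments(max(contours, key=cv2.contourArea))
        if m["m00"] == 0:
            return
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        rospy.loginfo("ball centroid in pixels: (%.1f, %.1f)", cx, cy)

    rospy.init_node("perception_sketch")
    rospy.Subscriber("/camera/color/image_raw", Image, image_cb)
    rospy.spin()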

transforms

Knowing where the ball is relative to the camera is great, but it doesn't help the robot locate the ball. In order to accomplish this, transformations between the camera frame and the robot frame are necessary. This node subscribes to both the ball coordinates from the perception node and AprilTag detections, and publishes the transformed ball coordinates in the robot frame.

To relate the two frames, an AprilTag with a known transformation to the robot's base link (positioned on the floor next to the robot) was used. Using the RealSense, the transformation between the camera frame and the AprilTag can also be determined. Combining these relationships yields the transformation from coordinates in the camera frame to coordinates in the robot frame.
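A condensed sketch of that conversion using tf2 is shown below. The frame names, topic names, and message type are assumptions for illustration; the actual transforms node may organize this differently.

    #!/usr/bin/env python
    # Sketch: re-express a camera-frame point in the robot's base frame with tf2.
    import rospy
    import tf2_ros
    import tf2_geometry_msgs  # registers PointStamped with tf2
    from geometry_msgs.msg import PointStamped

    rospy.init_node("transforms_sketch")
    tf_buffer = tf2_ros.Buffer()
    listener = tf2_ros.TransformListener(tf_buffer)
    pub = rospy.Publisher("ball_in_robot_frame", PointStamped, queue_size=1)

    def ball_cb(pt_camera):
        try:
            # Transform the camera-frame point into the robot's base frame
            pt_robot = tf_buffer.transform(pt_camera, "base_link", rospy.Duration(0.1))
            pub.publish(pt_robot)
        except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
                tf2_ros.ExtrapolationException):
            pass

    rospy.Subscriber("ball_coordinates", PointStamped, ball_cb)
    rospy.spin()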

motion_control

This node provides the core functionality of RoboKeeper. It subscribes to the topic containing the ball coordinates in the robot frame and offers a number of services used to interact with the environment in several ways.

The main service used is /start_keeping. As the name suggests, this service allows the robot to begin interpreting the ball coordinates and attempting to intercept the ball at the goal line. Appropriate joint trajectory commands are sent to the robot through a mix of MoveIt! and direct joint publishing (depending on the service called) in order to accomplish the task. This node also keeps track of goals scored by determining whether the ball has entered the net.
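The sketch below condenses that start/stop pattern. The service and topic names mirror this README, but the Empty service type, the MoveIt! group name ("arm"), and the goal-keeping strategy are simplifications for illustration, not the package's actual implementation.

    #!/usr/bin/env python
    # Sketch of the start/stop keeping pattern (simplified; names are assumptions).
    import sys
    import rospy
    import moveit_commander
    from std_srvs.srv import Empty, EmptyResponse
    from geometry_msgs.msg import PointStamped

    keeping = False

    def start_cb(req):
        global keeping
        keeping = True
        return EmptyResponse()

    def stop_cb(req):
        global keeping
        keeping = False
        return EmptyResponse()

    def ball_cb(pt):
        if not keeping:
            return
        # Slide the paddle along the goal line to the ball's y-coordinate
        pose = group.get_current_pose().pose
        pose.position.y = pt.point.y
        group.set_pose_target(pose)
        group.go(wait=False)

    rospy.init_node("motion_control_sketch")
    moveit_commander.roscpp_initialize(sys.argv)
    group = moveit_commander.MoveGroupCommander("arm")  # group name is a placeholder
    rospy.Service("start_keeping", Empty, start_cb)
    rospy.Service("stop_keeping", Empty, stop_cb)
    rospy.Subscriber("ball_in_robot_frame", PointStamped, ball_cb)
    rospy.spin()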

Services

  1. The reset service moves the Adroit arm directly in front of its base and the goal.

    rosservice call /reset
    
  2. The keep service moves the robotic arm to a pose that depends only on a y-value. An example of calling the service follows; a Python sketch of calling it programmatically appears after this list.

    rosservice call /keep "pos: 0.0"
    
  3. above_paddle is a service that moves the arm directly above the paddle holster, positioning it for consistent retrieval.

    rosservice call /above_paddle
    
  4. To retrieve the paddle, the retrieve_paddle service can be called. It moves the arm to a position where it can grip the paddle, then closes the gripper, and finally moves back to the same position as above_paddle.

    rosservice call /retrieve_paddle
    
  5. The start_keeping service enables the robot to block the red ball from entering the goal.

    rosservice call /start_keeping
    
  6. To stop the robot from moving and tracking the ball, call the stop_keeping service.

    rosservice call /stop_keeping 
    
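The services can also be called programmatically. The sketch below sweeps the paddle across the goal by calling /keep from Python; the service type (a Keep srv with a single float64 pos field, imported from the robokeeper package) is an assumption based on the call shown above.

    #!/usr/bin/env python
    # Sweep the paddle by calling /keep repeatedly (service type is assumed).
    import rospy
    from robokeeper.srv import Keep  # assumed srv definition with one float64 "pos" field

    rospy.init_node("keep_sweep")
    rospy.wait_for_service("/keep")
    keep = rospy.ServiceProxy("/keep", Keep)
    for y in (-0.2, 0.0, 0.2):
        keep(pos=y)  # move the paddle to this y-position along the goal line
        rospy.sleep(1.0)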

Additional Notes

There are some features within this code that were partially developed but not completed due to time constraints. Because of this, you may notice certain things in the source code that are not mentioned here.

An example of this is the scoreboard feature. The original plan was to include both a goal counter and a block counter when playing with the robot and to display these stats to the user in order to create a game. The goal counter was successfully created, but we didn't have time to complete the block counter. The goal counter is located within the 'motion_control' node, and the infrastructure for displaying the actual scoreboard using the 'tkinter' library is located in a node called 'scorekeeper'.
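For context, a minimal sketch of the kind of tkinter scoreboard the scorekeeper node was meant to display is shown below; the goal_count topic and Int32 message are assumptions, not the actual interface between the nodes.

    #!/usr/bin/env python
    # Sketch of a tkinter scoreboard fed by a ROS topic (topic/type are assumed).
    import rospy
    import tkinter as tk
    from std_msgs.msg import Int32

    goals = [0]

    def goals_cb(msg):
        goals[0] = msg.data

    rospy.init_node("scorekeeper_sketch")
    rospy.Subscriber("goal_count", Int32, goals_cb)

    root = tk.Tk()
    root.title("RoboKeeper Scoreboard")
    label = tk.Label(root, text="Goals: 0", font=("Helvetica", 48))
    label.pack()

    # Pump the Tk event loop alongside ROS callbacks
    while not rospy.is_shutdown():
        label.config(text="Goals: %d" % goals[0])
        root.update()
        rospy.sleep(0.05)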
