Introduction to the Robot Operating System (ROS)
by Gazza
It has been a minute since I submitted an article, and for that I apologize. For reference, my last submission was published in Volume Thirty-Four, Number Four. I was inspired to write this after reading about Amazon's Astro robot [1].
In short, with my interest in robotics, I wanted to see how many of Astro's features I could replicate with open-source software on my new Create 3 robot [2].
Personally, I have been working in the field of robotics since 2016. I work mainly with the Robot Operating System (ROS) [3]. ROS had its start at a company called Willow Garage, with its first official distribution shipping in 2010. Today, ROS is managed by the Open Source Robotics Foundation (OSRF), now known as Open Robotics [4].
ROS releases are typically tied to a specific Ubuntu distribution and are named alphabetically. Thus, when I started playing with ROS in 2016, I was using Kinetic Kame, which ran on Ubuntu 16.04 (Xenial). Note that the "OS" portion of ROS is a bit of a misnomer if you are a Linux distro hopper like myself: ROS is more akin to a collection of Linux packages than an OS.
Presently, the latest (and last) release of ROS 1 is called Noetic Ninjemys and is supported until May of 2025. Noetic runs on the Ubuntu 20.04 (Focal) release. It can also be installed on Debian, while Windows 10 and Arch Linux installations are marked as experimental.
Note that ROS 2 Ardent Apalone, a complete rewrite of ROS 1, was released in late 2017. At the time of writing, there are three ROS 2 releases: Rolling Ridley, which, as its name implies, is a rolling release; Iron Irwini, which has an EOL of November 2024; and Humble Hawksbill, which has an EOL of May 2027.
The newest release, Jazzy, will be out by the time this article publishes. Note that ROS releases typically alternate between an LTS version and a shorter-lived intermediate version, similar to the Ubuntu cycle. As a personal preference, I typically stick to the LTS releases. Also, I am still in the middle of making the switch to ROS 2, so I feel more comfortable continuing to focus on ROS 1 for this article.
My very first robot was a TurtleBot 2 (think Roomba) equipped with a Kinect for Xbox 360. I make the distinction between the Kinect for Xbox 360 and the Kinect for Xbox One because they used different technologies. The 360 version used structured light.
In short, structured light projects a bunch of infrared dots in a specific pattern. Deviations in the pattern are how the Kinect detects the user and calculates how far away they are from the camera. The typical range of a Kinect sensor is six to eight meters.
Note that the Xbox One version used time of flight to detect the user. I chose the Kinect since you could pick one up for around $200 USD. While a 360-degree 3D LiDAR would have been a better choice, in 2016 the cost of a Velodyne VLP-16 was well over $10,000 USD (presently you can pick up a 16-beam 3D LiDAR for around $3,500 USD). Also for reference, a 360-degree 2D LiDAR is now around $300 USD.
On the TurtleBot 2, I used the Kinect for obstacle avoidance as well as to map the environment; more on that later. Note that the structured light version of the Kinect would not work outdoors, but the time of flight version did a better job in sunlight. On my new Create 3 robot, I use either an Intel RealSense D435 or an OAK-D camera from Luxonis.
Presently, I am primarily interested in navigation and autonomy, so that is the first of Astro's features I would like to replicate. There are many different levels of autonomy, from Level 1 (think of a remote-controlled car) to Level 8 (what you see in movies). These levels are based on a paper published by Bostelman and Messina [5].
Let's start with Level 1.
In ROS, you can use the teleop_twist_keyboard [6] or teleop_twist_joy [7] packages to drive the robot around with either a keyboard or a joystick, respectively. Both of these packages publish velocity commands on the cmd_vel topic.
At its core, ROS is a messaging system in which nodes publish and subscribe to topics. In this case, the messages published on the cmd_vel topic are what get the robot to move.
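To make that concrete, here is a minimal sketch of a node that publishes cmd_vel itself, assuming the base driver subscribes to cmd_vel as geometry_msgs/Twist (as the TurtleBot's does). It simply creeps the robot forward:

    #!/usr/bin/env python
    # Minimal cmd_vel publisher sketch (ROS 1, rospy). Assumes the
    # base driver subscribes to cmd_vel as geometry_msgs/Twist.
    import rospy
    from geometry_msgs.msg import Twist

    rospy.init_node('drive_forward')
    pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
    rate = rospy.Rate(10)  # send velocity commands at 10 Hz

    msg = Twist()
    msg.linear.x = 0.1   # creep forward at 0.1 m/s
    msg.angular.z = 0.0  # no rotation

    while not rospy.is_shutdown():
        pub.publish(msg)
        rate.sleep()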
Note, however, that with teleop_twist_keyboard or teleop_twist_joy the robot will behave like a remote-controlled car and run into things if you're not careful. To get to autonomy Level 6, we need to use a ROS package called move_base [8].
The move_base package subscribes to the move_base_simple/goal topic and publishes to the cmd_vel topic. How do we publish a goal on the move_base_simple/goal topic?
In short, we use a tool called RViz, which allows you to visualize ROS topics in a 3D environment [9].
For our intended use case, RViz has a button called 2D Nav Goal that we can use to set a waypoint, published as a move_base_simple/goal message, on a map (you can also publish the goal from code, as sketched below). Where does the map come from? One way is to use Simultaneous Localization And Mapping (SLAM) to create one as you explore.
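Here is a sketch of publishing a goal by hand, assuming move_base is already running and the fixed frame is named map, per the ROS 1 convention:

    #!/usr/bin/env python
    # Sends a navigation goal without clicking in RViz.
    # Assumes move_base is running and the fixed frame is "map".
    import rospy
    from geometry_msgs.msg import PoseStamped

    rospy.init_node('send_goal')
    pub = rospy.Publisher('move_base_simple/goal', PoseStamped, queue_size=1)
    rospy.sleep(1.0)  # give the connection a moment to come up

    goal = PoseStamped()
    goal.header.frame_id = 'map'
    goal.header.stamp = rospy.Time.now()
    goal.pose.position.x = 2.0     # two meters along the map's x axis
    goal.pose.orientation.w = 1.0  # identity quaternion: face along +x
    pub.publish(goal)

For anything beyond a quick test, move_base's actionlib interface is the better fit, since it reports back whether the goal succeeded. Either way, move_base needs a map to plan on, which brings us back to SLAM.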
Making a map is the second of Astro's features I want to replicate.
Note that these maps are typically 2D occupancy grids with three values: NO_INFORMATION, FREE_SPACE, and LETHAL_OBSTACLE.
A common SLAM package is gmapping [10]. The gmapping package subscribes to a scan topic and uses the data to publish a map topic. The move_base package can also subscribe to this map topic and builds two costmaps: a global_costmap that is typically static, and a rolling local_costmap that moves with the robot.
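You can watch the map grow as the robot explores. In the nav_msgs/OccupancyGrid message that carries the map topic, the three values above show up as -1 (unknown), 0 (free), and 100 (occupied). A small sketch that tallies them:

    #!/usr/bin/env python
    # Subscribes to the map published by gmapping and counts cell types.
    # In nav_msgs/OccupancyGrid: -1 = unknown, 0 = free, 100 = occupied.
    import rospy
    from nav_msgs.msg import OccupancyGrid

    def on_map(msg):
        unknown = sum(1 for c in msg.data if c == -1)
        free = sum(1 for c in msg.data if c == 0)
        occupied = sum(1 for c in msg.data if c == 100)
        rospy.loginfo('map %dx%d: %d unknown, %d free, %d occupied',
                      msg.info.width, msg.info.height, unknown, free, occupied)

    rospy.init_node('map_stats')
    rospy.Subscriber('map', OccupancyGrid, on_map)
    rospy.spin()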
When you drop a waypoint with the 2D Nav Goal button in RViz, move_base uses a global_planner (the default is NavfnROS) on the global_costmap to plan a path for the robot to reach the waypoint.
Simultaneously, move_base also uses the local_costmap and a local_planner (the default is TrajectoryPlannerROS) to avoid any obstacles encountered along the global path.
Obstacles are detected in the obstacle_layer of move_base, which typically uses either LaserScan or PointCloud2 messages to identify them.
Note that the Kinect, D435, and OAK-D are all capable of generating a PointCloud2 topic. The move_base package also has an inflation_layer that can be used to add padding around the walls in the map and around any obstacles the sensors identify, to keep the robot from running into things.
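If you want to sanity-check what the camera is feeding the obstacle_layer, you can read the cloud directly. A sketch, assuming the depth topic is camera/depth/points (check rostopic list for the name your driver actually uses) and the usual optical-frame convention where z points forward:

    #!/usr/bin/env python
    # Reports the nearest point straight ahead of a depth camera.
    # The topic name is an assumption; optical frames put z forward.
    import rospy
    import sensor_msgs.point_cloud2 as pc2
    from sensor_msgs.msg import PointCloud2

    def on_cloud(msg):
        nearest = None
        for x, y, z in pc2.read_points(msg, field_names=('x', 'y', 'z'),
                                       skip_nans=True):
            if abs(x) < 0.2 and z > 0.0:  # narrow band in front of the lens
                if nearest is None or z < nearest:
                    nearest = z
        if nearest is not None:
            rospy.loginfo('nearest obstacle ahead: %.2f m', nearest)

    rospy.init_node('cloud_watch')
    rospy.Subscriber('camera/depth/points', PointCloud2, on_cloud)
    rospy.spin()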
Setting waypoints is typically how I operate the robot, but we can take it a step further. We can also let the robot explore on its own using explore_lite [11], getting us to Level 7. The explore_lite package uses frontiers to set its own waypoints. A frontier is defined as the boundary between NO_INFORMATION and FREE_SPACE on the occupancy grid. The longer the frontier, the higher its priority, and exploration continues until all frontiers are cleared.
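The frontier idea itself is simple enough to sketch: a frontier cell is a free cell with at least one unknown neighbor. This toy version just counts them on each incoming map (explore_lite additionally groups frontier cells and weighs them by length):

    #!/usr/bin/env python
    # Toy frontier detector: counts free cells that border unknown cells.
    # np.roll wraps at the map border, which is fine for a rough count.
    import rospy
    import numpy as np
    from nav_msgs.msg import OccupancyGrid

    def on_map(msg):
        grid = np.array(msg.data).reshape(msg.info.height, msg.info.width)
        free = grid == 0
        unknown = grid == -1
        frontier = free & (np.roll(unknown, 1, 0) | np.roll(unknown, -1, 0) |
                           np.roll(unknown, 1, 1) | np.roll(unknown, -1, 1))
        rospy.loginfo('%d frontier cells remain', int(frontier.sum()))

    rospy.init_node('frontier_count')
    rospy.Subscriber('map', OccupancyGrid, on_map)
    rospy.spin()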
The ability to patrol is another of Astro's features I want to replicate. We could do this with a series of waypoints, but I would have to ensure that the waypoints were clear of obstacles at all times. The advantage of using explore_lite over fixed waypoints for patrolling is that I can move my furniture whenever and wherever I want. If you want your robot to auto-dock when it's low on battery, then we can use a ROS package called apriltag_ros [12].
The assumption is that we have an AprilTag mounted on the docking station.
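A sketch of the docking trigger, assuming apriltag_ros is running with its default tag_detections topic and that we know which tag ID is taped to the dock:

    #!/usr/bin/env python
    # Watches apriltag_ros detections for the dock's tag. The topic
    # name and DOCK_TAG_ID are assumptions; adjust for your setup.
    import rospy
    from apriltag_ros.msg import AprilTagDetectionArray

    DOCK_TAG_ID = 0  # whichever ID you printed and stuck on the dock

    def on_detections(msg):
        for det in msg.detections:
            if DOCK_TAG_ID in det.id:
                p = det.pose.pose.pose.position  # tag pose in the camera frame
                rospy.loginfo('dock tag at x=%.2f y=%.2f z=%.2f', p.x, p.y, p.z)
                # from here, publish a move_base goal in front of the tag

    rospy.init_node('dock_watch')
    rospy.Subscriber('tag_detections', AprilTagDetectionArray, on_detections)
    rospy.spin()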
If we want to have the robot recognize objects, then we could use the ROS package find_object_2d [13]. However, we would need a picture of the object the robot is looking for. Note that there are definitely more robust approaches to object recognition, such as YOLO, that may be a better fit for your use case.
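Listening for a match is straightforward. Per the find_object_2d documentation, detections arrive on an objects topic as a flat Float32MultiArray, twelve floats per object (id, template width and height, then a 3x3 homography); treat that layout as an assumption and verify it against your version:

    #!/usr/bin/env python
    # Logs objects reported by find_object_2d. The 12-floats-per-object
    # layout (id, width, height, 3x3 homography) is taken from its docs.
    import rospy
    from std_msgs.msg import Float32MultiArray

    def on_objects(msg):
        for i in range(0, len(msg.data), 12):
            obj_id = int(msg.data[i])
            rospy.loginfo('object %d detected (%.0fx%.0f px template)',
                          obj_id, msg.data[i + 1], msg.data[i + 2])

    rospy.init_node('object_watch')
    rospy.Subscriber('objects', Float32MultiArray, on_objects)
    rospy.spin()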
Astro has quite a few other features that look really interesting, like a camera on a periscope, but I think that is a good start for now and I should probably wrap up this article. Stay tuned for my next submission, already in progress, on how to set up a simulated robot in ROS with most of the aforementioned packages.
As I finish this, I am now getting "Have you seen this???" messages from coworkers and friends in regard to a flame-throwing robotic quadruped: specifically, the Thermonator from a company called ThrowFlame [14]. The price tag, at the time of writing, is just under $10,000 USD. Note that the ARC flamethrower can be purchased separately for under $1,000 USD.
Now I am thinking about how I could mount one to my Create 3. If I did, it would definitely need a catchy name like ThermoRoomba or Thermoomba.
References
1. How Amazon is Enhancing Astro for the Home and Beyond
2. Introducing the Create 3 Educational Robot
3. Robot Operating System
4. Open Robotics
5. A-UGV Capabilities - Recommended Guide to Autonomy Levels
6. teleop_twist_keyboard
7. teleop_twist_joy
8. move_base
9. rviz
10. gmapping
11. explore_lite
12. apriltag_ros
13. find_object_2d
14. ThrowFlame