At the recent AWS re:Invent 2018 conference, Amazon introduced AWS RoboMaker, a service that simplifies the development, testing and deployment of intelligent robotic applications at scale. RoboMaker includes extensions for the Robot Operating System (ROS) that enable cloud connectivity to AWS to take advantage of machine learning, cognitive, monitoring and analytics services.
Traditionally, it has been difficult to build robotic applications due to fragmented tooling for development, simulation and fleet management. This friction has created many inefficiencies. Roger Barga, general manager at AWS RoboMaker, explains:
Building a robotics application and addressing these tooling challenges take time. These delays offer little value to the application that the robotics developer is trying to build; we call this undifferentiated heavy lifting at AWS. The problem is that these challenges leave very little time for the developer to build intelligent robotic applications. Using AWS RoboMaker, developers have more time to innovate in robotics and their applications.
Network connectivity has traditionally been a barrier for robotic applications to consume cloud-based services. However, Barga states this is no longer the case:
Network connectivity is becoming more pervasive and is improving. Connected devices can communicate with the cloud with millisecond latency and that is dropping all the time. This is a game changer for robotics because a developer is no longer limited to the physical and software resources on the device itself.
Amazon created AWS RoboMaker by extending ROS with cloud extensions for AWS services. These extensions are bundled into ROS packages of the kind existing ROS developers are already familiar with. When developers include these packages in their development environment, they unlock access to many Amazon cloud services (a short code sketch follows the list below), including:
- Amazon Lex – speech recognition
- Amazon Polly – text-to-speech generation
- Amazon Kinesis Video Streams – securely streaming video for analytics and machine learning
- Amazon Rekognition – image and video analysis, including object identification
- Amazon CloudWatch – logging and monitoring
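As a rough illustration of what a cloud-connected ROS node can look like, the sketch below subscribes to a text topic and sends each message to Amazon Polly for speech synthesis. It calls Polly through boto3 directly rather than through the AWS-provided ROS packages, and the topic name, voice, and output path are illustrative assumptions.

```python
#!/usr/bin/env python
# Illustrative only: a minimal ROS node that forwards text to Amazon Polly
# using boto3 directly, rather than the AWS-provided ROS packages.
import boto3
import rospy
from std_msgs.msg import String

polly = boto3.client("polly")  # assumes AWS credentials are configured on the robot


def on_text(msg):
    # Synthesize the incoming text and write the audio to disk;
    # a real robot would hand this off to its audio playback stack.
    response = polly.synthesize_speech(
        Text=msg.data, OutputFormat="mp3", VoiceId="Joanna"
    )
    with open("/tmp/robot_speech.mp3", "wb") as f:
        f.write(response["AudioStream"].read())
    rospy.loginfo("Synthesized %d characters of speech", len(msg.data))


if __name__ == "__main__":
    rospy.init_node("polly_speech_node")
    rospy.Subscriber("/speech/text", String, on_text)  # hypothetical topic name
    rospy.spin()
```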
Accessing AWS cloud services enables use cases such as talking to a robot and having it understand what you are saying; the robot can then respond, asking for further information or acknowledging commands. The robot can also stream LIDAR, radar and camera data to the cloud, where it can be analyzed. In addition, objects can be identified through the cognitive services offered by Rekognition, and CloudWatch can monitor and log the state of all of your robots.
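To make the object-identification scenario concrete, here is a minimal sketch of a ROS node that forwards compressed camera frames to Amazon Rekognition. The camera topic and confidence threshold are assumptions, and a production node would throttle how often frames are sent to control cost and latency.

```python
#!/usr/bin/env python
# Illustrative only: sending compressed camera frames to Amazon Rekognition
# for object detection; the topic name and thresholds are assumptions.
import boto3
import rospy
from sensor_msgs.msg import CompressedImage

rekognition = boto3.client("rekognition")


def on_frame(msg):
    # CompressedImage.data is already JPEG-encoded, so it can be sent as-is.
    result = rekognition.detect_labels(
        Image={"Bytes": bytes(msg.data)}, MaxLabels=5, MinConfidence=80
    )
    names = [label["Name"] for label in result["Labels"]]
    rospy.loginfo("Objects in view: %s", ", ".join(names))


if __name__ == "__main__":
    rospy.init_node("rekognition_node")
    # Subscribe to a (hypothetical) compressed camera topic.
    rospy.Subscriber("/camera/image/compressed", CompressedImage, on_frame, queue_size=1)
    rospy.spin()
```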
Beyond the cloud extensions for ROS, AWS RoboMaker is broken up into three additional areas: a development environment, simulation, and fleet management.
Image source: https://aws.amazon.com/robomaker/
The development environment allows teams to configure compute and storage and to download ROS for the robot they are about to build. From the development environment, developers can see all of the available ROS packages and get started right away. Amazon claims teams can get set up in minutes.
Scalable simulators allow developers to test their applications before hardware is selected. Pre-built worlds, including homes, retail stores and race tracks, are available, and developers can also load their own environments. Simulations can be run in multiple configurations in parallel, making it possible to run thousands of hours of simulation in a single hour.
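As a rough sketch of how such parallel simulations might be launched programmatically, the snippet below starts one RoboMaker simulation job per world using the boto3 robomaker client. The ARNs, package names, launch files and world names are placeholders, and additional parameters such as an output location may be required depending on the account setup.

```python
# Illustrative only: launching several RoboMaker simulation jobs in parallel.
# All ARNs, package names and launch files below are placeholders.
import boto3

robomaker = boto3.client("robomaker")

worlds = ["small_house.launch", "bookstore.launch", "racetrack.launch"]

for launch_file in worlds:
    robomaker.create_simulation_job(
        maxJobDurationInSeconds=3600,
        iamRole="arn:aws:iam::123456789012:role/robomaker-simulation-role",
        simulationApplications=[{
            "application": "arn:aws:robomaker:us-east-1:123456789012:simulation-application/my-sim-app/1",
            "launchConfig": {"packageName": "my_simulation", "launchFile": launch_file},
        }],
        robotApplications=[{
            "application": "arn:aws:robomaker:us-east-1:123456789012:robot-application/my-robot-app/1",
            "launchConfig": {"packageName": "my_robot", "launchFile": "navigation.launch"},
        }],
    )
```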
Fleet management is available so administrators can manage their robots from a single management plane using over-the-air installations and updates. These fleet management capabilities are delivered through AWS IoT Greengrass, which includes a registry of all robots and provides fault tolerance when pushing updates. In addition, Greengrass enforces security to prevent others from installing unauthorized software on robots.
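For illustration, an over-the-air deployment to a registered fleet might be triggered with a call like the one below. This assumes the boto3 robomaker client's create_deployment_job operation; the fleet and application ARNs, version and launch configuration are placeholders.

```python
# Illustrative only: triggering an over-the-air deployment to a RoboMaker fleet.
# The ARNs, version and launch configuration below are placeholders.
import boto3

robomaker = boto3.client("robomaker")

robomaker.create_deployment_job(
    fleet="arn:aws:robomaker:us-east-1:123456789012:deployment-fleet/my-fleet/1",
    deploymentApplicationConfigs=[{
        "application": "arn:aws:robomaker:us-east-1:123456789012:robot-application/my-robot-app/1",
        "applicationVersion": "1",
        "launchConfig": {
            "packageName": "my_robot",
            "launchFile": "navigation.launch",
        },
    }],
    # Roll out gradually and stop if too many robots fail to update.
    deploymentConfig={
        "concurrentDeploymentPercentage": 20,
        "failureThresholdPercentage": 10,
    },
)
```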
Amazon has customers that have already implemented RoboMaker, including Robot Care Systems, Future Robots and Advanced Robot Solutions, all of which are using these services to add more intelligence to their robots. Black & Decker and NASA JPL are using simulation today for more agile development and to test real-world applications like inspection drones and the NASA Mars rover. In addition, NASA has open-sourced its Mars rover implementation (pictured below), allowing other organizations to build their own robot using AWS RoboMaker.
Image source: (screenshot) https://www.youtube.com/watch?v=sjxZAdm1utM
Amazon is also contributing to open source by releasing all of the cloud extensions and documentation it has written. In addition, Amazon has joined the ROS2 Technical Steering Committee (TSC) as a partner.