BotChocolate

Computer Vision, OpenCV, ROS2, Python, Motion Planning, MoveIt, Intel RealSense, Franka Emika Robot Arm

Overview

The purpose of this project was to make hot chocolate using the Franka Emika 7-DOF robot arm. To perceive its environment, the system uses an Intel RealSense D435i camera and AprilTags. Upon initial setup, a calibration process must be completed. After this, AprilTags are used to locate a scoop of cocoa, a mug, a spoon, and a kettle relative to the robot. Next, using movebot, our custom Python API for MoveIt, the robot performs path planning between all of these objects. It turns on the kettle, dumps the cocoa powder into the mug, pours the hot water over the powder, and stirs the mixture with the spoon.

MoveIt API

Because the ROS2 MoveIt package did not yet have a Python API, creating one was the first step. By examining MoveIt's ROS2 actions and services, we were able to create an API that takes either Cartesian coordinates for the Franka end-effector or joint-state positions for each joint and plans a path to the specified position. It also allows the caller to request a Cartesian (straight-line) path.
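The front end of such an API might look like the following sketch. The names here (`MoveBot`, `plan_to_pose`, `plan_to_joints`) are illustrative assumptions, not the actual movebot interface; in the real node these requests are filled into MoveIt's planning actions and services.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Pose:
    """Cartesian end-effector goal: position (m) and quaternion orientation."""
    x: float
    y: float
    z: float
    qx: float = 0.0
    qy: float = 0.0
    qz: float = 0.0
    qw: float = 1.0

@dataclass
class PlanRequest:
    """What the API would hand to the MoveIt planning action."""
    mode: str                       # "pose", "joints", or "cartesian"
    pose: Optional[Pose] = None
    joints: Optional[List[float]] = None

class MoveBot:
    """Hypothetical sketch of the planning front end."""

    def plan_to_pose(self, pose: Pose, cartesian: bool = False) -> PlanRequest:
        # A Cartesian request asks for a straight-line end-effector path;
        # otherwise the default joint-space planner is used.
        return PlanRequest(mode="cartesian" if cartesian else "pose", pose=pose)

    def plan_to_joints(self, joints: List[float]) -> PlanRequest:
        # Joint-space goal: one target angle per joint (7 for the Franka).
        return PlanRequest(mode="joints", joints=list(joints))
```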

Vision

The vision system consists of two major parts: calibration and component location. AprilTags are used to detect where each of the hot-chocolate-making components is. The AprilTags are located using the apriltag_ros package.

There is one tag fixed to the kettle and one tag fixed to a jig that has slots for a mug, cocoa scooper, and spoon. Once these tags are found in the camera frame, there are two transformation trees: one describing where all the hot chocolate components are relative to the camera, and one describing where each link of the Franka robot arm is relative to the robot's base. However, we need to connect these trees to get the location of each component relative to the robot's base. This is where the calibration is used. To calibrate, an AprilTag must be placed in the robot's gripper, aligned with the gripper's coordinate frame and in view of the camera. We then wrote a program that finds the location of the tag and relates it to the position of the base, connecting the robot frame to the camera frame. We save this transformation to a .yaml file to be used later when running the main hot-chocolate-making sequence.
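The frame-chaining step behind this calibration can be written out with 4×4 homogeneous transforms. This is a minimal sketch in plain Python (no ROS), assuming we already have the tag pose in the camera frame and the gripper pose in the base frame; since the calibration tag is aligned with the gripper frame, the camera-to-base transform follows by composition: T_base_cam = T_base_gripper · inv(T_cam_tag).

```python
from typing import List

Mat = List[List[float]]  # 4x4 homogeneous transform as nested lists

def matmul(a: Mat, b: Mat) -> Mat:
    """4x4 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(t: Mat) -> Mat:
    """Invert a rigid transform: R -> R^T, p -> -R^T p."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]          # R transposed
    p = [-sum(r[i][k] * t[k][3] for k in range(3)) for i in range(3)]
    out = [r[i] + [p[i]] for i in range(3)]
    out.append([0.0, 0.0, 0.0, 1.0])
    return out

def camera_from_base(T_base_gripper: Mat, T_cam_tag: Mat) -> Mat:
    """T_base_cam = T_base_gripper * inv(T_cam_tag), assuming the tag
    frame coincides with the gripper frame during calibration."""
    return matmul(T_base_gripper, invert_rigid(T_cam_tag))
```

In the actual system these poses come out of the TF trees; tf2 can also do this composition directly once the two trees share a common frame.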

After obtaining the transformation from the camera to the robot, the robot can now use the locations of the two AprilTags to locate and manipulate the hot chocolate components as necessary.
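The calibration saved to disk and reloaded at run time might be laid out like the following .yaml fragment; the key names and values here are illustrative assumptions, not the project's actual file format:

```yaml
# camera_calibration.yaml (illustrative layout)
base_to_camera:
  translation: {x: 0.42, y: -0.18, z: 0.55}            # meters, in the robot base frame
  rotation:    {x: 0.0, y: 0.0, z: 0.707, w: 0.707}    # quaternion
```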

Motion Planning

Once the locations of all the components are known, it's time to make hot chocolate. Cartesian paths are used to move the robot in straight lines, and rotational path planning is used for motions like tilting to grab the scoop and dumping the cocoa. Both Cartesian and rotational motion are combined to achieve pouring. To make a cup of hot chocolate, the robot first turns on the kettle to heat the water, grabs the cocoa scoop, dumps the cocoa into the mug, pours the hot water over the powder, and stirs. We also used several intermediate waypoints to help keep the robot away from its joint limits.
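The overall routine can be sketched as an ordered list of named steps, each tagged with the motion style used for it. Everything here (the step names, the `run_sequence` helper) is illustrative; the real system issues these steps as MoveIt planning requests, with extra intermediate waypoints between them:

```python
from typing import Callable, List, Tuple

# Each step pairs a human-readable name with the planning style used for it:
# "cartesian" = straight-line path, "rotational" = orientation change,
# mirroring the two motion types described above.
SEQUENCE: List[Tuple[str, str]] = [
    ("press_kettle_switch", "cartesian"),
    ("tilt_to_grab_scoop",  "rotational"),
    ("carry_scoop_to_mug",  "cartesian"),
    ("dump_cocoa",          "rotational"),
    ("grab_kettle",         "cartesian"),
    ("pour_water",          "rotational"),
    ("grab_spoon",          "cartesian"),
    ("stir",                "cartesian"),
]

def run_sequence(execute: Callable[[str, str], None]) -> List[str]:
    """Run each step in order, returning the names that were executed.
    `execute` stands in for a call into the motion-planning API."""
    done = []
    for name, motion_type in SEQUENCE:
        execute(name, motion_type)
        done.append(name)
    return done
```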

The Team

  • Shantao Cao
  • Allan Garcia-Casal
  • Nicholas Marks
  • James Oubre
  • David Dorf
