Starbax

Northwestern University: Embedded Systems in Robotics

Technical Skills

  • ROS (including custom messages & services)
  • Python
  • Inverse Kinematics & motion planning
  • Image processing & object detection with the OpenCV library
  • Augmented Reality Tags
  • Collaborative GitHub usage

Short Summary

In the video below, Baxter begins the coffee-making process by picking up the cup and placing it into the Keurig. Next, it raises the lid, picks up the K-cup, and places it inside. Finally, it closes the lid and removes the cup from the Keurig.

My contribution to the project was programming Baxter to differentiate between the three items on the table. To do this, blue, yellow, and pink Post-it notes were placed next to the cup, Keurig, and K-cup respectively. At program startup, Baxter raises its left arm (which has a camera at the end) to obtain a top-down view of the table. Using color segmentation, the centroid of each Post-it note is found in the camera frame, which establishes the relative order of the items. I then linked each item to its world-frame equivalent (obtained from a teammate's AR-tag code) by comparing the ordering of the world-frame centroids with the ordering of the camera-frame centroids.

To learn more about the project, click here to see the GitHub repo and its README.


Demo Video