Draper Labs / Olin College Robotics Lab

As a member of the Olin College Robotics Lab for 3 years, 2 of them as its technical lead, I took part in various projects, primarily helping lead GATOR. During this tenure, a concurrent 8-10 month UROP with Draper Labs was active, with Draper sponsoring the project and having us contribute directly to their related work.

GATOR:
GATOR is a John Deere 850D Gator retrofitted into an autonomous vehicle used for off-road dataset collection. Over the course of 2 years as technical lead, I helped oversee mechanical maintenance and improvements to both the vehicle and the autonomy system, installed various sensors, and designed the interior wiring plans connecting all the components to their corresponding controllers/PCs. I also helped write the Sense-Think-Act architecture for the project and helped lead the software team after initial autonomy levels were met.

Below is a video from one of our autonomy tests out in Parcel B, the privately owned woodland of Olin College of Engineering where this work took place.

GATOR System:

Below is a full-system diagram of the GATOR Autonomous System. Not all of these components were always in use for autonomy, but all were functional; some were intended just for data collection. The system operated on a fore-brain, mid-brain, hind-brain architecture: the fore-brain was the interfacing computer that ran missions and sent encoded commands to the mid-brain, a NUC 8 Extreme, which had immediate control of the microcontrollers (hind-brain) and sensors. The client/operator interface was intended for extended customer use, such as interfacing Draper’s system with ours.
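As a rough illustration of the fore-brain to mid-brain hand-off, the sketch below shows a length-prefixed command frame in Python. The actual GATOR wire format is not documented here, so the field names and framing are illustrative assumptions only.

```python
import json
import struct

def encode_command(cmd: dict) -> bytes:
    """Serialize a mission command as a length-prefixed JSON frame
    (hypothetical format; the real encoding may differ)."""
    payload = json.dumps(cmd).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def decode_command(frame: bytes) -> dict:
    """Inverse of encode_command; checks the length prefix."""
    (length,) = struct.unpack(">I", frame[:4])
    payload = frame[4:4 + length]
    assert len(payload) == length, "truncated frame"
    return json.loads(payload.decode("utf-8"))

# Example: a waypoint command with assumed field names
frame = encode_command({"mode": "waypoint", "lat": 42.2926, "lon": -71.2644, "speed_mps": 1.5})
```

A length prefix like this lets the mid-brain read complete commands off a stream socket without guessing at message boundaries.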

The software stack for V1 autonomy was based on a MATLAB system on the mid-brain computer, with operator command input over the network. The Arduino Portenta ran custom C++ code for all the attached sensors and safety checks. This version of the system primarily favored a GPS-based approach, with LIDAR and sonar collision detection.

V2 of the vehicle, built after autonomy was proven, modernized the stack to a Python3 and ROS2 approach, with an updated Ackermann steering model and proper URDFs of the vehicle for basic simulation tasks.
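For context on the steering model, the sketch below computes the inner/outer front-wheel angles from a single commanded (bicycle-model) steering angle per the Ackermann condition. The wheelbase and track values are placeholders, not the GATOR's actual measurements.

```python
import math

WHEELBASE_M = 1.9  # assumed front-to-rear axle distance (placeholder)
TRACK_M = 1.2      # assumed left-right wheel separation (placeholder)

def ackermann_wheel_angles(steer_angle_rad: float):
    """Return (inner, outer) front-wheel angles for a commanded
    bicycle-model steering angle. The inner wheel turns tighter
    so both wheels sweep circles about a common center."""
    if abs(steer_angle_rad) < 1e-9:
        return 0.0, 0.0
    # Rear-axle turn radius implied by the bicycle model
    turn_radius = WHEELBASE_M / math.tan(abs(steer_angle_rad))
    inner = math.atan(WHEELBASE_M / (turn_radius - TRACK_M / 2))
    outer = math.atan(WHEELBASE_M / (turn_radius + TRACK_M / 2))
    sign = math.copysign(1.0, steer_angle_rad)
    return sign * inner, sign * outer
```

In a ROS2 stack this mapping typically sits between the commanded steering angle and the per-wheel joint targets used by the URDF-based simulation.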

Sensors:

  • Color Camera (Front-Facing)

  • Sick Lidar (2x)

  • RTK GPS

  • IMU

  • URM06 Sonar (4x)

Actuators:

  • Lidar Tilt motors (2x)

  • Steering Motor

  • Brake/Throttle Actuators

Computers:

  • NUC 11 / hooked-in laptop - Fore-brain

  • NUC 8 Extreme - Mid-brain

  • Arduino Portenta H7 - Hind-brain

  • 30A Roboclaws (3x) - Motor Controllers

Notable Projects I Did:

  • Custom interface scripting from Arduino Portenta to Roboclaws

  • Network interfacing from Fore-brain to Hind-brain

  • Electrical wiring & Design of whole system

  • Installation of a 3rd-party proprietary power-steering kit & custom electrical interfacing for it

  • Autonomy/Sense-Think-Act Stack

  • Double RTK GPS Interpolation
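On the double RTK GPS interpolation: two fixed antennas give not just a position but also a vehicle heading. The sketch below derives a midpoint pose and heading from front and rear fixes using a flat-earth local approximation; the antenna layout and function names are illustrative assumptions, not the project's actual code.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, spherical approximation

def dual_rtk_pose(front, rear):
    """front/rear are (lat_deg, lon_deg) RTK fixes. Returns
    (mid_lat, mid_lon, heading_deg), heading clockwise from north."""
    lat_f, lon_f = front
    lat_r, lon_r = rear
    mid_lat = (lat_f + lat_r) / 2
    mid_lon = (lon_f + lon_r) / 2
    # Local tangent-plane offsets, rear -> front, in meters
    dlat = math.radians(lat_f - lat_r) * EARTH_RADIUS_M
    dlon = math.radians(lon_f - lon_r) * EARTH_RADIUS_M * math.cos(math.radians(mid_lat))
    heading = math.degrees(math.atan2(dlon, dlat)) % 360
    return mid_lat, mid_lon, heading
```

Interpolating between two centimeter-accurate fixes this way gives a heading estimate that does not drift the way an IMU-only heading would.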

Draper Labs:
During a concurrent UROP with Draper Labs as our sponsor, we helped with various tasks for similar UAV autonomous testing on their system in Parcel B. For part of this time I was also embedded in their team as a part-time intern for a few months, where I helped reformat and write some of the unit tests for their UAV simulation system.

Due to NDA restrictions, please contact me directly for further inquiries.
