Update from Olin College: Spring Semester, Snow, and SnotBot

If you haven’t yet heard about SnotBot, it is an ongoing partnership between Ocean Alliance and Olin College of Engineering in Needham, MA. The goal – to create a robotic research assistant that can safely and efficiently collect whale blow on field research voyages – has been tackled by several groups of research students over the last year. The fleet, a set of small multirotor drones affectionately named SnotBots, is equipped with various sensors to run human-programmed missions or ‘think’ for itself during autonomous missions.

Throughout the fall semester, the SnotBot team at Olin College worked on getting a new team up to speed and setting up for this semester. We spent those twelve weeks gathering documentation sources, writing papers, downloading new software, redesigning SnotBot landers, outfitting SnotShot with sensors – the works!

Now, the team is in a place to hit the ground running this semester with the following goals in mind:

  • Develop reliable remote control systems (so a human pilot may override the autonomy at any time)
  • Develop reliable point-to-point mission navigation (so a SnotBot can be told where to go, and actually get there to collect data)
  • Develop a first round of visual navigation systems (so a SnotBot can look around and identify something interesting to navigate toward)
  • Create a waterproof gimbal housing
  • Create a launcher/lander mechanism (so when launching from or landing on a boat, the SnotBot can reliably/accurately take off and land without human assistance)

Since the start of the semester, the team has set up a new ground control station, which runs on any laptop with a Windows operating system and uses a joystick controller – now flying the drones will be a lot like flying in a simulator, or piloting a starship in a video game. Mission Planner, by ArduPilot, takes in data from the SnotBot brain and sends back control signals during flight. The team can write its own missions, control signals, or commands within the program – or, for more control and accuracy, in self-authored Python scripts. Benchtop tests of a program to launch a SnotBot, hover, and land are promising.
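For a sense of what a scripted launch–hover–land test involves, here is a minimal illustrative sketch. The names (`MissionStep`, `build_hover_mission`) and the 3 m / 10 s values are assumptions for illustration, not the team’s actual scripts, which would forward these commands to the flight controller through Mission Planner:

```python
# Hypothetical sketch of a launch-hover-land mission sequence.
# All names and values here are illustrative assumptions; a real
# script would send each step to the autopilot as a flight command.

from dataclasses import dataclass

@dataclass
class MissionStep:
    command: str       # "takeoff", "hover", or "land"
    altitude_m: float  # target altitude in meters
    duration_s: float  # how long to hold this step

def build_hover_mission(hover_alt_m=3.0, hover_time_s=10.0):
    """Return the three-step sequence a benchtop hover test exercises."""
    return [
        MissionStep("takeoff", hover_alt_m, 0.0),
        MissionStep("hover",   hover_alt_m, hover_time_s),
        MissionStep("land",    0.0,         0.0),
    ]

def run_mission(steps, send):
    """Feed each step to `send`, a callable that forwards commands
    to the flight controller (here it could simply record them)."""
    for step in steps:
        send(step)

# Record the mission instead of flying it, for a quick sanity check.
log = []
run_mission(build_hover_mission(), log.append)
print([s.command for s in log])
```

Structuring the mission as data, separate from the code that transmits it, makes it easy to check the sequence on the bench before it ever runs on a drone.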

Views of our benchtop test location, and our new ground control station running Mission Planner by ArduPilot, our self-authored Python scripts, and interfacing with a normal joystick controller.

As the snow fell on New England, the team gained two new software members, who will be working on computer vision tasks and communications protocols. The computer vision team has already used computer vision packages to identify QR codes, which we will use as fiducials – signposts for the SnotBot – during point-to-point navigation tests using the cameras mounted to the chassis.

Team member Jay (‘17) holds up a QR code for identification as Victoria (‘16) snaps a quick photo. The lines you see connect matching keypoints on the QR code. These will later be used to help determine the angle, distance, and orientation to the fiducials on the ground during flights.

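Once matching keypoints locate a fiducial’s corners in an image, simple pinhole-camera geometry can recover an approximate range and bearing to it. The sketch below is a hypothetical illustration – the focal length and printed tag size are made-up placeholder values, not the SnotBot cameras’ actual parameters:

```python
import math

# Hypothetical pinhole-camera sketch: estimating range and bearing to a
# QR-code fiducial from its apparent size and position in the image.
# These constants are illustrative assumptions, not real camera specs.

FOCAL_PX = 800.0    # assumed focal length, in pixels
TAG_SIDE_M = 0.20   # assumed printed QR code side length, in meters

def distance_to_tag(side_px):
    """Pinhole model: distance = focal * real_size / apparent_size."""
    return FOCAL_PX * TAG_SIDE_M / side_px

def bearing_to_tag(center_x_px, image_width_px):
    """Horizontal angle (radians) from the camera axis to the tag."""
    offset_px = center_x_px - image_width_px / 2.0
    return math.atan2(offset_px, FOCAL_PX)

# A tag that appears 80 px wide under these assumptions is 2.0 m away.
print(round(distance_to_tag(80.0), 2))
```

The same idea extends to orientation: comparing how the four corners are skewed relative to a square reveals the angle at which the camera views the tag.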

To protect those cameras, our mechanical team is wrapping up last semester’s design work on a waterproof gimbal mount that can be fitted to any chassis with minor modification. Right now, the gimbal is ready for some dunk tests, and SnotBot Gray is up for modification. New legs will be reprinted for Gray to accommodate the size of the new gimbal housing.

The ‘Bubble’ that will protect the cameras on future SnotBots.

Looking ahead to the next few weeks, expect some videos of autonomous test flights, flyovers with our SnotShot, new sensors, new SnotBot fleet members, and more!