As you are all aware, these trips, by the nature of what we hope to achieve, are always an adventure. This trip has been about the most challenging I have ever done: a live TV show, a partnership with Intel, and testing four new drone setups and two new drones in Alaska!?!
The whale gods have been on our side, and we had another first yesterday: we collected snot from orca whales using a DJI Mavic Pro – just amazing. Even I had doubts that we could do this, but there was snot in the dish, so that's another first for the Parley SnotBot.
Our little boat was certainly full with the Parley SnotBot team, the Alaska Whale Foundation, and the Intel team on board, but what a great group of people.
Following is a blog post from our newest friend Ted Willke, Senior Principal Engineer and Director of the Mind's Eye Lab at Intel. Our thanks also go out to Bryn Keller and Javier Turek.
Coming into a new collaboration is never easy, especially when it involves subjects as diverse as marine biology, drones, and artificial intelligence. But it’s a necessary dance if you want the kind of innovation we’re after — the kind that happens when diverse scientific fields collide.
This is the dance our Intel Labs team found itself in with Parley and Ocean Alliance this past month. Javier Turek, Bryn Keller, and I were introduced to Iain Kerr and his team by Parley for the Oceans at the World Oceans Day conference at the UN. The question posed to us was: How can artificial intelligence (AI) advance Ocean Alliance’s mission and whale biology … in the next 30 days??!! We soon realized that we had a lot to learn about whales and Parley SnotBots, and fast! But it was also immediately clear that our research team had a lot to offer.
Simply put, AI technology aspires to imbue machines with cognitive skills, like visual recognition. AI-equipped Parley SnotBots would clearly be a game changer for OA, even given OA's existing game-changing techniques. Today's Parley SnotBots have cameras that are used for piloting and video capture. But there's a strong desire to do more with this data. As Fred Sharpe put it, "In the age of modern sensors, we're in a data maelstrom. The real action is in the downstream processing." And we knew that the right AI could take it on.
The stage was set, and we had two seriously ambitious goals: 1) finding a way to identify whales using images transmitted by Parley SnotBots, and 2) calculating a whale’s relative body composite index, a measure of its energy reserves and condition, from streaming video. Our team, with its background in computing and machine learning techniques, felt up to the task.
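To give a flavor of what the first goal involves (the post doesn't describe our actual algorithm, so this is a minimal, hypothetical sketch): one common approach to individual ID is to reduce each photo to a numeric feature vector (an "embedding") and match a new image against a catalog of known whales by similarity, accepting the best match only above a confidence threshold. The embeddings, catalog names, and threshold below are all illustrative assumptions, not our production system.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def identify_whale(query_embedding, catalog, threshold=0.8):
    """Return the catalog ID whose embedding best matches the query,
    or None if no match clears the confidence threshold."""
    best_id, best_score = None, threshold
    for whale_id, embedding in catalog.items():
        score = cosine_similarity(query_embedding, embedding)
        if score > best_score:
            best_id, best_score = whale_id, score
    return best_id

# Toy catalog of previously identified whales (vectors are made up).
catalog = {
    "humpback_A": [0.9, 0.1, 0.2],
    "humpback_B": [0.1, 0.8, 0.5],
}

# A new drone image, already reduced to a feature vector.
query = [0.85, 0.15, 0.25]
print(identify_whale(query, catalog))  # → humpback_A
```

In practice the embedding would come from a trained vision model and the catalog from years of fluke photos, but the matching step reduces to this kind of nearest-neighbor comparison.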
But you’ve got to understand: these are tough tasks for AI even with ideal data and carefully groomed algorithms running on machines back in our lab. So trying to solve such problems in a completely uncontrolled environment (weather, water, whales, drones) on a small ship out on the rough sea is nuts. Trying to get something together in four weeks for the National Geographic Earth Live broadcast — TOTALLY BANANAS!
There were other difficulties that we didn't completely understand until we got to Kake, Alaska. Ships like the Alaska Whale Foundation's Paula T are like drone aircraft carriers. The pace is fast and the space is cramped. Any new technology has to be unobtrusive, field-friendly, and fuss-free. Otherwise, it's going overboard!
To complicate things further, we were still hacking code as we arrived in Kake. We had never run the whale ID algorithm on images taken by a drone or fully validated the volumetrics analysis. We really didn’t know if this stuff worked. And we had never integrated our systems with Iain’s.
We figured out how to set up what amounted to a small computer lab on a ship we’d never seen. The Earth Live dress rehearsal on July 8 came and went. We continued to sweat it out. With the Nat Geo team breathing down our necks, we hacked and hacked.
Then it was show time. I won’t recap the gut-wrenching Earth Live affair since Iain described it in his recent post. But I will say that it was one of the most harrowing adventures I’ve ever experienced. We wrapped up our development as the show began. By the end, Iain’s team had pulled off a Parley SnotBot collection miracle and our algorithms had made a positive ID on the same whale before the drone landed — a scientific first! (See photo at beginning of post)
You’d think things would’ve let up after the Nat Geo team said goodbye, but they haven’t. With the show out of the way, we immediately returned to the primary research mission and our associated computer science research. With just a few days to collect the data needed to power the research for the next few months (and before the next expedition!), it has not been easy. As Bryn put it, “Writing code while tracking whales around at 35 knots (on occasion) over rough waters (a lot) is really interesting.”
Even though it's been a grind and a major adrenaline dump, our enthusiasm remains undimmed. How could it be otherwise, with whales breaching, lunge feeding, and checking out our boat? Quoting Javier as he watched a humpback lunge feed in Keku Strait just a hundred feet from our boat on the morning of our last day, "This is FREAKING AWESOME!!!"
We’ll be back and we’ll be packing more AI when we do.
Best Fishes from Alaska.