Exploring the ocean depths with a machine-learning robot
Kakani Katija grew up wanting to be an explorer and dreamed of becoming an astronaut. She studied aerospace engineering and then interned at NASA, but her dreams changed when she became fascinated with a place even less well explored than space: Earth's oceans.
The ocean represents the largest habitable ecosystem on the planet, yet less than 5% has been explored, and it is estimated that nearly half of all marine species have yet to be described. Katija is working on new autonomous robotic systems that could help solve this problem.
Katija's ideas about how to develop an autonomous robot began when she was in graduate school. She collaborated with marine biologists Jack Costello at Providence College and Sean Colin at Roger Williams University on a project funded by the U.S. National Science Foundation. The team was interested in jellyfish propulsion and what it might mean for underwater vehicle development. Katija's aerospace engineering background brought a fresh perspective to the project, and the diversity of the team's expertise proved invaluable to the success of the research.
At that time, the researchers faced a major challenge: there was no way to attach an electronic tag to a soft-bodied marine animal. This limitation meant that scientists could study only large, hard-bodied vertebrates such as fish and whales and, as a result, were missing a crucial part of the picture of marine life.
Katija became motivated to find ways to track and understand marine species that weren't large or rigid enough to carry an electronic tagging package. She has been able to advance her research with an NSF award to develop computer vision algorithms that lock onto moving marine species, coupled with control algorithms that pilot the vehicle to follow them. Through this combination, Katija, together with Dana Yoerger, a senior scientist at Woods Hole Oceanographic Institution, and other colleagues, created Mesobot, a robotic vehicle that uses stereo cameras to follow and record slow-moving marine creatures autonomously.
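The coupling of vision and control described above can be illustrated with a toy sketch: the camera reports where the animal sits in the image, and a simple proportional controller commands thrust to re-center it. All names, gains, and the vehicle model here are illustrative assumptions, not the actual Mesobot software.

```python
# Toy sketch of a camera-guided "follow" loop, in the spirit of a
# vision-plus-control tracker. Gains and frame geometry are invented.

def follow_step(target_px, frame_center=(320, 240), gain=0.005):
    """Turn the pixel offset of the tracked animal into thrust commands
    that push the vehicle so the animal drifts back toward frame center."""
    dx = target_px[0] - frame_center[0]
    dy = target_px[1] - frame_center[1]
    # Positive dx means the animal has drifted right in the image,
    # so command rightward thrust (and likewise for dy).
    return (gain * dx, gain * dy)

# Simulate a few frames with a toy vehicle response: each command
# moves the animal's image position partway back toward center.
pos = (400, 300)
for _ in range(50):
    ux, uy = follow_step(pos)
    pos = (pos[0] - ux * 100, pos[1] - uy * 100)
# After repeated steps the animal is held near the frame center.
```

Real trackers use stereo depth and far more robust detection, but the closed loop — measure offset, command motion, repeat — is the core idea.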
However, smarter algorithms using artificial intelligence were needed to identify, understand and search for animals completely autonomously.
AI in the marine environment
These efforts to teach an AI system to identify and follow elusive marine species required more than computer programmers, engineers and marine technology specialists. Also needed were marine biologists who could identify species, along with labeled images of those species to train the AI system. Success was only possible thanks to a community of collaborators from different fields with different skills, and to distributed labeled datasets such as FathomNet, which Katija and collaborators worked to build. Developing this species-identifying AI cleared a major hurdle in the development of a completely autonomous underwater research robot.
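The role of labeled datasets like FathomNet can be illustrated with a stripped-down sketch: a classifier can only learn species labels it has seen during training. This nearest-centroid toy stands in for the real deep-learning models, and the two-number "features" and species names are invented stand-ins for image embeddings and FathomNet labels.

```python
# Toy illustration of training on labeled data: average the features
# seen for each species, then label new sightings by the nearest average.

def train(labeled_examples):
    """Compute one centroid (average feature vector) per species label."""
    sums, counts = {}, {}
    for features, label in labeled_examples:
        s = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in s] for lab, s in sums.items()}

def predict(model, features):
    """Assign the label whose centroid is closest to the new observation."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(features, centroid))
    return min(model, key=lambda lab: dist(model[lab]))

# Hypothetical labeled data: (feature vector, species label) pairs.
data = [([0.1, 0.2], "siphonophore"), ([0.2, 0.1], "siphonophore"),
        ([0.9, 0.8], "larvacean"), ([0.8, 0.9], "larvacean")]
model = train(data)
label = predict(model, [0.15, 0.15])  # -> "siphonophore"
```

The point of the sketch is that without biologists supplying the labels, there is nothing for any model, however sophisticated, to learn from.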
This success resulted in additional support from NSF through the Networked Blue Economy awards, grants designed to foster innovative approaches to solve ocean problems by bringing together diverse communities. The next stage of the work, the Ocean Vision AI project, includes not just engineers and marine biologists but also video game designers, photographers, media specialists and ocean enthusiasts. This group of experts will train the AI system to not just identify, but also autonomously follow, marine animals.
If the Ocean Vision AI system is successful, it will allow autonomous underwater robots to boldly go where no one has gone before to discover the secrets of elusive marine species in a way that previously would have been impossible.