This post originally appeared on the IQT Blog.
With the overwhelming pace of technological change, does the story of a tool matter as much as the tool itself? To explore this question, we are inviting writers, makers, and other creatives to help us get out of the technical weeds, see the bigger picture of emerging tech, and understand why it matters (or not) in our daily lives.
In this post, science writer Shannon Fischer talks with Mike Chadwick and Rob Caudill about the AI Sonobuoy project, a low-cost hydrophone that uses AI to process sound data collected at sea. If you are interested in collaborating with us on future projects or storytelling, get in touch at email@example.com.
It’s unusual for the engineers and data scientists of IQT Labs to deliberately drop their finely crafted electronics into large bodies of water. But for one of the Labs’ latest creations, the AI Sonobuoy, getting tossed out of a boat was exactly what was needed.
The AI Sonobuoy is IQT Labs’ latest exploration in edge technology. It builds on the work of the Teachable Camera, the lab’s 2020 video camera that uses on-device computer vision AI to learn to identify customized objects of interest, like cars or rabbits.
The core advance for both of these projects is what’s called Tiny Machine Learning, or TinyML: machine learning models shrunk down so far that they can run on physically tiny, inexpensive, low-power chips. This means that where traditional AI requires heavy computing and server capacity, plus the power to run it, TinyML processing can happen on sensors deployed in the field (the ‘edge’), running on inexpensive parts that anyone can order off the Internet.
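One of the tricks that shrinks models this far is quantization, which stores each weight in one byte instead of four. Here is a minimal, hypothetical sketch of that idea in plain Python; it is an illustration of the general technique, not IQT Labs’ actual toolchain.

```python
# Post-training quantization sketch: map float32 weights onto int8
# values using a single scale factor, the kind of step that lets a
# model fit in a microcontroller's tiny memory. Illustrative only.

def quantize_int8(weights):
    """Map float weights to integers in [-127, 127] with one scale."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127 if max_abs else 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(quantized, scale):
    """Recover approximate float weights from the int8 values."""
    return [q * scale for q in quantized]

weights = [0.02, -1.27, 0.64, 0.9]
quantized, scale = quantize_int8(weights)
# Each weight now needs one byte instead of four: roughly a 4x
# reduction in storage, at the cost of a little precision.
```

The recovered weights differ from the originals by at most half a quantization step, which is usually an acceptable trade for the memory savings.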
Mike Chadwick, Deputy Director of IQT Labs, calls TinyML a compelling shift in AI capability. “It really changes the types of things you might think to build once you realize this is so accessible,” he says. The Sonobuoy project, he explains, serves as a way for IQT to explore that new capability, demonstrating what is possible on this new frontier in AI.
For a couple of reasons, the maritime environment offered an intriguing platform on which to experiment with TinyML. First, boats are readily available, and their propellers make distinctive sounds in the water, which the IQT Labs team reasoned they could use as “wake words,” just as ‘Hey Siri’ rouses Apple’s AI. Second, activity in the marine setting is tracked by what’s called the Automatic Identification System, or AIS, a positional signal broadcast used by nearly all commercial and many recreational craft to prevent collisions, meaning all audio of boats could be cross-checked against official data confirming whether a craft was present.
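The cross-check itself can be as simple as comparing timestamps. A hedged sketch of that idea, with illustrative field names and a 5-minute window that are assumptions rather than the project’s actual schema:

```python
# Confirm an audio detection against AIS position reports by time:
# if any vessel broadcast a report close to the moment of the sound,
# the detection is treated as confirmed. Illustrative data layout.
from datetime import datetime, timedelta

def vessel_nearby(detection_time, ais_reports, window_minutes=5):
    """True if any AIS report falls within +/- window of the detection."""
    window = timedelta(minutes=window_minutes)
    return any(abs(report["time"] - detection_time) <= window
               for report in ais_reports)

reports = [
    {"mmsi": 367001234, "time": datetime(2022, 6, 1, 14, 3)},
    {"mmsi": 367005678, "time": datetime(2022, 6, 1, 9, 40)},
]
vessel_nearby(datetime(2022, 6, 1, 14, 0), reports)  # → True
```

In practice a real cross-check would also compare positions, but the timestamp match is the core of turning AIS broadcasts into ground truth for audio.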
Before they even started designing, the team did a dry run with a Makerbuoy, an open-source creation by Wayne Pavalko at the Johns Hopkins University Applied Physics Laboratory. Versions of the Makerbuoy have been successfully deployed for multiple years and thousands of miles; the IQT Labs’ Makerbuoy, however, thrown off a boat outside Miami, made it as far as Jacksonville before it disappeared, never to be seen or heard from again. “We learned that we’re pretty good at AI,” Chadwick says. “We are not as good at making things float.”
That failure, however, proved a valuable first lesson in what it means to work on the edge. Because devices may not be recoverable, the team needed to brainstorm every type of failure imaginable, which in this case meant anything from failing to waterproof the device adequately to a shark attack. By the time they were ready to engineer the AI Sonobuoy, Chadwick and his team had already thought through almost every way that things could go wrong with the device.
Ultimately, the project has resulted in two separate but related devices. The Detector is the sonobuoy part of the project, cheap and expendable if lost. The exterior consists of a length of PVC pipe, equipped with an underwater microphone (a hydrophone) that connects to an Adafruit microcontroller running the TinyML program (hardware listed at the end, with more technical information available here). When a noise occurs nearby, the microcontroller runs the audio through its ML model of boat noises. If it recognizes a boat sound, it alerts IQT, currently through a cell connection, though the final version will be satellite-enabled; sounds that aren’t from a boat are ignored. A small lithium battery powers the whole thing for two to three months.
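The Detector’s decision loop can be sketched in a few lines. In this hypothetical version, `classify()` stands in for the trained TinyML model and `send_alert()` for the cell or satellite uplink; both names, and the confidence threshold, are placeholders rather than the project’s real API.

```python
# Sketch of the Detector's alert-or-ignore decision loop.
# classify() fakes a model score from the chunk's average amplitude;
# the real device would run a trained boat-sound model here.

BOAT_THRESHOLD = 0.8  # assumed confidence cutoff, not the real value

def classify(audio_chunk):
    """Placeholder model: score a chunk by its mean absolute amplitude."""
    return min(1.0, sum(abs(s) for s in audio_chunk) / len(audio_chunk))

def send_alert(score):
    """Stand-in for the cell/satellite uplink."""
    print(f"boat detected (confidence {score:.2f})")

def process(audio_chunk):
    """Alert only on likely boat sounds; everything else is ignored."""
    score = classify(audio_chunk)
    if score >= BOAT_THRESHOLD:
        send_alert(score)
        return True
    return False  # not a boat: stay quiet and save power
```

Staying quiet on non-boat sounds is what makes the months-long battery life plausible: the expensive step, transmitting, only happens when the model is confident.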
The Collector consists of a plastic container that houses a Raspberry Pi, an AIS receiver, and a solar-charged battery (additional technical material here). It sits on a dock, hydrophone dangling into the water, recording audio and AIS data to upload to the cloud for labeling for the ML model. (Keeping the Collector safe and dry on a dock, incidentally, is a design decision made after the first incarnation of the Collector was positioned underwater. That Collector disappeared for 11 days in the Columbia River, before returning, full of water, in the hands of a kindly fisherman.)
All the hardware powering both devices is inexpensive, commercially available, hobby-grade material sourced from the broader tech community. Rob Caudill, Technical Director at IQT Labs who runs the Sonobuoy project, explains that this is part of a growing theme throughout the lab. Often, when there’s nothing available off-the-shelf that meets the requirements of a specific mission need, it means a long, expensive, multi-step project with specifications, developers, and so on. This newer approach fills the gap between bespoke and off-the-shelf. “I see it as an in-between solution,” Caudill says. “If I put three or four things together, I can fairly quickly develop a system that meets those mission needs.”
At this point, the AI Sonobuoy project is about six months old. All components have been tested individually, and now Caudill’s team is working to integrate them. The team has just embedded the TinyML model into their sonobuoy for the first time and are testing to see how well it actually works.
Meanwhile, a version of the Collector has spent the last two months on a dock collecting boat sounds and AIS locations. For the moment, all the steps to get the audio and AIS data from the Collector to the Detector are still manual: a human helps to upload data to the cloud, feed it into the model generator, and get it on the buoy. However, the team’s final vision for the Sonobuoy project includes fully automating that process, as IQT Labs successfully prototyped in an earlier visual AI project, SkyScan, which relies on an aircraft collision-avoidance broadcast analogous to the maritime AIS. In theory, this type of closed-loop AI could be applicable to any kind of data that can be externally validated, explains Caudill. “Eventually, you would have the collectors for the environment that you want to characterize, and then you would be able to deploy the detectors independently.”
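The automated-labeling step at the heart of that closed loop can be sketched simply: each clip the Collector records gets a label derived from whether AIS shows a vessel in range at the same time. The data structures below are illustrative assumptions, not the project’s real pipeline.

```python
# Auto-labeling sketch: turn raw Collector recordings into training
# data by using AIS as ground truth. Clips and AIS coverage are both
# keyed by an integer minute-of-day for simplicity.

def auto_label(clips, ais_minutes):
    """clips: list of (clip_id, minute) pairs.
    ais_minutes: set of minutes in which at least one AIS-broadcasting
    vessel was in range. Returns (clip_id, label) pairs."""
    return [(clip_id, "boat" if minute in ais_minutes else "no_boat")
            for clip_id, minute in clips]

clips = [("clip_001", 830), ("clip_002", 831), ("clip_003", 915)]
labeled = auto_label(clips, ais_minutes={831, 915})
# labeled → [("clip_001", "no_boat"), ("clip_002", "boat"),
#            ("clip_003", "boat")]
```

With a step like this in place, the human in the loop drops out: newly labeled clips can flow straight into retraining, and the updated model back onto the buoy.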
The ease of use and accessibility of automation and TinyML offer incredible new potential in AI applications. Other groups tinkering with similar technology have already applied it to agriculture for intelligent water sensing, to machinery for fault detection in wind turbines, and to public health measures that target mosquitoes.
But it’s also crucial to note that the tiny size and minimal power needs make this tech inherently less powerful than a traditional, computation-heavy AI model. The AI Sonobuoy, for instance, can detect the presence of a boat, but, at least for now, it cannot tell you what kind of boat. In other words, TinyML does not replace classic AI, but expands it.
Chadwick is excited about the future. “We’re just scratching the surface,” he says. “The potential is largely untapped right now. I think there’s going to be a lot of really interesting products coming out of the commercial industry over the next five years.”