Research & Innovation

These Underwater Robots Are Mapping Where You Don’t Want To

Stevens faculty-student team develops hardware innovations and AI algorithms to improve underwater mapping and monitoring

A mucky river bottom. The underside of an offshore fish farm or wind turbine. Rocks, reefs and shipwrecks in a harbor. Oil platforms. Those places are hard — even dangerous, sometimes — for humans to navigate in, look at and map.

Robotics technology has moved forward in leaps and bounds, of course. Robots now do some of this work for us. But it’s still very hard to see down there, where even a good camera with a light might only see a few feet at a time. It’s also extremely easy for an underwater drone to get lost or snagged.

Mechanical engineering professor Brendan Englot demonstrates his AI-driven underwater mapping robotics system

Now Stevens robotics and AI expert Brendan Englot and his team think they’ve found a better way.

“We’ve built a one-of-a-kind ROV that has a very unique set of sensors that allow it to have better sonar-based perception and mapping capability than any other similar ROV that exists right now,” explains Englot, who also directs the Stevens Institute for Artificial Intelligence (SIAI).

“We’re taking off-the-shelf components and putting them together in a new way that enables the ROV to do things it hasn’t been able to do before.”

Testing in wave tanks and harbors

Working with doctoral candidates Ivana Collado-Gonzalez and Paul Szenher and recent graduate John McConnell Ph.D. ‘23 (now teaching at the U.S. Naval Academy), Englot’s group is attempting to make autonomous underwater ROVs (remotely operated vehicles) much smarter and much better equipped to do jobs like inspection and maintenance.

“Eventually, the end goal is that we would need the robot to know the locations of itself and all the objects in its environment,” notes Englot.

“To do that, it needs to know its exact location, with respect to the environment, at all times; keep track of that location with very high accuracy; and be able to plan and modify its own path in real time, avoiding collision with objects.”
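Those three requirements (knowing its own location, tracking it accurately, and replanning around obstacles in real time) can be caricatured in a short, purely illustrative sketch. The function name `safe_step`, the step size, and the safety radius below are invented for this example and are not part of the Stevens system:

```python
import math

def safe_step(pose, goal, obstacles, step=0.5, safety_radius=1.0):
    """Take one step toward the goal along the straight line, but only if the
    candidate position keeps a safe clearance from every known obstacle."""
    d = math.dist(pose, goal)
    if d < step:
        return goal
    # Move a fixed step along the unit direction toward the goal.
    candidate = tuple(p + step * (g - p) / d for p, g in zip(pose, goal))
    if all(math.dist(candidate, ob) > safety_radius for ob in obstacles):
        return candidate
    return pose  # hold position; a real planner would search for a detour
```

A real underwater planner works against a continuously updated pose estimate and map rather than known obstacle coordinates, but the perceive-check-move cycle is the same.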

Stevens students John McConnell Ph.D. ‘23 and Ivana Collado-Gonzalez (right) consult data during a 2023 ROV field test

The team has set to work figuring out how to help robots along on that path. Their innovations have come both in hardware, pairing two multibeam sonar devices oriented at 90-degree angles to each other, and in novel algorithms created specifically to let an underwater robot combine the two sonar data streams in real time, figuring out where it is and mapping shapes and surfaces more accurately than ever before.

As the robot moves, one sonar captures horizontal range and bearing information while the other captures vertical data including elevation. Stevens-developed SLAM (simultaneous localization and mapping) algorithms combine the information into a single, nuanced point cloud that can closely map the shape of a river or sea bottom, a ship, a plane, a rock.
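The geometry behind that fusion can be sketched in a few lines. This is a simplified, hypothetical back-projection, assuming each sonar reports (range, angle) pairs in its own scan plane; the team's actual SLAM pipeline also registers every scan against the vehicle's evolving pose estimate before merging:

```python
import math

def horizontal_return_to_point(r, bearing):
    """Back-project a horizontal-sonar return (range, bearing) into the
    vehicle frame; this sonar cannot resolve elevation, so z stays 0."""
    return (r * math.cos(bearing), r * math.sin(bearing), 0.0)

def vertical_return_to_point(r, elevation):
    """Back-project a vertical-sonar return (range, elevation); this sonar
    sweeps the vertical plane ahead of the vehicle, so y stays 0."""
    return (r * math.cos(elevation), 0.0, r * math.sin(elevation))

def fuse_scans(horizontal_scan, vertical_scan):
    """Merge both back-projected scans into one point cloud (vehicle frame)."""
    cloud = [horizontal_return_to_point(r, b) for r, b in horizontal_scan]
    cloud += [vertical_return_to_point(r, e) for r, e in vertical_scan]
    return cloud
```

Because the two sonars are mounted at 90 degrees, each one supplies the dimension the other cannot see, which is why the combined cloud can capture full 3D shape.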

To develop the system, the group has mostly tested its modified ROV on campus in the historic Davidson Lab wave tank, dropping an oddly shaped and textured object into the tank and then seeing if the robot could sense and map it.

The system has also been tested in the field in three regional marinas, each posing different challenges and objects, adds Englot.

More recently, the group has worked on bringing additional accuracy to the mapping system by interpolating and evaluating “submapping” steps between individual keyframes, which are essentially image snapshots.

Those intermediate steps can heighten accuracy, the team found — particularly in complex environments where many different shapes exist in the underwater environment being mapped.
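As a rough illustration of the submapping idea, intermediate poses can be interpolated between two keyframe poses so that scans collected between snapshots still land in the right place in the map. This sketch uses simple linear interpolation over hypothetical (x, y, z) poses and is not the team's published method, which would also handle orientation:

```python
def interpolate_pose(pose_a, pose_b, t):
    """Linearly interpolate position between two keyframe poses (t in [0, 1]).
    Heading would need angular interpolation in a full implementation."""
    return tuple(a + t * (b - a) for a, b in zip(pose_a, pose_b))

def submap_poses(keyframe_a, keyframe_b, n_steps):
    """Generate intermediate 'submapping' poses between consecutive keyframes
    so scans taken between snapshots can still be registered into the map."""
    return [interpolate_pose(keyframe_a, keyframe_b, i / n_steps)
            for i in range(1, n_steps)]
```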

The American Society of Mechanical Engineers (ASME) was so impressed with the AI that it recently filmed the team working with the modified ROV on campus in the Davidson Lab tank.

Keeping fish farms, wind farms running

What’s the end goal? The key sponsor for Stevens’ undersea-mapping project is the U.S. Department of Agriculture’s National Institute of Food and Agriculture (NIFA). NIFA is funding the work with an eye toward helping industry develop additional offshore aquaculture farms in the U.S.

An autonomous robot packed with this sort of software, Englot theorizes, could serve as a sort of 24/7 patrol for such operations, routinely cleaning a farm array and also monitoring for damage.

“Wind farms are another possible application of this inspection technology,” he points out. “The sector is growing rapidly and the open ocean can be a difficult environment for humans to move around in and inspect or maintain structures in.”

The National Science Foundation, Office of Naval Research and private firm Schlumberger also sponsored portions of this work, and the Stevens team recently submitted a journal paper describing its innovations to the IEEE Journal of Oceanic Engineering.