It can be fun to swim with dolphins or explore the reefs to watch the fish, but when it comes down to it, more than 95 percent of the ocean is still unexplored. Much of this is because human beings can’t survive at the extreme depths of the ocean, so our exploration is limited to submersibles and robots — many of which scare away the local fish with their propellers.
To combat this, engineers have designed a new type of robot — one that looks, and swims, like a fish.
Meet SoFi — short for soft robotic fish. This unique little robot is 18 inches long, can swim for up to 45 minutes at depths of up to 60 feet, and can be controlled remotely by a diver using a tricked-out Super Nintendo controller.
What makes SoFi different from other underwater robots is how she swims. She doesn’t rely on propellers or impellers to move through the water. Instead, she swims like a fish, giving researchers a fish-eye view of how the underwater world actually works. SoFi is equipped with a hydrophone, a camera, and a battery that lasts up to 45 minutes on a single charge, along with her operating and communications systems.
In another unique design choice, the electronics compartment — which would normally be filled with air to protect the equipment — is filled with nonconductive oil. This allows the fish to maintain internal pressure at depth and protects the electronic components in the event of a breach.
Sound Wave Communication
It can be difficult to see underwater, so some creatures – primarily cetaceans – communicate via sound waves. Whales sing to one another, and dolphins use clicking echolocation to find prey and each other in the dim underwater environment. Most underwater robots have to be tethered to their controllers because radio waves and Wi-Fi signals don’t transmit well underwater.
Instead of being tethered to a controller, SoFi receives commands via sound waves. The waves are at a frequency imperceptible to fish, so they don’t scare off the very creatures the scientists are trying to study. SoFi’s team suspects the signals could be heard by cetaceans, but the researchers haven’t encountered any in the wild yet, so that theory remains untested. The team created their own unique code to convey simple commands to SoFi remotely, even over distances of 60 to 70 feet.
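The article doesn’t spell out the team’s custom code, but the basic idea — mapping a handful of swim commands to distinct ultrasonic pulse patterns — can be sketched roughly like this. Everything here is an illustrative assumption (the command names, the codebook, and the 36 kHz carrier), not SoFi’s actual protocol:

```python
# Illustrative sketch only: a tiny codebook mapping swim commands to
# pulse patterns sent on an ultrasonic carrier. All values are assumed
# for illustration and are not SoFi's real communication scheme.

CARRIER_HZ = 36_000  # assumed carrier, chosen above what most fish can hear

# Hypothetical codebook: each command is a unique sequence of pulse
# durations in milliseconds.
CODEBOOK = {
    "forward": [10, 10],
    "left":    [10, 20],
    "right":   [20, 10],
    "dive":    [20, 20],
    "surface": [10, 30],
}

def encode(command: str) -> list:
    """Look up the pulse pattern for a command (KeyError if unknown)."""
    return CODEBOOK[command]

def decode(pulses: list) -> str:
    """Match a received pulse pattern back to its command name."""
    for name, pattern in CODEBOOK.items():
        if pattern == pulses:
            return name
    raise ValueError("unrecognized pulse pattern")
```

The appeal of a scheme like this is that the receiver only has to time pulse gaps, which is far more robust in noisy water than trying to push a full radio-style data stream through an acoustic channel.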
Another problem with traditional robots is that they use clunky actuators to move, and those actuators make noise that scares away local marine life. Instead of traditional actuators, SoFi’s soft tail contains two hollow chambers that cycle water back and forth between them. Not only does this help even out the robot’s buoyancy and keep it from drifting to the surface, but it also allows for a soft, natural swimming motion that mimics the fish around it.
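As a rough picture of that pumping cycle, here is a toy simulation in which a pump shuttles a fixed volume of water between the two tail chambers and the tail’s bend follows the volume imbalance. The volumes, the 0.3-degrees-per-milliliter gain, and the sign convention are all assumed for illustration; this is not SoFi’s control code:

```python
# Toy simulation of a two-chamber hydraulic tail (assumed parameters,
# not SoFi's real controller).

TOTAL_ML = 100.0  # assumed total water volume shared by the two chambers

def tail_angle(left_ml: float) -> float:
    """Tail deflection in degrees, proportional to the volume imbalance.
    Positive means bent toward the left-chamber side (assumed convention);
    0.3 deg/mL is an assumed gain."""
    right_ml = TOTAL_ML - left_ml
    return 0.3 * (left_ml - right_ml)

def swim_cycle(steps: int, pump_ml: float = 10.0) -> list:
    """Pump water back and forth between the chambers, reversing at the
    limits, and record the tail angle at each step."""
    left = TOTAL_ML / 2  # start balanced: tail straight
    direction = 1        # +1 pumps toward the left chamber, -1 toward the right
    angles = []
    for _ in range(steps):
        left += direction * pump_ml
        if left >= TOTAL_ML or left <= 0.0:
            left = max(0.0, min(TOTAL_ML, left))
            direction *= -1  # chamber full/empty: reverse the pump
        angles.append(tail_angle(left))
    return angles
```

Run over enough steps, the angle sweeps smoothly from one extreme to the other and back — the side-to-side oscillation that propels a real fish — with no spinning parts to make noise.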
One concern with this more natural type of movement is that it could attract predators — but that isn’t a problem for the minds behind SoFi. According to Robert Katzschmann, the MIT graduate student who leads SoFi’s team, “If a shark would have come and ate our fish, that would have been the most amazing footage.” It also would have been a testament to their creation — that they managed to make a robotic fish so realistic, a shark or other oceanic predator thought it might make a good dinner. It wouldn’t have turned out so well for the shark, though – SoFi might look like a fish, but she definitely doesn’t taste like one!
With so much of the ocean still unexplored, SoFi might be the first in a long line of robotic fish that give us a closer look at, and a better understanding of, the underwater world that covers so much of our planet’s surface. We may not be able to explore the depths ourselves, but SoFi and devices like her can take us there more easily than ever before. Let’s just hope she doesn’t end up as shark bait.
Featured image credit: Joseph DelPreto, MIT CSAIL
Megan Ray Nichols