Swim like a fish, sense like a robot

Researchers in the Penn State Department of Mechanical Engineering have been awarded $426,285 from the United States Army Research Office (ARO) to probe how fish-inspired robots can sense and perceive underwater environments.

“Swimming might seem very simple in the animal world, but the movements and sensing are extremely complex,” said Bo Cheng, the Kenneth K. and Olivia J. Kuo Early Career Professor of Mechanical Engineering. “We want to find out how fish are able to use the fluid dynamics that surround them in the water not only for swimming, but also for perception.”

These findings have the potential to enhance biologically inspired robots that can glean more information in shallow, murky underwater environments where visual and sonar systems are difficult to use.

“The idea is inspired by what animals can do,” Cheng said. “Fish and other aquatic animals are able to sense the fluid around them and tell what’s going on. In the long run, we want to develop a very systematic framework to mimic how they sense changes and control their motion efficiently to move within their fluid environment.”

Along with Asok Ray, distinguished professor of mechanical engineering and mathematics, Cheng is leading the three-year project.

“Essentially, we will use sensors around the fins and bodies of swimming robots, and these sensors will measure the pressure and the forces acting upon the surfaces of these robots,” Cheng said. “The key scientific question will be: with these measurements, how can we extract information about the fluid environment?”

For instance, fish can perceive the smallest changes in pressure, such as a fisherman casting a line on the surface of the water.

“Fish are able to detect these small vibrations or any changes in the fluid state very quickly and react,” Cheng said. “That is something we want to achieve with robots.”

To gather this information, the team will combine two machine learning systems: a convolutional neural network (CNN) and a long short-term memory (LSTM) network. By coupling these deep learning techniques, the researchers will be able to extract both spatial and temporal information from the sensor measurements, detecting other objects while simultaneously controlling the motion of the robot.
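The release does not specify the network details, but a minimal sketch can illustrate how the two pieces might fit together: a CNN encodes the spatial pattern of pressure readings across the body and fin sensors at each instant, and an LSTM integrates those encoded frames over time. Everything here, including the FlowSensingNet name, the layer sizes, the sensor count, and the output classes, is an illustrative assumption rather than the team's actual model.

```python
import torch
import torch.nn as nn

class FlowSensingNet(nn.Module):
    """Hypothetical CNN + LSTM pipeline for pressure-array time series."""
    def __init__(self, n_sensors=32, n_classes=4, hidden=64):
        super().__init__()
        # Spatial encoder: 1-D convolution over the sensor array at one instant.
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(8),  # 32 channels x 8 positions
            nn.Flatten(),             # -> 256 features per time step
        )
        # Temporal model: LSTM over the sequence of encoded frames.
        self.lstm = nn.LSTM(input_size=256, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)  # e.g., flow-state classes

    def forward(self, x):
        # x: (batch, time, n_sensors) raw pressure readings
        b, t, s = x.shape
        frames = self.encoder(x.reshape(b * t, 1, s)).reshape(b, t, -1)
        out, _ = self.lstm(frames)
        return self.head(out[:, -1])  # predict from the final time step

# Example: a batch of 8 clips, 100 time steps, 32 pressure sensors.
logits = FlowSensingNet()(torch.randn(8, 100, 32))
print(logits.shape)  # torch.Size([8, 4])
```

Predicting from the final LSTM state is the simplest choice for classification; a per-time-step output head would be the more natural fit for the closed-loop motion control the release describes.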
“We are planning to collaborate with the corporate research center of General Electric on this project. This collaborative research will focus on a synergistic combination of different aspects of artificial intelligence and machine learning in the setting of neural networks (NN),” Ray said. “A long-term goal is to develop NN-based methodologies for solving nonlinear partial differential equations to provide faster and more accurate solutions as compared to the state-of-the-art numerical techniques.”

The research will also probe whether fish use specific movements to improve their sensing capabilities.

“For instance, we want to know: if the fish does something like flap its tail, does that self-generated motion actually enhance its sensing capability or does it decrease it?” Cheng said.

With funding from the ARO, these insights into nature have the potential to one day benefit military operations in shallow-water environments, such as search and rescue missions.

“Dr. Cheng’s proposed research provides a novel method for extracting information from the fluid flow that could enable robotic systems to work effectively in murky environments where current optical and acoustic sensors are not adequate,” said MaryAnne Fields, program manager at ARO, an element of the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory. “This work will benefit not only the underwater applications that Dr. Cheng describes, but it may enable aerial robots to maneuver in complex, poorly lit environments by using pressure differences to sense and react to nearby entities.”

Looking ahead, the Penn State team hopes that developing fundamental knowledge in this area will guide and inform the technology for effective underwater robots.

“But before we can replicate it, we need to understand it,” Cheng said.