Yale Sonar Robot Modeled After Bat and Dolphin Echolocation Behavior

Source: Yale University--Office of Public Affairs

New Haven, CT -- A robot inspired by the ability of bats and dolphins to use echoes for locating prey is causing robotics experts to reevaluate the relative merits of sound waves versus camera vision for exploring new environments. The sonar device, which was designed and created by Yale University electrical engineering professor Roman Kuc, is so sensitive that it can tell whether a tossed coin has come up heads or tails.

"In the early days of robot design, primitive navigational sonars were often used to locate objects, but cameras were needed to identify them," says Professor Kuc, who has designed mobile robots and navigating wheelchairs equipped with ultrasound sensors during 10 years of robotics research. "In recent years, scientists have virtually abandoned sonar detection in favor of camera vision for robots, but we decided to take a closer look at how echolocation is used in nature to see if we might be missing something."

Advances in camera-vision research have reached a plateau because scientists have encountered formidable obstacles in duplicating the incredible power and subtlety of human vision, Professor Kuc says. Yale's design for sonar detection, on the other hand, could prove easier and less costly than camera vision for identifying an authorized customer at an automated teller machine, detecting production flaws on an assembly line, or helping someone who is paralyzed interact with a computer.

Called Rodolph -- short for robotic dolphin -- Yale's robot is equipped with three Polaroid electrostatic transducers that can act either as transmitters or receivers to serve as the robot's "mouth" and "ears." The transducers are similar to those used in Polaroid autofocus cameras to gauge an object's range, and in acoustic digital tape measures that use echoes to measure distances.

Attached to the end of a robotic arm, the transducer in the center emits sound waves that bounce off objects, much like the high-pitched squeals of bats and the clicking sounds of dolphins. The robot's mouth is flanked by two rotating transducer ears that act as receivers for detecting echoes. The design is inspired by bats, whose ears rotate toward the source of an echo, and by dolphins, which appear to move around in order to place an object at a standard distance, Professor Kuc explains in the cover article of the August issue of the Journal of the Acoustical Society of America.
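The range measurement behind such transducers is a simple time-of-flight computation: the echo's round-trip delay, multiplied by the speed of sound in air and halved, gives the distance to the object. The sketch below, written in C++ (the language Professor Kuc reports using for processing), illustrates the arithmetic; the delay value is made up for the example.

#include <iostream>

// Speed of sound in air at room temperature, in meters per second.
const double SPEED_OF_SOUND = 343.0;

// Convert a round-trip echo delay into a range estimate: the pulse
// travels out and back, so the one-way distance is half the path.
double rangeFromEchoDelay(double delaySeconds)
{
    return SPEED_OF_SOUND * delaySeconds / 2.0;
}

int main()
{
    // An echo arriving about 0.875 ms after the pulse corresponds
    // to an object roughly 15 cm away -- the robot's standard
    // working distance.
    double delay = 0.000875;  // seconds (illustrative value)
    std::cout << "Estimated range: " << rangeFromEchoDelay(delay)
              << " m" << std::endl;
    return 0;
}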

The robot's bobbing head with its twitching ears has an eerie, animal-like quality as it scans the environment in Professor Kuc's laboratory, scrutinizing coins and distinguishing between various sizes of rubber O-rings and ball bearings. A human hand inadvertently passing through its sonar field sets off scanning motions reminiscent of an inquisitive cat.

"The robot exploits the important biological principle of sensor mobility to place an object at a constant distance, thus reducing the complexity of object recognition," Professor Kuc says, adding that the rotating ears also help pinpoint and amplify the sound. "Then the robot can either learn a new object by adding the echoes to its memory, or identify an old object already in its data base."

Controlled by a Pentium 120 processor in a personal computer, the robot emits ultrasound pulses at 60 kilohertz as often as 10 times a second. The ears rotate separately on the moving arm, helping to position the sonar 15 centimeters from the object, plus or minus 0.1 millimeter.
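Holding an object at 15 centimeters to within 0.1 millimeter implies a feedback loop: ping, compare the measured range to the target, step the arm, and repeat. The following C++ sketch simulates that loop with stand-in functions for the arm and the transducer, since the release does not describe the robot's actual hardware interface.

#include <cmath>
#include <cstdio>

// Target standoff and tolerance reported for Rodolph: 15 cm +/- 0.1 mm.
const double TARGET_RANGE_M = 0.15;
const double TOLERANCE_M    = 0.0001;

// Simulated stand-ins for the real hardware, which times 60 kHz
// echoes from Polaroid transducers mounted on a robotic arm.
double g_armOffsetM = 0.0;              // how far the arm has advanced
const double g_objectDistanceM = 0.23;  // true object distance (simulated)

double pingRange()                      // one sonar ping: measured range
{
    return g_objectDistanceM - g_armOffsetM;
}

void moveArm(double meters)             // advance (+) or retract (-) the arm
{
    g_armOffsetM += meters;
}

int main()
{
    // Close the loop: step the arm until the object sits at the
    // standard 15 cm distance, as the robot does before taking its
    // single identification reading.
    int pings = 0;
    while (true) {
        double error = pingRange() - TARGET_RANGE_M;
        ++pings;
        if (std::fabs(error) <= TOLERANCE_M)
            break;
        moveArm(error * 0.5);           // proportional step toward target
    }
    std::printf("Positioned after %d pings, range %.5f m\n",
                pings, pingRange());
    return 0;
}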

Previous sonar robots required a far larger number of sonar readings from different angles and distances for object identification, but Professor Kuc's robot requires only a single reading because it can move around and scan until it arrives at a predetermined distance from the object.

The data the robot gathers during the learning process are logarithmically compressed to emphasize slight structural differences, then reduced to vectors of 32 features. Each object is represented by approximately 50 such vectors, which form clusters, each cluster corresponding to a particular viewing angle, says Professor Kuc, who uses the C++ programming language for the processing.
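A rough picture of that feature-extraction step, based only on the details given here: log-compress the echo samples, then reduce them to 32 features. The binning scheme below, which averages log magnitudes over 32 equal segments of the waveform, is a guess at the reduction; the published method may differ.

#include <cmath>
#include <cstdio>
#include <vector>

const double PI = 3.14159265358979;

// Reduce a raw echo waveform to a 32-element feature vector.
std::vector<double> echoToFeatures(const std::vector<double>& echo)
{
    const std::size_t kFeatures = 32;
    std::vector<double> features(kFeatures, 0.0);
    const std::size_t binSize = echo.size() / kFeatures;

    for (std::size_t i = 0; i < kFeatures; ++i) {
        double sum = 0.0;
        for (std::size_t j = 0; j < binSize; ++j) {
            double mag = std::fabs(echo[i * binSize + j]);
            // Logarithmic compression emphasizes slight structural
            // differences; the small offset avoids log(0).
            sum += std::log10(mag + 1e-6);
        }
        features[i] = sum / binSize;
    }
    return features;
}

int main()
{
    // Synthetic echo: a decaying 60 kHz tone sampled at 1 MHz.
    std::vector<double> echo(1024);
    for (std::size_t n = 0; n < echo.size(); ++n)
        echo[n] = std::exp(-0.005 * n) * std::sin(2.0 * PI * 0.06 * n);

    std::vector<double> f = echoToFeatures(echo);
    std::printf("First three features: %.3f %.3f %.3f\n", f[0], f[1], f[2]);
    return 0;
}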

The next step is to mount the stationary robotic arm on a mobile base to enable Rodolph to explore its environment, says Professor Kuc, whose research is supported by the National Science Foundation.
