News | August 13, 2025

New Robotic Agricultural Sensor Could Revolutionize Farming

Farmers might be able to get help tending and harvesting crops using a new sensing technology from Carnegie Mellon University’s Robotics Institute (RI).

Researchers invented a tool called SonicBoom that can locate crops like apples from the sound produced when the sensor touches them. The novel technology, still in the early stages of development, may someday be used by farm robots for tasks like pruning vines or locating ripe apples hidden among the leaves.

“Even without a camera, this sensing technology could determine the 3D shape of things just by touching,” said Moonyoung (Mark) Lee, a fifth-year Ph.D. student in robotics.

The device might be the answer to a manipulation problem that has long befuddled agricultural robotics researchers. Farm workers can simply thrust their hands through the leaves toward what looks like an apple and use their sense of touch to grasp the fruit. But robots depend solely on cameras to guide their arms and manipulators, said Lee.

“One of the reasons manipulation in an agricultural setting is so difficult is because you have so much clutter — leaves hanging everywhere — and that blocks a lot of visual inputs,” Lee said. In an orchard, “the fruit itself can be partially occluded and the path the arm must take to reach it can be very occluded.”

More durable, cheaper technology
SonicBoom solves a problem that existing farming robots face — delicate and cumbersome sensors. Tiny, camera-based tactile sensors, encased in protective gel, can quickly wear out or suffer damage when in frequent contact with plants. Pressure sensors, another current option, have to be applied to large areas of the robot arm, making the approach impractically expensive.

By contrast, SonicBoom relies on contact microphones, which sense audio vibrations when they are in contact with an object rather than through the air like a conventional microphone.

Contact microphones aren’t top-of-mind for most robotics researchers, Lee said, but his adviser, RI Associate Professor Oliver Kroemer, used the devices to perform classification tasks, such as identifying the properties of materials.

How it works
The research team used an array of six contact microphones placed inside a piece of PVC pipe. When the pipe touches an object, such as a tree branch, the microphones detect the resulting vibration. By analyzing the differences in the sound waves, the researchers were able to triangulate where the contact took place. SonicBoom can localize contacts with a precision between 0.43 and 2.2 centimeters.
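The idea of localizing a contact from differences between microphone signals can be illustrated with a simplified two-microphone sketch. This is not SonicBoom's actual method, which uses six microphones and a learned model; the wave speed and geometry below are invented for illustration. It cross-correlates two recordings to find the time difference of arrival of the vibration, then converts that delay into a distance along the pipe.

```python
import numpy as np

def localize_contact_1d(sig_a, sig_b, mic_gap_m, wave_speed_mps, fs_hz):
    """Estimate where a pipe was struck from two contact-mic recordings.

    Cross-correlates the signals to find the time difference of arrival
    (TDOA), then converts that delay into a distance from microphone A.
    Illustrative only: SonicBoom itself uses six mics and a learned model.
    """
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = int(np.argmax(corr)) - (len(sig_b) - 1)  # samples A lags B
    tdoa_s = lag / fs_hz
    # For a contact between the mics: d_a + d_b = gap, d_a - d_b = v * tdoa
    return 0.5 * (mic_gap_m + wave_speed_mps * tdoa_s)

# Synthetic tap: a short windowed sine burst reaching mic B 10 samples later.
fs = 48_000
burst = np.hanning(64) * np.sin(2 * np.pi * 2_000 * np.arange(64) / fs)
sig_a, sig_b = np.zeros(1024), np.zeros(1024)
sig_a[100:164] = burst
sig_b[110:174] = burst
# With an assumed 480 m/s wave speed and mics 30 cm apart,
# this delay corresponds to a tap 10 cm from mic A.
d_a = localize_contact_1d(sig_a, sig_b, mic_gap_m=0.3,
                          wave_speed_mps=480.0, fs_hz=fs)
```

With more microphones, the same principle over-determines the contact point, which is part of why an array of six yields centimeter-scale precision.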

The PVC pipe protects the contact microphones from damage. It also gives the appearance of a microphone boom, inspiring the name SonicBoom. Ultimately, the microphones could be installed inside a robot arm.

The researchers trained a data-driven machine learning model to map the signals from the microphones to contact locations. To train it, they collected audio data from 18,000 contacts between the sensor and a wooden rod.
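That collect-then-regress workflow can be pictured with a toy example. The sketch below is not the paper's model: it invents a simple decay-based forward model for six microphone amplitudes along the pipe, generates many labeled "taps," and fits an ordinary least-squares map from log-amplitude features to contact position.

```python
import numpy as np

rng = np.random.default_rng(0)

# Six contact mics along the pipe (illustrative geometry, in meters).
mic_pos = np.linspace(0.05, 0.55, 6)

def log_amplitudes(x, noise=0.0):
    """Toy forward model: vibration amplitude decays with distance from
    the contact point; returns one log-amplitude feature per mic."""
    amp = np.exp(-np.abs(x[:, None] - mic_pos[None, :]) / 0.15)
    amp *= 1.0 + noise * rng.standard_normal(amp.shape)
    return np.log(amp)

# Mimic the data collection: thousands of taps at known positions.
x_train = rng.uniform(0.0, 0.6, 18_000)
X = np.hstack([log_amplitudes(x_train, noise=0.02), np.ones((18_000, 1))])
w, *_ = np.linalg.lstsq(X, x_train, rcond=None)

# Predict contact positions for three held-out taps.
x_test = np.array([0.12, 0.30, 0.48])
pred = np.hstack([log_amplitudes(x_test), np.ones((3, 1))]) @ w
err_m = np.abs(pred - x_test)
```

The real system learns from measured audio rather than a hand-written decay model, but the structure is the same: known contact positions supervise a model that inverts microphone signals back into locations.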

Using the audio data, SonicBoom determines the location of hard or rigid objects. Changing its configuration should enable it to also sense less rigid objects, such as soft fruits and vegetables, Lee said. He has also led subsequent research that explores the array's ability to identify the object, not just its location.

Though SonicBoom was developed for agricultural use, Lee can imagine other applications, such as a safety sensor for robots working near people or for robots explicitly designed to interact with humans. It could also be used in dark environments where cameras struggle.

In addition to Lee and Kroemer, the research team included Ph.D. student Uksang Yoo and RI faculty members Jean Oh, Jeffrey Ichnowski and George Kantor.

A report explaining SonicBoom appeared in the July issue of IEEE Robotics and Automation Letters. The research was supported by the National Science Foundation and the U.S. Department of Agriculture.

Source: Carnegie Mellon University