A new autonomous harvesting robot developed by Australia’s Monash University is capable of identifying, picking and dropping apples in seven seconds.
In extensive trials in February and March, the robot harvested more than 85% of all reachable apples in the canopy that were identified by its vision system.
Of all apples harvested, fewer than 6% were damaged during stem removal.
With the robot limited to half its maximum speed, the median harvest rate was 12.6 seconds per apple, the researchers say.
In streamlined pick-and-drop scenarios, the cycle time fell to roughly nine seconds.
Robotic harvesting of fruit and vegetables requires a vision system to detect and localise the produce.
To increase the success rate and reduce produce damage during harvesting, information on the fruit's shape and on the location and orientation of the stem-branch joint is also required.
To address these challenges, the researchers created a state-of-the-art motion-planning algorithm to reduce harvesting time and maximise the number of apples that can be harvested from a single location.
The robot’s vision system can identify more than 90% of the apples visible within the camera’s view from a distance of approximately 1.2m.
The system can work in all types of lighting and weather conditions, including intense sunlight and rain, and takes less than 200 milliseconds to process the image of an apple.
The robot grasps apples with a specially designed, pneumatically powered soft gripper. Its four independently actuated fingers and integrated suction system grasp and extract apples efficiently while minimising damage to both the fruit and the tree.
In addition, the suction system draws the apple from the canopy into the gripper, reducing how far the gripper must reach into the canopy and so limiting damage to the surrounding foliage and branches.
The gripper successfully extracted more than 85% of the apples in the canopy that were planned for harvesting.