Predicting Papaya Ripeness with Computer Vision Algorithm

From IEEE Spectrum via ScienceDirect:

Researchers at the University of Campinas teamed up with computer scientists from Londrina State University in Londrina, Brazil, to develop a machine learning approach that achieved an overall ripeness detection accuracy of 94.7 percent. Their work appears in the February issue of the journal Computers and Electronics in Agriculture.

Measuring ripeness—and identifying relevant features for ripeness—was one of the biggest challenges. The researchers started out with a government guidance chart that listed five levels of papaya ripeness, but they soon consolidated these into three maturity levels based on visual inspection: the outer peel of golden papayas starts out green and yellows as the fruit ripens. They then verified the three levels with additional testing of each fruit's pulp firmness.
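The consolidation step above can be sketched in a few lines of Python. The exact stage-to-level mapping below is an illustrative assumption, not the one published by the authors:

```python
# Hedged sketch: map a hypothetical five-stage government ripeness chart
# (1 = fully green peel, 5 = fully yellow) onto three maturity levels,
# as the researchers did before verifying with pulp-firmness tests.
def consolidate_stage(stage: int) -> str:
    """Collapse five ripeness stages into three maturity levels."""
    if stage <= 2:
        return "unripe"        # peel still mostly green
    elif stage == 3:
        return "intermediate"  # peel turning yellow
    else:
        return "ripe"          # peel mostly to fully yellow

print([consolidate_stage(s) for s in range(1, 6)])
# → ['unripe', 'unripe', 'intermediate', 'ripe', 'ripe']
```

A firmness measurement (e.g. a penetrometer reading per fruit) could then confirm each visual assignment, analogous to the pulp-firmness verification described above.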

Training the machine learning algorithm also proved an unexpected challenge: it required a diverse selection of papayas. The researchers had hoped to get a large number of papayas from a local producer but eventually found themselves buying 57 golden papayas at a local market in Campinas.

Both the hardware and software components of the project proved relatively straightforward. On the hardware side, researchers built a boxy contraption with a consumer digital camera and light bulbs positioned on the ceiling to take illuminated pictures of the papaya samples. Success with such consumer-grade technology means this approach could be adapted fairly readily to commercial applications.

On the software side, the researchers considered a number of different machine learning algorithms before settling on the common random forest classifier. This approach let them see clearly how different papaya features factored into the algorithm's results. "We could see which features are really providing useful information about the fruit," Barbin explains.
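This interpretability benefit is easy to demonstrate with scikit-learn's random forest. The sketch below is not the authors' pipeline; the feature names and toy values are stand-in assumptions for the color and texture measurements that would be extracted from the papaya images:

```python
# Minimal sketch of a random forest reporting per-feature importance.
# Features and data are hypothetical, not from the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical features per fruit: [mean_green, mean_yellow, peel_texture]
X = np.array([
    [0.9, 0.1, 0.2],   # unripe (mostly green peel)
    [0.8, 0.2, 0.3],
    [0.5, 0.5, 0.5],   # intermediate
    [0.4, 0.6, 0.5],
    [0.1, 0.9, 0.7],   # ripe (mostly yellow peel)
    [0.2, 0.8, 0.8],
])
y = np.array([0, 0, 1, 1, 2, 2])  # three maturity levels

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)

# Unlike a deep neural network, the trained forest reports which
# features carried the most information for the classification:
for name, importance in zip(
        ["mean_green", "mean_yellow", "peel_texture"],
        clf.feature_importances_):
    print(f"{name}: {importance:.2f}")
```

The `feature_importances_` attribute is what makes the random forest attractive here: it quantifies how much each input contributed across all the trees, which a black-box deep network does not expose directly.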

A deep learning approach based on neural networks might also have yielded good results for visually identifying ripe papayas. But the Londrina State University colleagues were wary of the black-box nature of deep learning algorithms, which makes it extremely difficult to figure out how such models arrive at any given result. Furthermore, a deep learning approach would likely have required a far larger sample of papayas in the training dataset to achieve reasonable accuracy.

Read more from IEEE Spectrum and ScienceDirect


Make a robot friend with Adafruit’s CRICKIT – A Creative Robotics & Interactive Construction Kit. It’s an add-on to our popular Circuit Playground Express, FEATHER and other platforms to make and program robots with CircuitPython, MakeCode, and Arduino. Start controlling motors, servos, solenoids. You also get signal pins, capacitive touch sensors, a NeoPixel driver and amplified speaker output. It complements & extends your boards so you can still use all the goodies on the microcontroller, now you have a robotics playground as well.

Join 7,500+ makers on Adafruit’s Discord channels and be part of the community! http://adafru.it/discord

CircuitPython in 2018 – Python on Microcontrollers is here!

Have an amazing project to share? Join the SHOW-AND-TELL every Wednesday night at 7:30pm ET on Google+ Hangouts.

Join us every Wednesday night at 8pm ET for Ask an Engineer!

Follow Adafruit on Instagram for top secret new products, behind the scenes and more https://www.instagram.com/adafruit/


Maker Business — Fewer startups, and other collateral damage from the 2018 tariffs

Wearables — Light as a Worbla feather

Electronics — How to make your own magnetic field probe!

Biohacking — The State of DNA Analysis in Three Mindmaps

Python for Microcontrollers — One year of CircuitPython weeklies!

Get the only spam-free daily newsletter about wearables, running a "maker business", electronic tips and more! Subscribe at AdafruitDaily.com!


