New app checks fruit and vegetables for chemical residues

A new app lets users see what an object is made of. The possibilities for this technology are endless, including using a smartphone to check fruit for chemical residues right at the shop.

The ‘HawkSpex mobile’ app, created by the Fraunhofer Institute for Factory Operation and Automation IFF in Magdeburg, is planned for release later this year. Consumers will be able to open the app, aim their smartphone camera at an object, such as a strawberry, and get the relevant information, including whether the strawberry carries any chemical residues.

There are already apps that can provide this information, but they require the user to equip the phone with extra parts, such as a prism in front of the camera, which is both expensive and impractical. The new app needs only the camera that is already integrated into the smartphone.

Photo credit:  Fraunhofer IFF

No hyperspectral-imaging camera required

How did the engineers manage this feat without a prism? Scanning like this usually requires a special hyperspectral camera, which adjusts to light of different colors and measures how much light of each color an object reflects. In this way the camera generates a complete spectral image of the object. The spectral image is then combined with a mathematical model to extract a wide range of information about the object, for example its constituents.
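In simplified terms, the "mathematical model" maps a measured spectral signature to a property of the object. A toy sketch of that idea (my simplification, not the institute's actual model, and with made-up example numbers) could match a measured signature against labeled reference spectra by distance:

```python
# Toy illustration of matching a spectral signature to reference spectra.
# A signature here is a vector of reflected intensities, one per color band.
# The reference values below are hypothetical example data, not real measurements.
import math

REFERENCE_SPECTRA = {
    "untreated strawberry": [0.82, 0.41, 0.12, 0.08],
    "treated strawberry":   [0.78, 0.52, 0.30, 0.11],
}

def classify(signature):
    """Return the reference label whose spectrum is closest (Euclidean distance)."""
    def distance(ref):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(signature, ref)))
    return min(REFERENCE_SPECTRA, key=lambda label: distance(REFERENCE_SPECTRA[label]))
```

A real model would be trained on many verified measurements rather than a handful of reference vectors, but the principle is the same: the spectrum acts as a fingerprint of the object's constituents.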

Hyperspectral cameras aren’t integrated in smartphones, so the team simply reversed the principle. Instead of a camera that scans every wavelength, the app turns the phone’s display into the light source: it successively illuminates the object with a series of differently colored lights, each for a fraction of a second. When the display shines only red light on the object, the object can only reflect red light, and the camera can only measure red light. Intelligent algorithms enable the app to compensate for the limited computing power of the phone, as well as the limited performance of the camera and display.

Users will make their own adjustments – just like Wikipedia

The team behind the app has identified many possible uses for the technology. That is why the development engineers are relying on an approach modelled on the online encyclopedia Wikipedia. “Once the app is available, active users will be able to contribute and create new applications, for instance, a test function that checks the pesticide exposure of lettuce. They teach the system how to deal with such problems,” says Seiffert. Users will scan different types of treated and untreated lettuce with the app and send the data to the Fraunhofer IFF, where engineers verify the measurements and release the new function to all users.

The app has extremely interesting commercial potential and can be used in sectors where high-precision scanners wouldn’t pay off. The list of possibilities is nearly endless: it can be used for food, cosmetic products, or even in agriculture. Farmers, for instance, could easily find out whether their crops have sufficient nutrients or need fertilizer.