This app uses spectral analysis to determine what objects are made of



Normally, if the creators of an app claimed that it could look inside things and tell you their constituent parts, I would tell you to go peddle your snake oil elsewhere. But this app is from veteran R&D group Fraunhofer — so it may very well be the real thing.

The app is called HawkSpex mobile, and it performs a spectral analysis on whatever you point it at — a widely used technique, but one that requires specialized optical equipment such as a particular prism or hyperspectral camera sensor. HawkSpex does it with a normal smartphone camera. But how?!

Ordinarily, spectral analysis gear would split the light striking an object into specific wavelength ranges and look for spikes here and there that indicate the presence or absence of certain substances or elements. If you want to know whether water contains lead, for instance, you might look for reflectivity at 283.3 nanometers.
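The idea of "look for a feature at a known wavelength" can be sketched in a few lines. This is purely illustrative: the wavelength grid, reflectance values, and threshold below are invented, and real instruments work with far noisier data.

```python
# Illustrative sketch only: scan a measured reflectance spectrum for a
# feature near a known wavelength (here, a dip around 283.3 nm, the line
# the article mentions for lead). All numbers are made up.

def feature_at(spectrum, wavelengths, target_nm, threshold, window_nm=1.0):
    """Return True if reflectivity near target_nm dips below threshold."""
    near = [r for w, r in zip(wavelengths, spectrum)
            if abs(w - target_nm) <= window_nm]
    return bool(near) and min(near) < threshold

# Fake spectrum: flat reflectance with a dip around 283.3 nm.
wavelengths = [280.0 + 0.1 * i for i in range(100)]   # 280-290 nm grid
spectrum = [0.9 - (0.6 if abs(w - 283.3) < 0.3 else 0.0) for w in wavelengths]

print(feature_at(spectrum, wavelengths, 283.3, threshold=0.5))  # prints "True"
```

A flat spectrum with no dip would return False, which is the whole trick: presence or absence of a feature at a characteristic wavelength signals presence or absence of a substance.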

“Since hyperspectral cameras aren’t integrated in smartphones, we simply reversed this principle,” explained Udo Seiffert, who led the project, in the Fraunhofer announcement.

Instead of splitting a broad swath of spectrum into little pieces, HawkSpex uses the phone’s display to illuminate the object with light of known wavelengths, one after another, and observes how the object reflects each one. It’s a clever workaround, assuming it works as advertised.
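The reversed principle can be simulated in a few lines. To be clear, this is a toy model, not the app's actual method: the reflectance values, ambient level, and red/green/blue illumination sequence are all assumptions for illustration.

```python
# Toy simulation of the "reversed" approach: instead of splitting reflected
# light, illuminate with a sequence of known display colors and take one
# camera brightness reading per color. All values here are invented.

# The object's (made-up) reflectance under each display channel.
reflectance = {"red": 0.8, "green": 0.3, "blue": 0.1}

def measure(color, ambient=0.05):
    """Simulate one camera reading while the display shows `color`."""
    return ambient + reflectance[color]  # reflected display light + ambient

def signature(colors):
    """Cycle through the illumination colors and collect the readings."""
    dark = 0.05  # baseline frame with the display dark, to cancel ambient light
    return [round(measure(c) - dark, 3) for c in colors]

print(signature(["red", "green", "blue"]))  # prints "[0.8, 0.3, 0.1]"
```

The resulting list is a crude spectral signature of the object: the display, not a prism, supplies the known wavelengths, and the ordinary camera only ever has to measure total brightness.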

There are, of course, limits to what this kind of casual spectral analysis can do, but for checking whether a substance — wanted or unwanted — is present, it’s a useful tool. You could, for example, check an apple at the supermarket to see whether there are any traces of pesticides. You could check paint for lead, soil for nutrients, or wine for poison.

“There are so many conceivable uses that the market will surely overrun us,” Seiffert said, “with certainty,” added Fraunhofer. Users will be able to contribute data to the service by training the app on known examples — comparing coffee with and without caffeine, say.
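One plausible shape for that kind of user training — and this is a guess on our part, since the app's model hasn't been described — is storing labeled reference signatures and matching new readings to the nearest one:

```python
# A hedged sketch of training by example: store labeled reference signatures
# (e.g. coffee with and without caffeine), then label a new reading by its
# nearest stored reference. The signature values are invented.

def distance(a, b):
    """Euclidean distance between two signatures."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Hypothetical per-color signatures recorded during "training".
references = {
    "caffeinated":   [0.72, 0.41, 0.18],
    "decaffeinated": [0.70, 0.48, 0.25],
}

def classify(reading):
    """Return the label of the closest stored reference signature."""
    return min(references, key=lambda label: distance(references[label], reading))

print(classify([0.71, 0.43, 0.19]))  # prints "caffeinated"
```

However Fraunhofer actually does it, the user-facing workflow is the same: scan known samples to teach the app, then scan unknowns to test them.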

That’s all once the app gets out of the lab, though: currently it has only been tested internally, and the team is going to load it up with a few common use cases before releasing it for public use.