Wednesday, February 25, 2015

Analyzing Plants With Google Glass

Scientists from UCLA’s California NanoSystems Institute have developed a Google Glass app that, when paired with a handheld device, enables the wearer to quickly analyze the health of a plant without damaging it.

The app analyzes the concentration of chlorophyll — the substance in plants responsible for converting sunlight into energy. Reduced chlorophyll production in plants can indicate degradation of water, soil or air quality.

One current method for measuring chlorophyll concentration requires removing some of the plant's leaves, dissolving them in a chemical solvent and then performing chemical analysis. With the new system, leaves are examined in place and left intact and functional.

The research, led by Aydogan Ozcan, associate director of the UCLA California NanoSystems Institute and Chancellor’s Professor of Electrical Engineering and Bioengineering at the UCLA Henry Samueli School of Engineering and Applied Science, was published online by the Royal Society of Chemistry journal Lab on a Chip.

The system developed by Ozcan’s lab uses an image captured by the Google Glass camera to measure the chlorophyll’s light absorption in the green part of the optical spectrum.
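The paper's actual image-processing pipeline is not reproduced here, but the measurement principle the article describes (light absorption in the green band) can be sketched with the Beer-Lambert relation. All function names and the calibration constants below are hypothetical placeholders, not the UCLA team's values:

```python
import math

def green_absorbance(i_leaf, i_ref):
    """Estimate absorbance in the green band from the mean green-channel
    intensity of the leaf region (i_leaf) and of the bare illuminator
    background (i_ref), via the Beer-Lambert relation A = -log10(I / I0)."""
    if i_leaf <= 0 or i_ref <= 0:
        raise ValueError("intensities must be positive")
    return -math.log10(i_leaf / i_ref)

def chlorophyll_estimate(absorbance, slope=1.0, intercept=0.0):
    """Map absorbance to a chlorophyll reading with a linear calibration.
    The slope and intercept here are placeholders; the real system was
    calibrated against chemical assays on five leaf species."""
    return slope * absorbance + intercept

# Example: a leaf that transmits 25% of the green illumination
a = green_absorbance(64.0, 256.0)   # A = -log10(0.25) ≈ 0.602
```

Because more chlorophyll absorbs more green light, a darker leaf region (lower `i_leaf`) yields a higher absorbance and thus a higher estimated concentration.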

The main body of the handheld illuminator unit can be 3-D printed, runs on three AAA batteries and, with a small circuit board added, can be assembled for less than $30. Held behind the leaf, facing the Glass wearer, the illuminator emits light that enhances the leaf's transmission image contrast, indoors or out, regardless of environmental lighting conditions.
The wearer can control the device using the Google Glass touch control pad or with the voice command, “Okay, Glass, image a leaf.” The Glass photographs the leaf and sends an enhanced image wirelessly to a remote server, which processes the data from the image and sends back a chlorophyll concentration reading, all in less than 10 seconds.
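The capture-and-analyze round trip described above can be sketched as a simple client/server exchange. The UCLA server-side processing is not public, so the handler, field names and calibration factor below are all hypothetical:

```python
import json
import math

def serve_leaf_request(payload):
    """Hypothetical server-side handler: receives mean green-channel
    intensities extracted from the Glass image and returns a chlorophyll
    reading as JSON."""
    req = json.loads(payload)
    absorbance = -math.log10(req["leaf_green"] / req["ref_green"])
    # Placeholder linear calibration; the real system was calibrated
    # against chemical chlorophyll assays on five leaf species.
    chlorophyll = 10.0 * absorbance
    return json.dumps({"chlorophyll_reading": round(chlorophyll, 2)})

# Client side: the Glass app would upload something like this after
# the wearer says "Okay, Glass, image a leaf."
request = json.dumps({"leaf_green": 64.0, "ref_green": 256.0})
response = json.loads(serve_leaf_request(request))
```

Offloading the computation to a remote server keeps the wearable side trivial, which is consistent with the article's sub-10-second end-to-end timing over a wireless link.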

“One pleasant surprise we found was that we used five leaf species to calibrate our system, and that this same calibration worked to accurately detect chlorophyll concentration in 15 different leaf species without having to recalibrate the app,” Ozcan said. “This will allow a scientist to get readings walking from plant to plant in a field of crops, or look at many different plants in a drought-plagued area and accumulate plant health data very quickly.”

The Google Glass app and illuminator unit could replace relatively costly and bulky laboratory instruments. Ozcan said that the convenience, speed and cost-effectiveness of the new system could aid scientists studying the effects of droughts and climate change in remote areas.



  1. Replies
    1. Unless you run a tissue sampling laboratory.

    2. I wonder if that could be programmed to be as useful as a tissue test.

      I would think that is a while off yet, but it sure would be handy if it did.


  2. Good one, Budde!

    I have seen Google Glass used in dairies too: you just look at the cow, and from the ear tag or, more likely, an RFID chip, the glasses display the data about that specific cow.
    But really, the innovation is not in the Google Glass, which is a display gadget, but in the external sensors or device, which could display the data just as easily on a phone or tablet. You don't even gain the convenience of a hands-free solution, because you need to hold the external device anyway, and you could just as well attach the phone to it and hold one combined device. You probably can't wear these glasses on top of your own glasses either.

    Not sure a dedicated and most likely proprietary circuit board is the best type of device for non-invasive tissue sampling. It does not seem to account for differences across most cultivated varieties and species. A Raspberry Pi or similar small computer could be more easily programmed to adapt to these differences, download updates, or pull in anonymized samples from other users to compare your own readings against the average. But these innovations are going in the right direction anyway. We need more of them.

  3. I would think it would take thousands of samples to calibrate Google Glass and software to a Midwest tissue test.