As seasonal allergy sufferers will attest, the concentration of allergens in the air varies every few paces. A nearby blossoming tree or sudden gust of pollen-tinged wind can easily set off sneezing and watery eyes.
But concentrations of airborne allergens are reported city by city, at best.
A network of deep learning-powered devices could change that, enabling scientists to track pollen density block by block.
Researchers at the University of California, Los Angeles, have developed a portable AI device that identifies levels of five common allergens from pollen and mold spores with 94 percent accuracy, according to the team’s recent paper. That’s roughly a 25-point improvement over traditional machine learning methods.
With NVIDIA GPUs handling inference, the deep learning models can even run in real time, said Aydogan Ozcan, associate director of the UCLA California NanoSystems Institute and senior author on the study. UCLA graduate student Yichen Wu is the paper’s first author.
Putting Traditional Sensing Methods Out to Pasture
Tiny biological particles including pollen, spores and microbes make their way into the human body with every breath. But it can be hard to tell just how many of these microscopic particles, called bioaerosols, are in a specific park or at a street corner.
Bioaerosols are typically collected by researchers using filters or spore traps, then stained and manually inspected in a laboratory — a half-century-old method that takes several hours to several days.
The UCLA researchers set out to improve that process by monitoring allergens directly in the field with a portable and cost-effective device, Ozcan said, “so that the time and labor cost involved in sending the sample, labeling and manual inspection can be avoided.”
Unlike traditional methods, their device automatically sucks in air, trapping airborne particles on a sticky surface illuminated by a laser. The laser creates a hologram of the trapped particles, making the often-transparent allergens visible and capturable by an image sensor chip in the device.
The holographic image is then processed by two separate neural networks: one to clean up and crop the image to focus on the sections depicting biological particles, and another to classify the allergens.
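That detect-then-classify flow can be sketched in miniature. The sketch below is purely illustrative: the class labels, the intensity threshold and the mean-intensity "classifier" are invented stand-ins, not the paper's trained networks. A first stage crops candidate particle patches from a simulated hologram, and a second stage assigns each patch one of five allergen labels:

```python
import numpy as np

# Hypothetical labels -- the paper identifies five common pollen and
# mold-spore allergens; these particular names are placeholders.
ALLERGEN_CLASSES = ["oak pollen", "ragweed pollen", "grass pollen",
                    "Aspergillus spore", "Alternaria spore"]

def crop_particles(hologram, threshold=0.5, patch=8):
    """Stage 1 stand-in: locate bright pixels and crop one fixed-size
    patch per coarse grid cell. (The real system uses a neural network
    to clean up the hologram and localize the particles.)"""
    patches = []
    seen = set()
    ys, xs = np.where(hologram > threshold)
    for y, x in zip(ys, xs):
        cell = (y // patch, x // patch)  # dedupe detections per cell
        if cell in seen:
            continue
        seen.add(cell)
        y0 = min(max(y - patch // 2, 0), hologram.shape[0] - patch)
        x0 = min(max(x - patch // 2, 0), hologram.shape[1] - patch)
        patches.append(hologram[y0:y0 + patch, x0:x0 + patch])
    return patches

def classify_patch(patch_img):
    """Stage 2 stand-in: a trained CNN would go here; this stub just
    maps mean patch intensity to one of the five classes."""
    idx = int(patch_img.mean() * len(ALLERGEN_CLASSES)) % len(ALLERGEN_CLASSES)
    return ALLERGEN_CLASSES[idx]

# Toy hologram: dim background plus two bright "particles."
rng = np.random.default_rng(0)
holo = rng.random((32, 32)) * 0.3
holo[2:6, 2:6] = 0.9      # particle 1
holo[18:22, 18:22] = 0.8  # particle 2

patches = crop_particles(holo)
labels = [classify_patch(p) for p in patches]
print(f"{len(patches)} particles detected: {labels}")
```

In the actual device both stages are deep neural networks trained on labeled holograms; simple thresholding and an intensity heuristic stand in here only so the data flow from raw hologram to per-particle label is visible.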
Conventional machine learning algorithms achieve around 70 percent accuracy at classifying bioaerosols from holographic images. With deep learning, the researchers were able to boost that accuracy to an “unprecedented” 94 percent.
Using an NVIDIA GPU accelerates the training of the neural networks by hundreds of times, Wu said, and enables real-time testing, or inference.
A Blossoming Solution for Real-Time Analysis
While the version of the device described in the paper transmits the holograms to a remote server for the deep learning analysis, Wu said future versions of the device could have an embedded GPU to run AI models at the edge.
For scientists, the portable tool saves money and would enable them to gather data from distributed sensors at multiple locations, creating a real-time air-quality map with finer resolution. This map could be made available online to the general public — a useful tool as climate change makes allergy season longer and more severe.
Alternatively, the device itself — which weighs a little over a pound — could be used by individual allergy or asthma sufferers, allowing them to monitor the air quality around them anytime and access the data through a smartphone.
Since the device can be operated wirelessly, it also could be mounted on drones to monitor air quality in sites that are dangerous or difficult to access.
The researchers plan to expand the AI model to sense more classes of bioaerosols and other particles — and improve the physical device so it can conduct continuous sensing over several months.