Maker duo uses the NVIDIA Jetson platform to create a smart assistant mirror, a human movement tracker, an automatic dart game scorekeeper and more.
by ANGIE LEE
Whether you want to know if your squats have the correct form, you’re standing at the mirror deciding what to wear and wondering what the weather’s like, or you keep losing track of your darts score, the Smells Like ML duo has you covered, in every sense.
This maker pair is using machine learning powered by NVIDIA Jetson’s edge AI capabilities to provide smart solutions to everyday problems.
About the Makers
Behind Smells Like ML are Terry Rodriguez and Salma Mayorquin, freelance machine learning consultants based in San Francisco. The business partners met as math majors in 2013 at UC Berkeley and have been working together ever since. The duo wondered how they could apply their knowledge of theoretical mathematics more broadly. Robotics, IoT and computer vision projects, they found, were the answer.
The team name, Smells Like ML, stems from the idea that the nose is often used in literature to symbolize intuition. Rodriguez described their projects as “the ongoing process of building the intuition to understand and process data, and apply machine learning in ways that are helpful to everyday life.”
To create proofs of concept for their projects, they turned to the NVIDIA Jetson platform.
“The Jetson platform makes deploying machine learning applications really friendly even to those who don’t have much of a background in the area,” said Mayorquin.
Their Favorite Jetson Projects
Of Smells Like ML’s many projects using the Jetson platform, here are some highlights:
SpecMirror — Make eye contact with this AI-powered mirror, ask it a question and it searches the web to provide an answer. The smart assistant mirror can be easily integrated into your home. It processes sound and video input simultaneously, with the help of NVIDIA Jetson Xavier NX and NVIDIA DeepStream SDK.
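At its core, SpecMirror’s interaction model gates the assistant on gaze: speech only becomes a query while someone is actually looking at the mirror. The sketch below is purely illustrative, not the duo’s actual pipeline; the `Frame` fields stand in for outputs of hypothetical gaze-detection and speech-to-text models.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Frame:
    eye_contact: bool            # from a hypothetical gaze-detection model
    transcript: Optional[str]    # from a hypothetical speech-to-text stream

def handle(frames: List[Frame]) -> List[str]:
    """Treat speech as a query only while the user makes eye contact;
    in the real system the query would be sent to a web search."""
    queries = []
    for f in frames:
        if f.eye_contact and f.transcript:
            queries.append(f"searching: {f.transcript}")
    return queries
```

Gating on gaze this way avoids a wake word: glancing away silences the assistant even mid-sentence.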
ActionAI — Whether you’re squatting, spinning or loitering, this device classifies all kinds of human movement. It builds on the Jetson Nano Developer Kit’s accelerated pose estimation inference. After classifying a person’s movement, it annotates the results directly onto the video it’s analyzing. ActionAI can be used to prototype any product that requires human movement detection, such as a yoga app or an invisible keyboard.
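The pose-to-label step can be sketched with a toy rule: a person doing squats moves their hip keypoint vertically far more than someone standing still. This is a minimal illustration, not ActionAI’s classifier; the threshold and window size are arbitrary.

```python
import numpy as np

def classify_window(hip_y, threshold=0.08):
    """Label a window of frames by the vertical spread of the hip keypoint
    (normalized image coordinates, 0 = top of frame, 1 = bottom)."""
    return "squatting" if np.ptp(hip_y) > threshold else "standing"

def annotate(frames_hip_y, window=30):
    """Slide over per-frame hip heights and emit one label per window.
    In a real pipeline each label would be drawn onto the corresponding
    video frames, e.g. with cv2.putText."""
    labels = []
    for start in range(0, len(frames_hip_y), window):
        labels.append(classify_window(frames_hip_y[start:start + window]))
    return labels
```

A production classifier would use a learned model over full pose sequences rather than a single keypoint statistic, but the shape of the loop, pose in, label out, annotation back onto video, is the same.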
Shoot Your Shot — Bring a little analytics to your dart game. This computer vision booth analyzes dart throws from multiple camera angles, and then scores, logs and even predicts the results. The application runs on a single Jetson Nano system-on-module.
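Once the cameras have located where a dart landed, turning that position into a score is plain dartboard geometry. The sketch below assumes the vision stage already yields board coordinates in millimeters (that detection step is not shown) and uses standard board dimensions: bull radii of 6.35 mm and 15.9 mm, triple ring from 99 to 107 mm, double ring from 162 to 170 mm.

```python
import math

# Standard dartboard sector order, clockwise starting from the top (20).
SECTORS = [20, 1, 18, 4, 13, 6, 10, 15, 2, 17, 3, 19, 7, 16, 8, 11, 14, 9, 12, 5]

def score(x_mm, y_mm):
    """Map a dart's board position (mm from center, y pointing up)
    to its score using standard board dimensions."""
    r = math.hypot(x_mm, y_mm)
    if r <= 6.35:              # inner bull
        return 50
    if r <= 15.9:              # outer bull
        return 25
    if r > 170.0:              # outside the double ring
        return 0
    # Angle measured clockwise from 12 o'clock; each sector spans 18 degrees,
    # with sector 20 centered on the vertical axis.
    theta = math.degrees(math.atan2(x_mm, y_mm))
    sector = SECTORS[int(((theta + 9) % 360) // 18)]
    if 99.0 <= r <= 107.0:     # triple ring
        return 3 * sector
    if 162.0 <= r <= 170.0:    # double ring
        return 2 * sector
    return sector              # single
```

The harder problem, and the one multiple camera angles help with, is recovering that (x, y) reliably despite occlusion from darts already in the board.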
Where to Learn More
In June, Smells Like ML won second place in NVIDIA’s AI at the Edge Hackster.io competition in the intelligent video analytics category.
For more sensory overload, check out other cool projects from Smells Like ML.
Discover tools, inspiration and three easy steps to help kickstart your project with AI on our “Get AI, Learn AI, Build AI” page.