Date 01/07/22

How to build an embedded AI application


We live in a world of data. Virtually everyone talks about data and the potential value we can extract from it. But raw data in massive quantities is complex and hard to interpret. Over the past few years, machine learning techniques have made it possible to understand this data better and to leverage it. While most of this value has so far been created in online businesses, it is now starting to enter the physical world, where the data is generated by sensors.

But the path from sensor data to an embedded AI model can seem almost insurmountable. Writing embedded software is notoriously time-consuming, taking at least 10-20 times longer than desktop software development. In our new white paper we walk you through a real AI project, from data collection to the embedded application, and share some important findings we made along the way.

Machine learning on the Edge

Today, the vast majority of the signal processing software that interprets sensor data is based on traditional methods: transforms, filters, statistical analyses and so on. It is designed by a human who uses domain knowledge to look for a particular “fingerprint” in the data. Quite often this fingerprint is a complex combination of events in the data, and machine learning is needed to resolve the problem successfully.
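
To make the contrast concrete, here is a minimal sketch of such a hand-crafted detector: it flags an event whenever the short-term signal energy rises well above a slowly tracked baseline. The toy signal, window size and threshold factor are illustrative assumptions, not taken from the white paper.

```c
/* A minimal sketch of a hand-crafted "fingerprint" detector of the
 * traditional kind: flag an event when the short-term signal energy
 * rises well above a slowly tracked baseline. The signal, window size
 * and threshold are illustrative assumptions. */
#include <stdio.h>
#include <math.h>

#define LEN 128       /* length of the toy signal */
#define WIN 16        /* short-term energy window (samples) */
#define FACTOR 4.0    /* event if energy > FACTOR x baseline */

int main(void)
{
    /* Toy input: low-level noise with a burst in the middle. */
    double x[LEN];
    for (int i = 0; i < LEN; i++)
        x[i] = 0.01 * sin(0.7 * i) + ((i >= 60 && i < 70) ? 1.0 : 0.0);

    double baseline = 1e-4; /* long-term average energy */
    int in_event = 0;

    for (int i = WIN; i < LEN; i++) {
        double e = 0.0;                     /* short-term energy */
        for (int j = i - WIN; j < i; j++)
            e += x[j] * x[j];
        e /= WIN;

        if (e > FACTOR * baseline) {
            if (!in_event)
                printf("fingerprint detected at sample %d\n", i);
            in_event = 1;
        } else {
            in_event = 0;
            /* Only track the baseline outside events. */
            baseline = 0.99 * baseline + 0.01 * e;
        }
    }
    return 0;
}
```

Rules like this work well when the fingerprint is one simple event; when it is a complex combination of events, the hand-tuned rules multiply quickly, which is exactly where machine learning takes over.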

To process the sensor data in real time, the machine learning model needs to run locally on the chip close to the sensor itself, usually called “the edge”. In this paper we go through how a machine learning application can be created, from the initial data collection to the final embedded application. As an example we use a joint project between Imagimob and the radar technology company Acconeer.
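
Before getting into the project, it helps to picture the shape of the final application. The sketch below shows the typical loop of an edge ML application: acquire a sensor frame, run the model locally, act on the result. The function names, frame size and the stubbed sensor and model are assumptions for illustration, not Imagimob's or Acconeer's actual APIs.

```c
/* A minimal sketch of an edge inference loop: acquire a sensor frame,
 * classify it locally, act on the result. The stubs below stand in
 * for the sensor driver and the trained model; preprocessing is
 * omitted for brevity. All names and sizes are assumptions. */
#include <stdio.h>
#include <stddef.h>

#define FRAME_LEN   64   /* samples per sensor frame (assumed) */
#define NUM_CLASSES  5   /* five gestures, as in the project   */

/* Stub: a real system would read from the sensor driver here. */
static size_t sensor_read_frame(float *frame, size_t max_len)
{
    for (size_t i = 0; i < max_len; i++)
        frame[i] = 0.0f;
    return max_len;
}

/* Stub: a real system would run the trained model here and fill
 * one score per class; this version just favours class 0. */
static int model_predict(const float *features, size_t len,
                         float scores[NUM_CLASSES])
{
    (void)features; (void)len;
    for (int c = 0; c < NUM_CLASSES; c++)
        scores[c] = (c == 0) ? 0.9f : 0.025f;
    return 0;
}

int main(void)
{
    float frame[FRAME_LEN];
    float scores[NUM_CLASSES];

    /* The real loop runs forever; three iterations for the demo. */
    for (int it = 0; it < 3; it++) {
        size_t n = sensor_read_frame(frame, FRAME_LEN); /* 1. acquire  */
        int best = model_predict(frame, n, scores);     /* 2. classify */
        printf("frame %d: class %d (score %.2f)\n",     /* 3. act      */
               it, best, scores[best]);
    }
    return 0;
}
```

The key point is that classification happens on the device itself: no frame ever has to leave the chip, which is what makes real-time, low-power operation possible.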

Embedded AI application project: Gesture detection using radar

Imagimob and Acconeer teamed up in 2019. Both companies focus on providing powerful applications on hardware devices where low power consumption is key. The cooperation is therefore a good match for creating radically new and creative embedded applications.

Acconeer's A111 Pulsed Coherent Radar (PCR) sensor is optimized for high-precision measurement and detection, with a small footprint and ultra-low power consumption. The XM122 IoT module, which integrates the sensor on a module optimized for IoT, is the world's first radar system that can run on a coin-cell battery.

The radar signal contains information about the relative distance and speed of reflecting objects, and many radar use cases do not need AI/ML to interpret the data from the radar sensor. But the radar signal is complex to interpret, so for more advanced services, such as detection of complex micro gestures, AI/ML is necessary to get good results.
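
To illustrate the kind of non-ML interpretation this refers to, the sketch below extracts distance and speed estimates from two consecutive amplitude sweeps by tracking the strongest reflection. The sweep data, bin spacing and sweep rate are illustrative assumptions, not Acconeer's actual data format.

```c
/* A minimal sketch of non-ML radar interpretation: from one amplitude
 * sweep (one value per range bin), take the strongest reflection as
 * the object distance, and estimate speed from how that distance
 * moves between sweeps. All constants and data are assumptions. */
#include <stdio.h>

#define NUM_BINS     10
#define BIN_SPACING  0.06f  /* metres per range bin (assumed)  */
#define START_RANGE  0.20f  /* range of the first bin (assumed) */
#define SWEEP_RATE   100.0f /* sweeps per second (assumed)      */

static float peak_distance(const float sweep[NUM_BINS])
{
    int best = 0;
    for (int i = 1; i < NUM_BINS; i++)
        if (sweep[i] > sweep[best])
            best = i;
    return START_RANGE + best * BIN_SPACING;
}

int main(void)
{
    /* Two consecutive toy sweeps: the peak moves one bin closer. */
    float s1[NUM_BINS] = {1, 2, 3, 9, 4, 2, 1, 1, 1, 1};
    float s2[NUM_BINS] = {1, 2, 9, 4, 3, 2, 1, 1, 1, 1};

    float d1 = peak_distance(s1);
    float d2 = peak_distance(s2);
    float speed = (d2 - d1) * SWEEP_RATE; /* m/s, negative = approaching */

    printf("distance %.2f m, speed %.2f m/s\n", d2, speed);
    return 0;
}
```

Something this simple covers presence or distance detection; a micro gesture, by contrast, shows up as a subtle pattern across many sweeps, which is why a trained model is needed there.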

The goal of this project was to create an embedded application that classifies five different hand gestures in real time using radar data. Thanks to its small size, the radar could be placed inside a pair of earphones, and the gestures would act as virtual buttons, steering the functionality that is usually assigned to physical buttons. The end product of the project was decided to be a robust live demo at CES 2020 in Las Vegas.
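
The “virtual buttons” idea boils down to a simple mapping from classifier output to device actions, as sketched below. The five gesture names and the actions they trigger are hypothetical placeholders, not the project's actual gesture set.

```c
/* A minimal sketch of the "virtual buttons" idea: each classified
 * gesture is mapped to an earphone control. The gesture names and
 * actions are hypothetical. */
#include <stdio.h>

enum gesture { SWIPE_LEFT, SWIPE_RIGHT, PUSH, PULL, CIRCLE };

static void on_gesture(enum gesture g)
{
    switch (g) {
    case SWIPE_LEFT:  printf("previous track\n"); break;
    case SWIPE_RIGHT: printf("next track\n");     break;
    case PUSH:        printf("pause\n");          break;
    case PULL:        printf("play\n");           break;
    case CIRCLE:      printf("volume mode\n");    break;
    }
}

int main(void)
{
    /* Simulate a stream of classified gestures. */
    enum gesture demo[] = { PUSH, SWIPE_RIGHT, PULL };
    for (size_t i = 0; i < sizeof demo / sizeof demo[0]; i++)
        on_gesture(demo[i]);
    return 0;
}
```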

Download the complete story

Read the complete story in our White Paper – From data collection to embedded gesture detection library. Fill out the form below to get access. 
