Date 09/13/24

New research on data quality's role in model efficiency presented by Imagimob AI Engineer Gustav Nilsson

Earlier this month, at the 9th International Conference on Fog and Mobile Edge Computing (FMEC 2024), our very own AI engineer, Gustav Nilsson, presented his paper "The Role of the Data Quality on Model Efficiency: An Exploratory Study on Centralized and Federated Learning". Gustav wrote this paper in collaboration with Imagimob as part of his studies in AI and Machine Learning toward a Master of Science in Engineering at Blekinge Institute of Technology (BTH).

On how the paper came about, Gustav says: "The core idea of the paper was a result of discussions with people at Imagimob, and they supported me throughout the whole process!"

The full paper will be published shortly; we will share the link on our blog and on our LinkedIn in the near future. For now, you can read the abstract below.

Abstract

This paper investigates, through experiments, the impact that datasets of varying quality have on centralized vs. federated learning models. We also investigate how the distribution of low-quality data across federated clients affects the models' accuracy. Within the experiments we create datasets of increasingly worse quality in terms of two data quality metrics: data accuracy and data completeness. This is done by perturbing (i.e., modifying) the datasets to decrease their quality with regard to these two metrics. Three experiments are then conducted that investigate: i) the impact of decreased data accuracy on the models' performance, ii) the impact of decreased data completeness, and iii) the effects of different distributions of low-quality data across the clients in the federated learning setup. The results reveal that the centralized model achieves 60.3% validation accuracy with low data accuracy and 58.7% with low data completeness, while the federated model performs better, achieving 69.3% validation accuracy with low data accuracy and 79.2% with low data completeness. The federated model is less affected by low data quality if the low-quality data is distributed evenly between its clients. Further, the federated learning setup displays certain attributes that make it more robust to low-quality data compared to centralized learning. An uneven distribution of data quality between clients has a more negative impact on federated learning than an even distribution.
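To make the perturbation idea concrete: degrading "data accuracy" can be done by flipping a fraction of labels, and degrading "data completeness" by blanking out feature values. The sketch below is purely illustrative and not taken from the paper; the function names and the NumPy label/feature representation are our own assumptions about how such perturbations could be implemented.

```python
import numpy as np

def perturb_accuracy(labels, fraction, n_classes, seed=0):
    """Flip a given fraction of labels to a different random class,
    lowering the data accuracy of the dataset."""
    rng = np.random.default_rng(seed)
    labels = labels.copy()
    idx = rng.choice(len(labels), size=int(fraction * len(labels)), replace=False)
    for i in idx:
        # Pick any class except the current (correct) one.
        wrong = [c for c in range(n_classes) if c != labels[i]]
        labels[i] = rng.choice(wrong)
    return labels

def perturb_completeness(features, fraction, seed=0):
    """Replace a given fraction of feature values with NaN,
    lowering the data completeness of the dataset."""
    rng = np.random.default_rng(seed)
    features = features.astype(float).copy()
    mask = rng.random(features.shape) < fraction
    features[mask] = np.nan
    return features
```

Applied with increasing `fraction` values, such perturbations yield a family of datasets of progressively worse quality, which is the kind of controlled degradation the experiments rely on.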

Be sure to subscribe to our newsletter so you don't miss the paper in its entirety!
