Date 07/01/19

Edge Computing in Modern Agriculture


Modern agriculture is an industry segment that has come far in terms of digitization and process automation. A wide range of new technologies for crop and livestock management related to sensors, actuators, and machine learning are now available for use in daily operations. However, there are some hurdles to overcome in order to make the benefits of edge computing in agriculture more accessible to everyone.

The future of agritech is restricted by accessibility issues 
When it comes to deploying modern AI technology in rural areas and realizing the related business opportunities, there are two main challenges: a lack of wireless coverage and limited data throughput.

For low-intensity data types—such as temperature, humidity, and barometric pressure—established low-power wide-area networks (LPWAN; e.g., Sigfox or LoRa) and operator-based networking, like Narrowband IoT, will do the job. But for sensing movement, vibrations, or behavior, sending raw sensor data to the cloud for processing is a no-go. Reading from a single sensor data source (e.g., an accelerometer or gyroscope) at practical frequencies easily produces 20–100 times more data than what can possibly be sent to the cloud for analysis over an LPWAN.
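To make the mismatch concrete, here is a back-of-envelope sketch in Python. The sampling rate, sample size, and LPWAN uplink budget below are illustrative assumptions (not figures from this article), but they show why raw motion data overwhelms a narrowband link:

```python
# Illustrative comparison of a raw accelerometer stream vs. a typical
# LPWAN uplink budget. All figures are assumptions for the sketch.

SAMPLE_RATE_HZ = 100      # assumed accelerometer sampling rate
BYTES_PER_SAMPLE = 6      # 3 axes x 16-bit readings
raw_bps = SAMPLE_RATE_HZ * BYTES_PER_SAMPLE * 8  # raw stream, bits per second

LPWAN_BPS = 100           # assumed sustained uplink budget (duty-cycle limited)

print(f"raw sensor stream: {raw_bps} bit/s")        # 4800 bit/s
print(f"LPWAN budget:      {LPWAN_BPS} bit/s")
print(f"ratio:             {raw_bps / LPWAN_BPS:.0f}x")  # 48x
```

Even with these modest assumptions, the raw stream exceeds the uplink budget by well over an order of magnitude, which is consistent with the 20–100x range mentioned above.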

Edge computing keeps data close to the source
This is where edge computing and Imagimob's Edge AI solution come in. Edge computing not only allows incoming data to be analyzed close to the source with a minimal footprint; it also makes it easier for refined data, or the results, to be sent over narrowband networks.

This opens up a slew of new business opportunities. Imagine the benefits for farmers when they have the ability to monitor both stationary and mobile assets over large areas with cost-efficient hardware and connectivity.

3 steps for the best edge computing results 
The procedure necessary for producing reliable results using edge computing techniques can be carried out in three steps, which are common in many machine learning applications:

1. Aggregate 
Sensor data is collected while the subject is concurrently monitored for motion, and the recordings are labeled along a common timeline.

2. Train
AI models are trained to classify a defined set of motion samples.

3. Compile
A dedicated binary library is compiled to detect a set of activities during a given time interval. Each sequence ends with the transmission of a set of events: the identified activities and the time allocated to each as a percentage, aggregated over the given interval (e.g., ten minutes, an hour, a working day), depending on the application at hand.
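The third step's output format can be sketched in a few lines of Python. This is a hypothetical illustration of the idea, not Imagimob's actual library: the on-device model emits one activity label per analysis window, and the device then transmits only the share of time spent in each activity over the interval:

```python
from collections import Counter

def summarize(labels):
    """Turn a sequence of per-window activity labels into percentages
    of the interval spent in each activity (the payload actually sent)."""
    counts = Counter(labels)
    total = len(labels)
    return {activity: round(100 * n / total, 1) for activity, n in counts.items()}

# e.g. a ten-minute interval classified in 30-second windows (20 windows)
windows = ["ruminating"] * 12 + ["eating"] * 6 + ["walking"] * 2
print(summarize(windows))  # {'ruminating': 60.0, 'eating': 30.0, 'walking': 10.0}
```

A summary like this fits in a few dozen bytes, so it can be sent over an LPWAN link that could never carry the raw sensor stream behind it.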

Farmers benefiting from Imagimob Edge AI
In a current project, running 2018-2021, we’re collaborating with farmers in Sweden, Finland, and Spain to investigate how motion sensors can be used to learn more about the way beef and dairy cows behave both as individuals and as part of the herd. For example, we’re able to detect which animals are active or still, sick or in good health, or dominant or submissive in order to adapt elements of their care, including nourishment, living conditions, medication and special treatment.
 
In dairy farming, the business case is clear: a missed insemination window (12–18 hours) means no milk for the next period. A report by Penn State University [O'Conner 2016] states that approximately half of these insemination windows go undetected on dairy farms in the United States. In addition, research on the levels of the hormone progesterone in milk shows that up to 15 percent of the cattle presented for insemination are not actually in heat. By using motion analysis, we're able to get clearer indications of whether livestock are in estrus (heat), calving, eating, or ruminating. Examples of detectable proestrus behavior in cows include certain movement patterns such as restlessness, rubbing, and trailing with the intent to stand.
 
Another related case in this project is one that monitors the positions and operational modes of vehicles and other mobile assets. When low-cost sensor hardware and long battery life are combined with the ability to analyze data on the edge, it becomes economically viable to monitor a vast range of assets—beyond those equipped with native positioning and communications systems like GPS-controlled agricultural tractors and harvesters. When farmers don’t need to spend time or fuel looking for animals or equipment, efficiency increases and they can run their operations with fewer staff or extra farm hands.
 
Imagimob enables the future of Edge AI 
Here at Imagimob, it's our aim to make our technology available in the form of standardized modules in order to enable the analysis of rich data on the edge. We especially shine under constraints, such as limited electric power and data connectivity, and when quick response time is critical (without the turnaround time typically required for common cloud services). Our system for Edge AI applications gives small embedded devices big intelligence and computing power while maintaining a small footprint.

Imagimob's Oscar Sverud talking with the driver of a GPS-assisted tractor fertilizing the field.

This work is related to the aFarCloud project, receiving funding from the ECSEL Joint Undertaking (JU) and the European Union’s Horizon 2020 research and innovation programme.

Want to learn more? Feel free to contact us for more information.
