Living on The Edge: Why is Edge Compute So Confusing?

As IoT and 5G have evolved, the models have moved from a simple sensor-to-gateway-to-cloud flow to the Edge, the near Edge, the Mobile Edge, and perhaps one or two more terms circulating out there in the hype cycle. The terminology is confusing and is typically market speak, used to hype a specific technology and a location between the cloud and the endpoint device that will handle the majority of the collection, compute, response, and storage of the data.

What Defines The Edge?

Practically speaking, the Edge, well, is the edge. I would argue that when you are at the rim of the Grand Canyon there is only one edge; you might have a point near the edge, but unless you are descending into the canyon, there is no mobile edge. For IoT applications, the Edge is where the sensors are located and data collection is initiated. I would also argue this is where the first components of the compute take place if a microprocessor is involved. Most microprocessors, even very basic ones, have enough compute power to execute simple instructions about what to do with the data. There are some exceptions where sensors do not have compute capability and the data is routed to a gateway where the first compute functions take place.

Where the user ends up placing compute depends on their ultimate business objective, the system architecture, and whether they want to spend money up front on hardware or spend it on communications and the cloud.

Architecture Drives Edge Compute and Cost

Several years ago, I presented and wrote on how architecture drives cost, using the example of a smart street lamp. Smart street lamps today range from systems with GPS, video for parking and traffic monitoring, and microphones for gunshot and accident detection, in addition to operating the lamp itself, to street lamps that are simply monitored for location and to determine whether the lamp is functioning properly.

The example I used was gunshot detection. Suppose my street lamps were set up on a mesh network using a gateway to the cloud. The street lamps would detect the noise and send the data to the gateway; the data would then move to the cloud, and the cloud would perform the analysis and send the location to public safety. There would be a significant time delay from the time the noise or gunshot was detected until a request for assistance showed up at public safety, potentially five minutes depending upon the system.

An alternative would be to have a significant amount of compute on the lamp post itself. The lamp posts could then work together to determine whether the noise was a gunshot and send a signal directly to public safety, potentially speeding up the response time. The processor on the lamp post then decides what data needs to be sent to the cloud at a later time, providing the data needed for analytics and AI while significantly reducing the communications cost of sending all the data to the cloud, a near edge, or a mobile edge compute node.
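The edge-side decision described above can be sketched in a few lines. This is a minimal illustration only: the `is_gunshot()` classifier, its thresholds, and the sensor readings are all hypothetical, standing in for whatever acoustic model a real street-lamp deployment would run.

```python
# Sketch of edge-side event filtering on a lamp post. Only confirmed events
# trigger an immediate alert; everything else is reduced to a compact summary
# for later, cheaper upload to the cloud.

def is_gunshot(sample):
    """Hypothetical classifier: flags loud, short impulses."""
    return sample["peak_db"] > 120 and sample["duration_ms"] < 50

def process_on_edge(samples):
    alerts, summaries = [], []
    for s in samples:
        if is_gunshot(s):
            # Send straight to public safety, no cloud round trip.
            alerts.append({"event": "gunshot", "location": s["location"]})
        # Keep only a small summary for later cloud analytics.
        summaries.append({"location": s["location"], "peak_db": s["peak_db"]})
    return alerts, summaries

samples = [
    {"location": "lamp-17", "peak_db": 135, "duration_ms": 20},   # gunshot-like
    {"location": "lamp-18", "peak_db": 90,  "duration_ms": 400},  # traffic noise
]
alerts, summaries = process_on_edge(samples)
print(alerts)                                    # immediate dispatch
print(len(summaries), "summaries queued for cloud upload")
```

The point of the sketch is the split: the latency-critical decision happens on the pole, while the bulk data is deferred and condensed before it ever touches the network.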

A system with a significant amount of compute improves the latency, or speed of system response. A system like this has a much higher upfront cost, as it requires a processor with memory to store the data, software to run analytics and artificial intelligence (AI) at the edge, and the compute power to perform all the needed functions in a timely manner. However, a high-performance system like this should reduce the communications cost over the lifetime of the system.
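The upfront-versus-lifetime tradeoff can be illustrated with a back-of-the-envelope model. Every figure below (hardware price per lamp, monthly data volume, per-megabyte rate, fleet size, system life) is a hypothetical assumption chosen for illustration, not real pricing.

```python
# Illustrative total-cost-of-ownership model for two street-lamp architectures.

def lifetime_cost(hardware_per_lamp, monthly_data_mb, cost_per_mb, months, lamps):
    """Upfront hardware plus communications cost over the system's life."""
    upfront = hardware_per_lamp * lamps
    comms = monthly_data_mb * cost_per_mb * months * lamps
    return upfront + comms

LAMPS, MONTHS = 100, 120  # assumed: 100 lamps over a 10-year life

# Cloud-centric: cheap node, but raw audio streams to the cloud for analysis.
cloud_centric = lifetime_cost(hardware_per_lamp=50, monthly_data_mb=2000,
                              cost_per_mb=0.01, months=MONTHS, lamps=LAMPS)

# Edge-compute: costlier processor on the pole, only events and summaries uplinked.
edge_compute = lifetime_cost(hardware_per_lamp=400, monthly_data_mb=20,
                             cost_per_mb=0.01, months=MONTHS, lamps=LAMPS)

print(f"cloud-centric: ${cloud_centric:,.0f}")
print(f"edge-compute:  ${edge_compute:,.0f}")
```

Under these made-up numbers the edge architecture costs roughly eight times the hardware up front but a small fraction of the total over ten years; the crossover point will obviously shift with real data volumes and tariffs, which is exactly why the whole-lifetime view matters.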

Adding to The Confusion

The topic of artificial intelligence (AI) has been hyped significantly over the past few years, and yes, AI can be a key component of an IoT solution. AI chips from Nvidia and Cerebras, and neuromorphic compute chips from Intel, have added to the hype. However, in the near term it is highly unlikely that these chips will make their way to most edge applications due to their cost. They will most likely reside in data centers or be placed in specific applications such as autonomous vehicles, where latency is a critical attribute of the end product, in this case to avoid an accident.

Instead, it is more likely the industry will see MCUs such as NXP's LPC5500 series, based upon ARM's Cortex-M33, which have the capability to perform simple AI applications such as facial recognition for home security or industrial applications. MCUs such as these can perform analytics and AI on the edge and then pass the necessary data on to the cloud to build the larger database for further analytics and more sophisticated learning. MCUs will be an affordable driving force of IoT endpoints and edge compute for many applications.

Implications for Business and Technology Leaders

Architecture drives cost. When designing an IoT solution, the entire ecosystem needs to be taken into account. Upfront hardware costs might seem prohibitive, but they could result in significantly lower operating costs. Work with your IoT solutions provider to explore multiple options.

The Edge is the edge: basic compute starts at the Edge. How sophisticated that compute is depends upon the application and its latency requirement: how fast does your system need to react to the data it is receiving? In your IoT solution there will be multiple layers of compute: the Edge, the gateway, and then the Cloud. This layering can take on different forms depending upon the desired end results, and it will also be one of the key cost components of an IoT solution.


Contact us if you would like a detailed briefing of our research agenda on the future of cloud and semiconductor end markets as well as to find out more about neXt Curve's advisory services.


by Dean Freeman
Research Fellow, neXt Curve
September 20, 2019

This material may not be copied, reproduced, or modified in whole or in part for any purpose except with express written permission or license from an authorized representative of neXt Curve. In addition to such written permission or license to copy, reproduce, or modify this document in whole or in part, an acknowledgement of the authors of the document and all applicable portions of the copyright notice must be clearly referenced.

© 2019 neXt Curve. All rights reserved.
