Edge Intelligence: Another Fad?

Abhijeet Pokhriyal · Published in Analytics Vidhya · Feb 12, 2021 · 5 min read

What really is Edge Intelligence, and how will it change ML applications?

Remember the time when Cloud Computing was the buzzword? That was 2011–2012. In fact, according to Gartner, cloud computing was already sliding into the trough back then and is now possibly, oh no, definitely in the plateau of productivity [NEWSWIRE].

Gartner Hype Cycle 2012

Fast forward a decade (yes, you and I are old, and yes, we are nerds) and what you see now is this new firm embrace of AI. The trend seems to be to *shuffle the deck, pull a card*, and just prepend it to AI.

Responsible AI

Generative AI

Composite AI

Explainable AI

“Give me a break” AI

Let’s look at what 2020 did to us (apart from corona, obviously).

An interesting thing has happened.

Edge AI, or even Embedded AI, is now at the peak of inflated expectations, with quite a few big guns in the tech industry, from NVIDIA and Google to Xilinx, jumping onto the bandwagon [HAMZA ALI].

What is Edge AI?

Until recently, AI computations have almost all been performed remotely in data centers, on enterprise core appliances, or on telecom edge processors — not locally on devices. This is because AI computations are extremely processor-intensive, requiring hundreds of (traditional) chips of varying types to execute. The hardware’s size, cost, and power drain made it essentially impossible to house AI computing arrays in anything smaller than a footlocker.

Now, edge AI chips are changing all that. [DELOITTE]


A term that often goes hand in hand with Edge and Embedded AI is the AI accelerator. These are exactly the edge AI chips that Deloitte mentions.

An AI accelerator is a class of specialized hardware accelerator or computer system designed to accelerate artificial intelligence applications, especially artificial neural networks, machine vision and machine learning. Typical applications include algorithms for robotics, internet of things and other data-intensive or sensor-driven tasks.[WIKI-AI-ACC]

In early 2019, Google released the Edge TPU processor for embedded inference applications. The Edge TPU runs TensorFlow Lite models, whose neural network parameters are encoded at low precision for inference, on as little as 5 V of input power [GOOGLE-TPU]. And this is only ONE of the many products out there; the list keeps growing, and performance, both in precision and in power efficiency, keeps improving.
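To make that concrete, here is a minimal sketch, assuming a trained Keras model (a randomly initialized MobileNetV2 stands in for it here) and placeholder calibration data, of the post-training full-integer quantization that TensorFlow Lite performs before a model can run on an Edge TPU-class chip:

```python
import tensorflow as tf

# Stand-in for a trained tf.keras model (weights are random here).
model = tf.keras.applications.MobileNetV2(weights=None)

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]

def representative_data_gen():
    # A few batches of representative inputs let the converter
    # calibrate the int8 quantization ranges. Real images would go here.
    for _ in range(100):
        yield [tf.random.uniform((1, 224, 224, 3))]

converter.representative_dataset = representative_data_gen
# Force full-integer quantization, which Edge TPU-class chips require.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

tflite_model = converter.convert()
with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting model_int8.tflite is roughly a quarter of the float32 model's size; for an actual Edge TPU it would then be compiled with Google's edgetpu_compiler tool.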


Some even claim that Moore’s law is dead and that the foreseeable future will be less about shrinking the FET and more about the sequential introduction of increasingly diverse device technologies integrated in increasingly heterogeneous computer architectures optimized for performance and energy efficiency. [Theis, T. N., & Wong]

Edge Computing has been proposed as THE way forward. The future ecosystem for computing at scale is envisioned as a decentralized, democratized apparatus that pushes cloud services from the network core to network edges in closer proximity to IoT devices and data sources [ZHAO].

The reason why Edge Computing seems so promising is that it addresses a few key problems that plague deploying Machine Learning in the Cloud [ZHAO] (the sketch after this list illustrates the latency point):

1. Data security and privacy

IoT and mobile devices generate huge amounts of data, which can be privacy sensitive. It is therefore important to protect privacy and data security near the data source during the model inference stage of an edge intelligence application.

2. Low connectivity

It is necessary to minimize overhead during DNN model inference in an edge intelligence application, particularly the expensive wide-area network bandwidth that cloud round-trips consume. Communication overhead here mainly depends on the mode of DNN inference and the available bandwidth.

3. Power constraints

The computation and communication overheads of DNN model inference consume a large amount of energy. For an edge intelligence application, energy efficiency is of great importance, and it is affected by the size of the DNN model and the resources available on edge devices.

4. Low latency requirements

Some real-time intelligent mobile applications (e.g., AR/VR mobile gaming and intelligent robots) have stringent deadline requirements, such as 100 ms latency. Latency is affected by many factors, including the resources on edge devices, the way data is transmitted, and the way the DNN model is executed.
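Here is the promised latency sketch: a small benchmark, assuming the quantized model from the earlier snippet (path and run count are placeholders), that measures mean on-device inference time with the TensorFlow Lite interpreter:

```python
import time
import numpy as np
import tensorflow as tf  # on a device, the lighter tflite_runtime package works the same way

# Load the quantized model from the earlier sketch (placeholder path).
interpreter = tf.lite.Interpreter(model_path="model_int8.tflite")
# For a Coral Edge TPU you would also pass
# experimental_delegates=[tf.lite.experimental.load_delegate("libedgetpu.so.1")].
interpreter.allocate_tensors()

input_detail = interpreter.get_input_details()[0]

# Dummy uint8 input matching the model's expected shape.
dummy = np.random.randint(0, 256, size=tuple(input_detail["shape"]), dtype=np.uint8)

runs = 50
start = time.perf_counter()
for _ in range(runs):
    interpreter.set_tensor(input_detail["index"], dummy)
    interpreter.invoke()
mean_ms = (time.perf_counter() - start) / runs * 1000

print(f"mean on-device latency: {mean_ms:.1f} ms")  # compare against the ~100 ms budget
```

Running a loop like this on the device itself, while watching power draw, is essentially how the latency and energy trade-offs above get quantified in practice.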

So is it just a FAD?

Smart machines powered by AI chips could help expand existing markets, threaten incumbents, and shift how profits are divided in industries such as manufacturing, construction, logistics, agriculture, and energy.

The ability to collect, interpret, and immediately act on vast amounts of data is critical for many of the data-heavy applications that futurists see as becoming widespread: video monitoring, virtual reality, autonomous drones and vehicles, and more. That future, in large part, depends on what edge AI chips make possible: Bringing the intelligence to the device. [DELOITTE]

Basically, that iPhone of yours, with its hexa-core CPU (2x3.1 GHz Firestorm + 4x1.8 GHz Icestorm) and 4-core Apple GPU, is a beast of a machine; it is such a waste that we use it mostly for mindlessly tweeting.

Researchers, though, have another plan.

Drop me a line on LinkedIn with what YOU think: where will Edge Intelligence be in the next 2–5 years?

Sources:

[ZHAO](https://arxiv.org/pdf/1905.10083.pdf)

[NEWSWIRE](https://www.globenewswire.com/news-release/2020/08/21/2081841/0/en/Cloud-Computing-Industry-to-Grow-from-371-4-Billion-in-2020-to-832-1-Billion-by-2025-at-a-CAGR-of-17-5.html)

[HAMZA ALI](https://arxiv.org/ftp/arxiv/papers/2009/2009.00803.pdf)

[GOOGLE-TPU](https://cloud.google.com/edge-tpu)

[WIKI-AI-ACC](https://en.wikipedia.org/wiki/AI_accelerator)

[DELOITTE](https://www2.deloitte.com/us/en/insights/industry/technology/technology-media-and-telecom-predictions/2020/ai-chips.html)

[Theis, T. N., & Wong, H.-S. P. (2017). The End of Moore’s Law: A New Beginning for Information Technology. Computing in Science & Engineering, 19(2), 41–50. doi:10.1109/mcse.2017.29](https://sci-hub.se/10.1109/MCSE.2017.29)
