How AI changes the future of edge computing

While it makes sense to combine AI with edge computing, hardware and software components must address several challenges, including power consumption, processing capability, data storage, and security

By John Koon, contributing writer

As the number of internet of things (IoT) devices — such as mobile phones, virtual assistants, laptops, tablets, building sensors, drones, security cameras, and wearable health sensors — is projected to exceed 70 billion by 2025, according to Statista, edge-computing applications will also increase. The worldwide number of artificial intelligence (AI) edge devices is forecast to jump from 161.4 million units in 2018 to 2.6 billion units by 2025, according to Tractica.

IoT devices have numerous and diverse applications across a broad spectrum of sectors, such as retail, health care, industrial, aerospace, defense, transportation, facility maintenance, energy, manufacturing, supply chain logistics, and smart cities. Each device continuously collects data that must be analyzed quickly to reach real-time decisions, especially in applications like autonomous cars, electric grids, remote surgeries, oil rigs, or even military drones.

Edge computing versus cloud computing for IoT devices
Traditionally, cloud computing has been the model for IoT device analytics and prediction. In this centralized model, data is sent from the end-user device (the “edge”) to the cloud for analysis; the decision is then transmitted back to the device for implementation. While the data centers behind this model have immense capacity to process and store data, they are expensive and power-intensive to maintain.

Data transfer between the edge and the cloud is not only expensive but also time-consuming, introducing latency (lag time). In addition, the energy required for the transfer can exceed what low-power wireless IoT devices can supply. Nor does it make logistical, operational, or financial sense to move all data to the cloud when only a fraction of what is collected may prove useful. Lastly, data in transit is exposed to integrity and security risks.

In contrast, with edge computing, data is collected and analyzed at the IoT device itself for quick inference (or decision-making); only the small fraction of useful data is later moved to the cloud. Edge computing offers several advantages: because there is no need to ship raw data from the IoT devices to a central cloud, latency, bandwidth consumption, and cost all drop, and decisions can be reached quickly from local analytics.

In addition, edge computing can continue to run even when the system is offline, and immediate data processing makes it easier to determine which data should be transferred to the cloud for further analysis.
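
This pattern is easy to see in code. Below is a minimal, hypothetical sketch in Python of such an edge-first pipeline; every function name and threshold here is invented for illustration, but the flow — score each reading on the device, keep running while offline, and upload only the useful fraction — is the one described above.

```python
import collections
import time

UPLOAD_THRESHOLD = 0.8  # assumed: only "interesting" readings go to the cloud
offline_buffer = collections.deque(maxlen=1000)  # survives cloud outages

def score_locally(reading: float) -> float:
    """Stand-in for an on-device model; returns an anomaly score in [0, 1]."""
    return min(abs(reading - 25.0) / 25.0, 1.0)  # e.g., deviation from 25 degrees C

def cloud_available() -> bool:
    """Stand-in for a real connectivity check."""
    return True

def upload(record: dict) -> None:
    """Stand-in for an HTTPS/MQTT publish to the cloud."""
    print("uploaded:", record)

def process(reading: float) -> None:
    score = score_locally(reading)        # the decision happens at the edge
    if score < UPLOAD_THRESHOLD:
        return                            # most data never leaves the device
    record = {"ts": time.time(), "value": reading, "score": score}
    if cloud_available():
        while offline_buffer:             # flush anything queued while offline
            upload(offline_buffer.popleft())
        upload(record)
    else:
        offline_buffer.append(record)     # keep running even when offline

for sample in (24.8, 25.1, 61.3):         # only the outlier is uploaded
    process(sample)
```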

Developing the AI edge: challenges
While it makes sense to incorporate AI with edge computing, the hardware and AI software components face multiple challenges.

The first challenge is processing capability and power consumption. AI software has two stages: training and inference. Training teaches a model to identify the relevant parameters so that it can interpret data; inference is when the trained model makes predictions based on what it has learned.
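
As a toy illustration of the two stages (the sensor scenario and numbers are invented, and scikit-learn stands in for whatever framework a product would actually use):

```python
from sklearn.linear_model import LogisticRegression

# Training: the model learns parameters (weights) from labeled data.
X_train = [[20.0], [22.0], [55.0], [60.0]]  # toy sensor readings (degrees C)
y_train = [0, 0, 1, 1]                      # 0 = normal, 1 = overheating
model = LogisticRegression().fit(X_train, y_train)

# Inference: the learned parameters are applied to new, unseen data.
print(model.predict([[58.0]]))              # -> [1], i.e., "overheating"
```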

In cloud computing, the energy-intensive training occurs in the cloud; the trained model is then deployed to the edge for the relatively low-energy task of prediction (or inference). In edge computing, training shifts to the edge as well, putting more demand on the edge hardware’s processing capability. For IoT devices, this increased energy consumption poses a bigger problem, forcing a rebalance of processing capacity against the power budget.
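
A sketch of the cloud-train/edge-infer split described above, using TensorFlow Lite as one representative toolchain (an assumption on our part; the article does not prescribe a framework, and the model and input data here are toy stand-ins):

```python
import numpy as np
import tensorflow as tf

# 1) Train in the cloud, where energy and compute are plentiful.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),                    # e.g., four sensor channels
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
# model.fit(train_x, train_y, ...)                 # training data omitted here

# 2) Shrink the trained model for a power-constrained edge device.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # post-training quantization
with open("edge_model.tflite", "wb") as f:
    f.write(converter.convert())

# 3) On the device, only the cheap inference step runs.
interpreter = tf.lite.Interpreter(model_path="edge_model.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.array([[0.1, 0.2, 0.3, 0.4]], np.float32))
interpreter.invoke()
print(interpreter.get_tensor(out["index"]))        # the on-device prediction
```

Post-training quantization is one common way to fit a trained model into an edge device’s power and memory budget; moving training itself onto the device, as the article contemplates, is considerably harder.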

Data storage and security present the second challenge, now that the edge device holds most of the data and transfers only a small fraction to the cloud. The device also needs to store the parameters used for learning and inference. A third challenge is the sheer number of IoT devices and the current lack of security standards for them.

Therefore, tech companies need to develop hardware with more processing power and lower energy consumption, along with software that performs learning and inference more efficiently. Also, IoT applications are scenario- and sector-specific, making a robust ecosystem and developer environment for customization vital.

Developing the AI edge: progress
The companies, large and small, that focus on IoT edge hardware include BrainChip (Akida Neuromorphic System-on-Chip), CEVA (the NeuPro family), Google (Edge TPU), GreenWaves (AI processor GAP8), Huawei (Ascend chips), Intel (Xeon), NVIDIA (Jetson TX2), Qualcomm (Vision Intelligence Platforms), and STMicroelectronics (STM32 microcontrollers).

Smaller companies tend to focus on IoT edge software. Some focus on learning, such as Ekkono, FogHorn, and Swim (a cloud-based POS), while others target inference, such as Renesas (e-AI). Many companies develop software with both capabilities, including Amazon (AWS Greengrass ML Inference), BrainChip (Studio Software), Google (Cloud IoT Edge), Huawei (Atlas platform), and IBM (Watson IoT Platform).

Large tech companies are in the best position to build ecosystems that empower developers to create industry- and scenario-specific solutions. These companies include Google (AI Platform), Huawei (MindSpore), IBM (Watson), Intel (AI Developer Program), and Microsoft (Azure, plus enterprise IoT building blocks like IoT Hub, Azure Databricks, ML Studio, and Power BI).

Smaller companies are creating ecosystems as well, such as BrainChip’s Akida Development Environment. Trade groups such as the OpenFog Consortium and open-source projects including Living Edge Lab, ETSI Multi-access Edge Computing, and EdgeX Foundry also contribute. There is considerable industry collaboration, too, with leading players including Qualcomm, Microsoft, and Intel working with partners across a variety of sectors.

Conclusion
With specialized hardware, software, and developer environments, edge computing is likely to increase operational reliability, enable real-time predictions, and improve data security. 5G, which promises lower latency and enhanced coverage and responsiveness, and quantum computing, which accelerates computation, may further increase edge computing’s efficiency.

However, efficiently distributing the processing load across the network of edge devices will be a challenge, and efficient task scheduling will become essential to avoid system failures and to optimize machine learning. Over time, more powerful processing chips with lower power requirements are expected to become available, and that is when AI-based edge computing will really shine.

This article is part of AspenCore Media’s deep dive into the application of AI at the edge, looking beyond the voice and vision systems that have garnered much of the press. For further insights into the hardware, implementations, and implications of AI beyond voice and vision, check out these other articles in the Special Project.

Innovations Pushing AI to the Edge — AI will allow developers to implement more complex embedded system behaviors, and new tools are allowing more developers to implement AI.

Will AI come to the test industry? — Artificial intelligence and machine learning are making incremental inroads into semiconductor test, but not into functional system test, at least not yet.

Hardware helping move AI to the edge — What type of processing power is required to move AI to the edge? Vendors have come up with a range of answers.

AI makes data storage more effective for analytics — Turning data into intelligence requires analysis. AI implemented in the storage controller can substantially speed that analysis, as this proof-of-concept demonstration shows.

Connectivity remains central to mainstreaming AI, machine learning workloads — There is a race to make AI results relevant, reliable, and readily available. Only those with AI models trained on the best machine/deep learning infrastructure, from the largest data sets, will survive.
