
Embedded developers should prepare to embrace AI

AI is proving to be an efficient way of implementing complex system behaviors, and it’s coming to embedded systems as a native capability

By Richard Quinnell, Editor for Special Projects, Technical

Image source: Shutterstock.

There has been a big push in the industry lately to facilitate the insertion of artificial intelligence (AI) into embedded systems. Some of that has been for cloud-based AI, such as Amazon’s Alexa Voice Service, but increasingly, introductions have centered on hardware-based AI. Numerous vendors have released processors and co-processors that offer accelerated or dedicated convolutional computation hardware to support neural-network software running on the edge. Embedded systems are thus poised to start executing AI algorithms without requiring a cloud connection.

This trend has important implications for embedded system developers. The growing availability of AI-enabled processors indicates that a fundamental shift in how embedded systems get programmed is on the horizon. This will be as significant a paradigm shift as was the microprocessor’s introduction some 45 years ago.

I was just entering college at about the time that Intel introduced the first commercial microprocessor, the 4004, and my first year as a working engineer saw the processor clock rate jump to a stunning 1 MHz. Before the microprocessor’s arrival, controlling complex systems required designing inflexible ladder logic and state machines from transistors and simple logic gates. With the 4004, however, developers could create hardware designs that could be reconfigured and repurposed simply by changing the contents of a memory device. Suddenly, hardware design became simpler and system design sped up, even as faster, more complex behaviors became practical to implement. Replacing dedicated logic with software became the central approach to embedded system design, and developers slow to catch the wave found themselves sidelined into niche applications.

AI has the potential to do the same thing to today’s embedded system designers. System operations have become so demanding and their behaviors so complex that ease of programming a processor has become more important than its performance characteristics. And demand for complex system behaviors has continued to increase, threatening to outstrip developers’ abilities to implement reliable code to meet the requirements.

This is where AI is stepping in. Rather than try to figure out an algorithm that implements a desired task, a developer can use AI to “teach” the embedded system its task. Writing code that would allow a camera system to reliably detect all and only the human faces in its field of view is a daunting and error-prone task if working with traditional embedded programming techniques. With an AI system, however, stunning results can be achieved rapidly by training a hosted AI to do the task, then implementing the resulting neural net on an AI-enhanced processor.
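One step in moving a cloud-trained network onto an AI-enhanced edge processor is compressing its weights to fit the hardware. The sketch below illustrates the idea with a simplified symmetric int8 post-training quantization; it is an assumption-laden toy, not any vendor's actual toolchain, which would be far more sophisticated.

```python
import numpy as np

# Simplified sketch of post-training quantization: mapping float32
# weights from a cloud-trained network onto the int8 range that many
# edge AI accelerators compute with. A symmetric, per-tensor scheme
# is assumed here purely for illustration.

def quantize_int8(weights: np.ndarray):
    """Map float weights onto the signed range [-127, 127]."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from the int8 values."""
    return q.astype(np.float32) * scale

# Toy weight tensor standing in for one layer of a trained network.
w = np.array([0.5, -1.2, 0.03, 0.9], dtype=np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
print(np.max(np.abs(w - w_hat)))  # reconstruction error under half a step
```

The accuracy cost of this rounding is exactly the kind of trade-off a developer must learn to evaluate when targeting AI-enabled silicon.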

This leaves traditional embedded developers with the same kind of choice that the logic designers of the mid-’70s faced: Learn the new approach or risk being left on the sidelines.

Happily, there are some differences this time around. Developers need not become AI experts to apply AI to their design tasks, just as they need not know how to write machine code to program a microprocessor. They simply must learn how to work the AI tools and when to apply which approach to their design goals.

Still, that will require developers to acquire new skills and develop a new way of thinking. They will need to learn about topics such as convolutional neural networks, machine learning, methods for training AI, and how to partition system behavior into AI and algorithmic domains. They will also need to know how to test, debug, and validate system behavior when its key operating characteristics are derived, not defined, and its intermediate operations are hidden so that the system is not amenable to traditional software analysis.

AI systems work by setting weights for neural networks to achieve a desired result and may contain hidden layers. Both attributes make them difficult to analyze and debug using traditional methods.
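To make the opacity concrete, here is a minimal sketch of a forward pass through one hidden layer. The weights below are arbitrary stand-ins for values a training process would derive, not a real trained model; the point is that the hidden activations are meaningful to the network but not inspectable the way hand-written branching logic is.

```python
import numpy as np

# Minimal neural-network forward pass: 4 inputs -> 8 hidden units -> 2 outputs.
# Weight values are placeholders for what training would produce.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 8))  # hidden-layer weights (derived, not defined)
b1 = np.zeros(8)
W2 = rng.standard_normal((8, 2))  # output-layer weights (derived, not defined)
b2 = np.zeros(2)

def forward(x: np.ndarray) -> np.ndarray:
    # 'h' holds the hidden-layer activations -- the "intermediate
    # operations" that traditional software analysis cannot interpret.
    h = np.maximum(0.0, x @ W1 + b1)  # ReLU hidden layer
    return h @ W2 + b2

y = forward(rng.standard_normal(4))
print(y.shape)  # (2,)
```

There is no line of this code to step through that says "detect a face"; the behavior lives in the numbers, which is why testing and validation need new techniques.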

Furthermore, the shift will compound. For now, embedded AI will consist mostly of relatively fixed networks defined during an earlier training stage. But as edge AI processing continues to increase in performance and drop in cost, future embedded AI systems will increasingly be designed to be self-training. Embedded developers will thus be building systems that start out relatively simple and undifferentiated but evolve over time along paths unique to their specific installation, all while still needing maintenance, debugging, and upgrades from the parent company.

AI will trigger a wholesale shift in the way that embedded systems get designed, and developers should begin preparing now. The shift won’t happen overnight, and it will not affect all embedded systems the same way, so there will be time for the industry to adapt. But the shift seems inexorable and likely to affect nearly every application space over the long haul, just as the move from logic design to processor programming did over the last 40 years. Those who want their designs to always reflect the state of the art, then, need to start learning how to embed AI.




Learn more about Electronic Products Magazine
