
Techniques to improve embedded system security

Some known security issues can be avoided with careful software design

BY: SHAWN A. PRESTRIDGE
Senior Field Applications Engineer
IAR Systems
www.iar.com

Information security is an increasing concern in the wake of high-profile breaches of systems such as Sony’s PlayStation Network, PBS, and NASA. While the details of most of these breaches are closely guarded by the victims, it is clear that incorporating security into an embedded project must be part of the design process, not left as an afterthought.

Reliability opens security hole

Sometimes security is compromised by design decisions made to increase reliability. Embedded systems may be required to have dual or triple redundancy, but any time redundancy or CRC information becomes part of a key or negotiation process, a hacker can use it as a toehold to glean information about the underlying system. If a flaw in the system’s security causes some measure of side-channel information leakage, the redundancy can amplify that leakage. A famous example is the Enigma machine used by Germany during World War II to encipher messages to its troops. A design flaw meant that no letter typed on the keyboard could ever be enciphered as itself, so by exploiting stock phrases and weather reports that the Allies knew would be broadcast from certain stations, cryptanalysts were able to recover the daily code settings. Side-channel attacks examine the behavior of the system as a whole to see how it reacts to certain inputs.

Famous examples of side-channel attacks include timing attacks, power-monitoring attacks, and differential fault analysis. Care must be taken to ensure that redundancy implemented for improved reliability does not increase the amount of side-channel information a hacker can use to make inferences about the system. While side-channel leakage is difficult to spot during the design process, a prudent engineer must think about this factor in advance and plan accordingly.
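Timing attacks are the easiest of these to reason about in code. As a minimal sketch (the function names here are hypothetical, not taken from any particular library), compare a naive key check, which returns at the first mismatched byte and therefore leaks how much of a guess was correct, with a constant-time version that always examines every byte:

#include <stddef.h>
#include <stdint.h>

/* Leaky comparison: returns at the first mismatch, so execution time
 * reveals how many leading bytes of the guess were correct. */
int check_key_leaky(const uint8_t *key, const uint8_t *guess, size_t len)
{
    for (size_t i = 0; i < len; i++) {
        if (key[i] != guess[i])
            return 0;               /* early exit = timing side channel */
    }
    return 1;
}

/* Constant-time comparison: always touches every byte, so timing does
 * not depend on where (or whether) a mismatch occurs. */
int check_key_const_time(const uint8_t *key, const uint8_t *guess, size_t len)
{
    uint8_t diff = 0;
    for (size_t i = 0; i < len; i++)
        diff |= key[i] ^ guess[i];  /* accumulate all differences */
    return diff == 0;
}

The same accumulate-then-test pattern applies wherever secret data influences a comparison, such as MAC verification or password checks.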

Buffer overflows

Another thing for developers to consider is bounds-checking. One of the biggest exploits to hit Windows XP several years ago was based on buffer overflows: by filling a buffer past its capacity, an attacker could overwrite adjacent memory with a series of machine instructions and redirect execution, ultimately gaining root access to the system. After the initial discovery of such a flaw, it seemed a new buffer-overflow exploit was found every week in some aspect of Windows, culminating in a top-down redesign of the security mechanism in Windows for Vista. Bounds-checking should be part of any testing suite for an embedded design, as it forces developers to test the size of the data that is input to their systems.

Most errors in software occur around boundary conditions, whether one part of an algorithm gives way to another as a boundary is crossed or unexpected data is input into the system. The testing procedure should therefore include a comprehensive list of buffers and algorithms and their associated boundaries so that test cases can be crafted to probe them. Close analysis of these boundary conditions can ensure that designs are hardened against this type of attack.
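In practice, hardening against overflows means validating every declared length against the real capacity of the destination before copying. A minimal sketch (the buffer and function names are hypothetical):

#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define CMD_BUF_SIZE 32u                  /* real capacity of the buffer */

static uint8_t cmd_buf[CMD_BUF_SIZE];

/* Copy an incoming payload into cmd_buf only after checking its
 * declared length against the buffer's capacity. Returns 0 on
 * success, -1 if the input would overflow. */
int store_command(const uint8_t *payload, size_t len)
{
    if (payload == NULL || len > CMD_BUF_SIZE)
        return -1;                        /* reject oversized or missing input */
    memcpy(cmd_buf, payload, len);
    return 0;
}

Test cases probing the boundary would then exercise lengths of CMD_BUF_SIZE - 1, CMD_BUF_SIZE, and CMD_BUF_SIZE + 1 to confirm the check holds exactly at the edge.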

Software design tools like IAR Embedded Workbench help plug security holes

Exceptions

Another source of security weakness is exception handling, or the lack thereof. Projects are generally on very tight schedules, so engineers may skimp on writing exception handling in their code, preferring instead to rely on the end user not to enter unexpected input. Often this is done with the best of intentions of coding it in later, before the code goes into the test-and-fix cycle. In other cases there is rudimentary exception handling in the system, but it is not rigorous enough. Either way, a hacker may be able to exploit the gap to push the microcontroller into an undefined or privileged state and, once there, gain unauthorized access to the system. A developer should consider as many types of input as possible when coding exception handling in order to avoid falling victim.
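One concrete habit is to make every input dispatcher fail safely by default. In the sketch below (the command values and handler stubs are hypothetical), any byte a hostile sender could put on the wire lands somewhere explicit, and anything unrecognized is rejected rather than silently ignored:

#include <stdint.h>
#include <stdio.h>

enum { CMD_READ = 0x01, CMD_WRITE = 0x02, CMD_RESET = 0x03 };

/* Stubs standing in for the real command implementations. */
static int do_read(void)  { return 0; }
static int do_write(void) { return 0; }
static int do_reset(void) { return 0; }

/* Dispatch one received command byte. The default case catches every
 * unexpected value so the device never drifts into an undefined state. */
int handle_command(uint8_t raw)
{
    switch (raw) {
    case CMD_READ:  return do_read();
    case CMD_WRITE: return do_write();
    case CMD_RESET: return do_reset();
    default:
        fprintf(stderr, "unexpected command 0x%02x\n", raw);
        return -1;                /* reject explicitly, never assume valid input */
    }
}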

Race conditions are another favorite tool of hackers. In October 2010, hackers exploited a race-condition vulnerability in Firefox using the nobelpeaceprize.org website, and Adobe Flash has been much maligned by Apple for the number of similar race-condition issues that can be used to download malware to a machine running Flash. A race condition occurs when multiple processes access and manipulate the same data concurrently and the outcome of the execution depends on the order in which the accesses take place. A hacker can exploit this by replacing the data that a privileged, legitimate function is using for its work; depending on the nature of the function and the data replaced, this may allow the hacker to gain unauthorized access to the system. To guard against this, developers should closely monitor which data spaces are allocated in RAM (particularly on the heap) and which functions have access to them, and should manipulate shared data only inside a critical section so that a check and the action that depends on it cannot be interleaved by another context. It is also advisable to use a memory protection unit (MPU), if one is available on the chosen MCU, to restrict access to sensitive memory regions.
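A minimal sketch of closing such a window, using POSIX mutexes for portability (on a bare-metal MCU the same pattern is typically implemented with an RTOS mutex or by briefly disabling interrupts; the names here are hypothetical):

#include <pthread.h>
#include <stdint.h>

static uint32_t shared_balance = 100;
static pthread_mutex_t balance_lock = PTHREAD_MUTEX_INITIALIZER;

/* A check-then-act sequence on shared data must be atomic as a whole.
 * Holding the lock across both the check and the update prevents
 * another context from swapping the data in between, which is exactly
 * the window a race-condition exploit targets. */
int withdraw(uint32_t amount)
{
    int ok = -1;
    pthread_mutex_lock(&balance_lock);
    if (shared_balance >= amount) {   /* check ... */
        shared_balance -= amount;     /* ... then act, with no gap */
        ok = 0;
    }
    pthread_mutex_unlock(&balance_lock);
    return ok;
}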

Take care

Designing a secure system is very difficult, and it takes a great deal of forethought to ensure that security pervades every aspect of the design. Rigorous testing is needed to ensure that the system is not vulnerable to the more commonly known exploits that plague systems today. While testing for these vulnerabilities is daunting, an engineer can take heart that commercially available tools exist that statically analyze a system’s code for such vulnerabilities. However, the best way to secure a system is to know how a hacker thinks and thereby anticipate the ways in which the system will be attacked. This knowledge seeds the developer’s imagination with ways to thwart the hacker’s efforts. Security is not accidental; it comes from rigorous planning.

Using one of the many available software design tools, such as IAR Systems’ Embedded Workbench, can help a developer achieve the security they need. Embedded Workbench allows a developer to take advantage of peripherals on the hardware and integrates with many static analysis tools to help root out as many security holes as possible. It can also be used in conjunction with testing tools to map out test cases for the system. With the right tools, careful analysis of the system design, and rigorous testing, a developer can be confident that their system is as secure as possible. ■
