AspenCore Media has taken a deep dive into the question of Where Security Meets Privacy in the 21st Century. Included in this Special Project are: Sitting at the Crossroads of Cybersecurity and Privacy, Designing Hardware for Data Privacy, and Facial Recognition: The Ugly Truth.
By Gina Roos, editor-in-chief
The surveillance economy and the right to digital privacy are becoming ever more hotly debated as tech giants like Amazon, Facebook, and Google, which continue to collect, store, and sell personal data from millions of people globally, become the face of these privacy issues. But privacy is not just a concern for digital service and software providers; OEM designers of IoT devices, and of any connected device, need to understand how data is collected, stored, shared, and used in order to protect personal information. These connected devices and subsystems span industries including medical, consumer electronics, and automotive. Add artificial intelligence (AI) and machine learning to the mix, and privacy is taken to a whole new level.
The U.S. has no comprehensive privacy regulation at the national level; California, however, is leading the charge for stringent laws at the state level. So why should U.S. engineers and designers care? Because we live in a global economy. In the wake of the European Union’s General Data Protection Regulation (GDPR), which calls for privacy by design and privacy by default (neither a new concept), OEMs need to be compliant if they want to sell end products and digital services into the EU. In addition, other countries such as China and Russia, though more focused on state security, have made similar moves toward protecting their citizens’ data.
What the GDPR strives to do is harmonize data privacy laws across Europe to protect all EU citizens’ data privacy. That means a big change in how companies approach data privacy, especially in the U.S., where many companies evolved in the age of the internet. Collecting, storing, and selling data became their sole revenue stream without their having to think about privacy issues, and now, mounting societal concern about data privacy is creating greater tension between the rights of individuals and the interests of digital companies.
It’s like trying to put the genie back into the bottle and realizing it’s too late, so regulators and industry organizations like ECIA, IEEE, and NIST need to step in.
Privacy needs to be part of the designer’s toolbox, said Robin Gray, COO and General Counsel for ECIA. “There was an EU regulation that pre-dated the GDPR that, in a stronger sense, mandated privacy by design. But GDPR opened the door to privacy by default. They recognized that there were so many systems out there now that were leaving a big door open for privacy invasion.
“In Europe, they already had privacy by design. Although it wasn’t really well-known or well-practiced, it really was the impetus of the GDPR, which gives it some legal heft. And IoT is an excellent example, especially early on. No thought was given to privacy by design. No one recognized the threat it could be, and now, GDPR has gone a long way toward bringing global recognition to the importance of protecting data privacy.
“The preferred method is privacy by design — any time you are getting ready to process personal data, you need to design it in at the beginning stages, where it’s an integral part of the product.”
Privacy by design and privacy by default
“Privacy by design refers to processing of personal data with data protection and privacy in mind in every step of product development,” explained Judith Myerson, owner of the consultancy Judith M. Myerson. “It prioritizes privacy and data integrity in the initial design stages and throughout the development life cycle of new products and services.”
According to Deloitte, the seven core principles of privacy by design are:
- Proactive, not reactive; preventative, not remedial
- Privacy as the default setting
- Privacy embedded into design
- Full functionality: positive-sum, not zero-sum
- End-to-end security: full life-cycle protection
- Visibility and transparency
- Respect for user privacy: keep it user-centric
“Privacy by default means that once a product or service has been released to the public, the strictest privacy settings should apply by default without any manual input from the end user,” Myerson said. “It also means that any data or information provided by the user to enable a feature of the product should only be kept for the minimal amount of time needed to make the product or service function properly.
“Addressing privacy issues after a product has been implemented has a negative impact on end users,” she added. “Privacy by design documentation must be made available to a European regulatory authority in the event of a data breach or a consumer complaint — whether the developer/designer resides in Europe or is located out of Europe and has European customers.”
Myerson said that key things designers need to consider include the following (a brief sketch after the list illustrates the data-minimization and retention points):
- What personal data needs to be collected
- How to minimize the amount of collection
- What data-retention policies are
- What contracts and agreements with European customers say about privacy regulations
- How to educate end users on GDPR
- How machine learning can be applied to data privacy
- How long the data should be retained
- How privacy issues should be documented
- How to protect privacy data in a spreadsheet
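To make the data-minimization and retention points above more concrete, here is a minimal sketch, in Python, of how a connected product’s backend might collect only the fields a feature needs and purge records once a retention window expires. The field names, the 30-day window, and the purge routine are hypothetical and chosen for illustration; they are not drawn from GDPR or from Myerson’s recommendations.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical example: collect only the fields a feature actually needs,
# note when they were collected, and purge them after a fixed retention
# window (data minimization plus storage limitation).

RETENTION = timedelta(days=30)  # assumed policy; set per your legal/contract review

@dataclass
class UsageRecord:
    device_id: str          # pseudonymous device ID, not the user's name or email
    feature: str            # which feature needed the data (purpose limitation)
    collected_at: datetime  # needed to enforce the retention window
    # Deliberately no name, address, or precise location: if a field is not
    # required for the feature to work, it is never collected at all.

def purge_expired(records: list[UsageRecord],
                  now: datetime | None = None) -> list[UsageRecord]:
    """Drop records older than the retention window."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r.collected_at <= RETENTION]

# A record collected 45 days ago is removed on the next purge pass.
stale = UsageRecord("dev-123", "energy-report",
                    datetime.now(timezone.utc) - timedelta(days=45))
fresh = UsageRecord("dev-456", "energy-report", datetime.now(timezone.utc))
assert purge_expired([stale, fresh]) == [fresh]
```

The same idea applies regardless of language or platform: decide the purpose first, collect only what that purpose requires, and make deletion an automatic, scheduled behavior rather than a manual cleanup.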
“One of the simple rules is, if there is an on and off button, privacy by design says leave it off and allow the individual to turn it on,” said Katryna Dow, founder and CEO, MEECO; co-chair for the Personal Data and Privacy Committee, IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems; and chair for P7006 — Standard for Personal Data Artificial Intelligence Agent. She is also a founding member of The Council of Extended Intelligence (CXI), a joint initiative between the IEEE Standards Association and MIT Media Lab.
IEEE said that the P7006 standard aims to educate government and industry on why it is best to put mechanisms in place that enable the design of systems able to mitigate the ethical concerns that arise when AI systems can organize and share personal information on their own.
“Our current paradigm is that everything is on,” said Dow. “It’s not obvious that it’s on; you’re not often aware that it is on; you don’t know how to turn it off; you’re not aware that you have the right to turn it off; and if you try to turn it off, things are designed in such a way that it is so difficult that you will give up trying.
“What privacy by design is trying to do is have the designers think about the fact that somebody may prefer things to be off rather than on and untracked rather than tracked, and the option to turn that on should be easy and seamless.”
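As a rough illustration of Dow’s “leave it off” rule, the short sketch below defaults every optional data-sharing feature to off and enables one only after an explicit user action. The setting names are hypothetical and not tied to any particular product; the point is the default state, which is what privacy by default asks for.

```python
from dataclasses import dataclass, field

# Hypothetical privacy-by-default settings object: every optional data-sharing
# switch starts out off, and turning one on requires an explicit user action.

@dataclass
class PrivacySettings:
    share_usage_analytics: bool = False   # off until the user opts in
    share_location: bool = False          # off until the user opts in
    personalized_ads: bool = False        # off until the user opts in
    consent_log: list[str] = field(default_factory=list)

    def opt_in(self, setting: str) -> None:
        """Enable a feature only as the result of an explicit user choice."""
        if setting == "consent_log" or not hasattr(self, setting):
            raise ValueError(f"unknown setting: {setting}")
        setattr(self, setting, True)
        self.consent_log.append(f"user enabled {setting}")  # keep evidence of consent

settings = PrivacySettings()       # fresh install: everything is off by default
settings.opt_in("share_location")  # only this one is now on
assert settings.share_usage_analytics is False
assert settings.share_location is True
```

The design choice is that silence never counts as consent: the object starts in its most private state, and every change away from that state is recorded.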
Dow compares privacy issues in the digital world to the Wild West, where, for a period of time, there were no rules and, as communities grew, they started to moderate behavior. “Privacy by design is simply the way communities are starting to moderate behavior and starting to act in the best interest of individuals and protect them ideally from the design perspective.”
U.S. vs. EU
Dow said that the reason this is so important for designers now is that there is a big difference in cultural perspective between Europe and the United States. “In Europe, privacy is seen as a human right, and human rights are defended.”
“More and more, everything we do digitally creates a digital twin, so there is our physical self and our digital self,” explained Dow. The EU is concerned about the long-term impact on privacy and human rights without the means to say we’re okay with the way our digital twins are being tracked, she said.
“In the U.S., the views in terms of human rights or privacy are often overshadowed by some of your amendments around freedom of speech or right to know,” Dow continued. “Those things haven’t been reconciled in a way where you can be free to say something and also free to be private.
“What we’re starting to see in states like California is that you shouldn’t have to choose between one or the other. They decided to bring in regulations to increase privacy rights. They are also looking to ban facial recognition in public places. One of the challenges in the U.S. is that you might end up with as many privacy laws on collecting and processing data as you have states, which can make it very complex.
“Something like privacy by design is a way to harmonize that. If you have this as an ethical approach in the way you design a digital service, then you are probably compliant with GDPR. And you will have better protection for all the changing regulations coming about. If American companies are providing multi-national services into Europe or other parts of the world, they may find themselves responsible for privacy by design principles even if they are an American company.”
“Ultimately, you need a national standard rather than state by state because it really will impede internet commerce,” said ECIA’s Gray. “In the U.S., the debate is over who owns the data — the company that gathers that info, whether it has permission or not, or whether you own the personal data about yourself.
“The internet [evolved] in this country based on gathering and selling that data without your permission, so a lot of enterprises and the digital economy are built on it, and there is going to be an interesting dialogue as to how the tension between the protection of privacy and commerce is going to work out,” said Gray.
Privacy and security
The key U.S. and global privacy and security regulations and frameworks that designers need to understand include SOX, the NIST and ISO frameworks, and GDPR, said Myerson. But she pointed out that GDPR provides very little guidance for companies on reviewing their existing security measures and information-security frameworks.
“According to European data protection law, personal data is any information that could be used to identify the individual,” said Myerson. “This includes name, address, email address, IP address, banking or other personal information, medical information, photographs, social networking posts — any data that can be used to identify a person. To avoid collecting unnecessary data, all contracts and agreements should be reviewed to determine what personal data is required.
“Privacy needs to be part of the early design process/product development cycle and in every step of the cycle,” she said. “Addressing privacy needs after a product has been implemented and deployed has a negative impact on end users and developers. It’s costlier to fix the problem in later steps than in the early steps of the cycle.”
Myerson recommends several privacy-related questions that designers should ask themselves early in the design process (a brief sketch after the list illustrates the storage and pseudonymization points). They should be thinking about:
- How data is stored
- How data is secured
- How long the data should be retained (note that the GDPR storage-limitation principle in Article 5(1)(e) doesn’t set fixed retention due dates the way SOX does)
- What the access policies of an organization are
- What data collected needs to be purged
- How data requirements differ from one privacy law or regulation to another
- What the privacy issues of data in a spreadsheet are
- What the documentation policies are and how privacy issues should be documented when the designer is located within or outside Europe
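One way to act on the storage and security questions above is to pseudonymize direct identifiers before they are written to storage. The sketch below uses a keyed hash (HMAC-SHA-256) to replace an email address with a stable pseudonym; the key handling and field names are assumptions for illustration, not something mandated by GDPR or recommended by Myerson.

```python
import hashlib
import hmac

# Hypothetical pseudonymization step: replace a direct identifier (here an
# email address) with a keyed hash before it is stored or exported, so the
# stored value alone no longer identifies the person. The key must be kept
# separate from the data store and rotated per the organization's policy.

SECRET_KEY = b"replace-with-a-key-from-a-secure-vault"  # assumption: managed elsewhere

def pseudonymize(identifier: str) -> str:
    """Return a stable, keyed pseudonym for a personal identifier."""
    normalized = identifier.strip().lower().encode("utf-8")
    return hmac.new(SECRET_KEY, normalized, hashlib.sha256).hexdigest()

record = {
    "user": pseudonymize("jane.doe@example.com"),  # stored pseudonym, not the email
    "meter_reading_kwh": 42.7,
}
# The same input always maps to the same pseudonym, so joins and analytics
# still work, but reversing it requires the separately held key.
assert record["user"] == pseudonymize(" Jane.Doe@example.com ")
```

Note that under GDPR, pseudonymized data is still personal data; techniques like this reduce risk and support data minimization, but they do not take the data out of scope.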
To get ready for the GDPR, Myerson recommends that developers and engineers read the article on NIST security controls. The article suggests the following publications:
- NISTIR 8062 “An Introduction to Privacy Engineering and Risk Management in Federal Systems”
- FIPS PUB 200 “Minimum Security Requirements for Federal Information and Information Systems”
- Framework for Improving Critical Infrastructure Cybersecurity
- NIST SP 800-53 “Security and Privacy Controls for Federal Information Systems and Organizations”
She said that the first is useful in implementing privacy by design and by default and can be applied to non-federal information systems. The next two provide a general structure for evaluating your information-security framework; the third, in particular, traces back to Executive Order 13636, which requires that the framework include a methodology to protect individual privacy and civil liberties when critical infrastructure organizations conduct cybersecurity activities, in both federal and non-federal organizations. The last focuses on more concrete solutions to meet the security objectives.
She also recommends that engineers read the ISO standards referenced in the NIST publications and look at how SOX is applied, even though SOX is not targeted at engineers. Engineers should also understand how various privacy regulations — U.S. and global — apply to machine-learning applications of big data.
Currently, NIST is seeking comments, due by Aug. 5, 2019, on a draft whitepaper, “Mitigating the Risk of Software Vulnerabilities by Adopting a Secure Software Development Framework (SSDF),” which recommends a core set of high-level secure software development practices to be added to each software development life cycle (SDLC) implementation.
IEEE’s P7000 series of standards in development includes 13 standards focused on the ethical development, processes, and governance of intelligent and autonomous technologies, many of them related to privacy. They include IEEE P7002, the Standards Project for Data Privacy Process, which specifies how to manage privacy issues for systems or software that collect personal data. IEEE said that the standard will help designers by providing ways to identify and measure privacy controls in their systems using privacy impact assessments.
Data privacy has an impact across software, firmware, and hardware. “When you think of devices like Nest or Alexa devices, or any of those [connected] home systems, they are all hardware,” said Dow. “Privacy by design is as important in the way that the hardware is designed as in how the operating system works and how the software runs and collects data.
“At the same time, security by design is emerging — how you can make this hardware or IoT device as secure as possible, while privacy by design aims to make sure those IoT devices are set up in a way to limit the harm of how that data is streamed.”
Dow said that there is increasing concern around whether a focus only on software and services is enough. “At first, privacy by design was focused on software and digital services. But increasingly, some of the work being done by IEEE through the P7000 series, together with the Council on Extended Intelligence and MIT, is now starting to find that this might need to be a design principle at the hardware level and at the chip level.”
IEEE is leading the way with a number of initiatives to explore where that privacy and control should be and how deeply embedded it should be in the design process.
But the work is challenging and groundbreaking, particularly with different laws in different parts of the world and so many data and privacy issues, said Dow. “Trying to come up with a harmonized approach that is able to be implemented on a global basis is incredibly challenging.
“IEEE is questioning a lot of the practices that have come out of the last 20 to 30 years of tech development, software services, and digital mobile and saying that just because it’s technically possible doesn’t mean it’s ethically right. Part of this ethical debate is that things have moved very quickly and have unintended consequences, and maybe now is the right time to start questioning whether those things are ethically right to continue as they are.”
There is still lots of work to be done around how deep in the supply and design chain privacy by design applies, and the U.S. electronics industry needs to take a bigger role in developing these regulations.
Privacy Versus Security:
These two notions have never been mutually exclusive, but today's technological developments have been increasing the tension between them.
Check out all the stories inside this Privacy and Security Special Project
Where Security Meets Privacy in the 21st Century
Since time immemorial, humans have been concerned with the subjects of security and privacy, but the convergence of many of today's technologies — especially the internet and the Internet of Things (IoT) — means that the stakes have never been higher.
Designing hardware for data privacy
Ensuring privacy of electronic data requires data security, but a secure design does not necessarily assure data privacy. Developers must consider the two together.
Sitting at the Crossroads of Cybersecurity and Privacy
The combination of headline-worthy data breaches and new privacy legislation has put data protection and privacy at the top of the agenda for electronics OEMs.
Facial Recognition: The Ugly Truth
AI is making automated facial recognition for mass surveillance a reality — but at what cost?
High-Tech Distributors Grapple with Security and Privacy in the Digital Age
In the midst of the digital revolution, the stakes for electronics distributors trying to safeguard the privacy and security of customers are constantly on the rise.
Enhancing privacy and security in the smart meter lifecycle
Concerns about security and privacy of connected devices coalesce in the lifecycle of smart meters. Here's how IoT platforms help protect smart meters and their data despite an ever-growing number of threats.
Chip Security Emerges a Hot Topic in the Supply Chain
As more electronic devices are connected and hence hackable, OEMs must bring good security practices, designs, and devices into their products as soon as possible.
Also check out these related columns
The Illusion of Security
This mini-series of articles explains how today's cyber security is like a bucket with hundreds of holes, and each software solution is a patch to a single hole. We don't need more patches; we need a new bucket!
Privacy Issues with Voice Interfaces
Voice interfaces are only going to get more common, and there is a great market opportunity for those vendors that get their product and its approach to privacy correct.
Security in Semiconductor Manufacturing
Today's manufacturing lines are increasingly prone to IP theft and reverse engineering attacks. Savvy chipmakers know to institute secure systems to guard against them.
Will the Real Root of Trust Stand Up?
Not all roots of trust are created equal, nor are they all implemented in the same fashion on silicon.
How Many Layers of Security Do You Have?
Depth of defense and principle of least privilege are two concepts system and SoC designers must embrace as they seek security answers for their designs.
Multiply and Isolate Your Roots of Trust for Greater Security (Part 1)
Security designs can have multiple entities, as well as isolation, among separate applications on a chip.
Multiply and Isolate Your Roots of Trust for Greater Security (Part 2)
To have confidence, you want assurances that all applications in your secure silicon IP are isolated from each other.