Time for more robust IoT oversight

Infosec and privacy concerns are going unaddressed in the internet of things ecosystem, and some government agency has to take the lead.

My empty-nester parents just downsized from my childhood home to something more manageable for two people. Now that they're getting older, I worry more about them and recommended that they install a security system as part of their recent home purchase. This system, I thought, would give me peace of mind and also alert authorities in the event of an emergency.

But what they found during their search for home security systems didn't put my mind at ease.

They kept running across Internet of Things-enabled options, which provide remote monitoring of their home and the ability to provide one-time access codes to house- or pet-sitters -- all through the convenience of their mobile phones. They were intrigued by these options, but I had immediate concerns that such systems introduced more harm than good.

My primary questions included: How are in-app communications secured? Who has access to this data by default? Where does the information reside? What authentication mechanisms are used for remote system access? And most importantly, who is watching out for consumers in this IoT era?
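One established answer to the one-time access code question is the HMAC-based one-time password scheme standardized in RFC 4226, which many consumer systems build on. The sketch below is illustrative only -- it shows how such codes can be generated in principle, not how any particular home security vendor implements them.

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Generate an HMAC-based one-time password (RFC 4226).

    The server and the guest's app share the secret; each code is
    valid for a single counter value, so an intercepted code cannot
    be replayed later.
    """
    # HMAC-SHA1 over the counter encoded as an 8-byte big-endian integer
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: the low 4 bits of the last byte pick an offset
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 test vector: ASCII secret "12345678901234567890", counter 0
print(hotp(b"12345678901234567890", 0))  # → 755224
```

Because each code expires after one use, a house-sitter's code is useless to an eavesdropper -- but only if the surrounding system (key storage, transport encryption) is also designed securely, which is precisely the question no regulator currently examines.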

This last question was somewhat answered just days ago when the U.S. Consumer Product Safety Commission announced it would hold a May 16 hearing on the potential safety issues and hazards associated with internet-connected consumer products. But there's a catch. The CPSC makes it very clear that personal data and privacy concerns are not within scope for this hearing as the commission does not have the legal authority to oversee such hazards.

According to the commission, product safety challenges of IoT products fall into two categories:

  • Prevention or elimination of hazardous conditions designed into products intentionally or without sufficient consideration (e.g., high-risk remote operation or network-enabled control of products or product features).
  • Preventing and addressing incidents of hazardization, which it defines as "the situation created when a product that was safe when obtained by a consumer but which, when connected to a network, becomes hazardous through malicious, incorrect, or careless changes to operational code."

In other words, the commission is primarily concerned with physical safety hazards that could result from use of IoT products. This might include "remote operation hazards," where, for example, unintended remote activation of heating elements on a stovetop could pose fire or burn hazards. Or, using the home security scenario, the commission wants to address potential loss of safety function. For example, an integrated home security system could fail to download a software update properly, leaving the system deactivated -- including disabled smoke alarms -- without the consumer's knowledge.

But these are not the only user risks that could result from an IoT product. What if that same IoT-enabled home security system were hacked through a fundamental flaw in its security design? In this scenario, consumers could be spied on -- a clear violation of their personal privacy and security. Even more alarming, a hacker could determine from intercepted in-app communications when residents are away from home and gain unauthorized access. Unfortunately, it's clear that the prevention of these latter scenarios will not be addressed at the upcoming hearing.

This is problematic because, as I've said before, the market alone will not solve the information security design concerns related to consumer technology. Companies have little incentive to spend precious resources on information security prior to product launch, and the average consumer is not tech-savvy enough to question how a product makes their digital communications more or less safe. This approach will not change without intervention by outside forces. We need a governance system -- to include enforcement, incentives and penalties -- to ensure effective implementation of stronger security design practices.

I applaud the CPSC for getting smart on the consumer risks associated with IoT. This is an encouraging step in the right direction, but if the commission does not have the legal authority to protect consumers from IoT-generated personal data and privacy risks, then who does?

The Federal Trade Commission is emerging as the data breach cop -- taking a full-throated approach to punishing companies for lax information security practices -- but the FTC does not regulate the fundamental information security design of consumer technology products. The Food and Drug Administration does valuable work in this space, but its scope is limited to medical devices.

I would like to repeat my call for Congress to charge a new or existing federal agency with governing, incentivizing and enforcing security design standards for technology products. The proposed organization -- call it the Consumer Technology Security Commission -- would be responsible for:

  • Coordinating the development of security design standards and partnering with Congress to mandate relevant standards;
  • Building an accreditation and certification program; and
  • Enforcing quality through regular testing by third-party assessors and conducting recalls, when appropriate.

Regardless of the solution, it is inevitable that the government will play a stronger role in addressing the risks IoT poses to consumers. Without proper protections, we may be entering an era where the very systems we rely on for personal protection are aiding malicious actors without our knowledge.