U.K. threatens to force IoT security by design

Securing the Internet of Things (IoT) seems like an endless reality version of “Mission Impossible”—really impossible. Many have tried—with lists of best practices and standards, exhortations, and warnings—but none has succeeded.

Still, the U.K. government, in a policy paper titled Secure by Design released earlier this month, says it is also going to try, with a 13-point Code of Practice (see sidebar) that it will force all IoT stakeholders to follow if they don’t do it voluntarily.

While the paper is a draft—open for comments until April 18—the opening reviews are mixed. Georgi Boiko, senior consultant in the Synopsys Software Integrity Group (SIG), said the Code of Practice includes some basic security hygiene, some “genuinely solid technical advice,” and an acknowledgment of unsolved problems in the industry.

But he said it also contains obvious “attempts by corporations on the advisory group [that helped develop the draft] to get their products and services on the list of government requirements.”

“It’s a good starting point that should be taken with a grain of salt,” he said.

There is no debate, however, that something—almost anything—would be better than nothing.

The paper, published by the U.K. Department for Digital, Culture, Media, and Sport, doesn’t claim it will make IoT bulletproof. But it does say it will make it a lot better, which wouldn’t take much in an industry where in most cases, security isn’t just an afterthought—it’s not even a thought at all.

IoT, which many now call IoE—the Internet of Everything—remains rampantly insecure. Every week, several times a week, come reports about hacks of everything from cameras to fitness trackers to “smart” home devices, TVs, and more, for purposes ranging from identity theft to ransomware to cryptomining to conscription for botnets to launch DDoS attacks.

Some critics are already complaining that while the measures proposed have some merit, they are also toothless, since they don’t carry the force of law.

Ken Munro of the U.K. firm Pen Test Partners told the Security Ledger that the proposals in the Code of Practice are nothing new.

And he said nothing will happen until recommendations become requirements with penalties for failure to comply.

“We have loads of standards,” he said. “What we don’t have is enforcement.”

True, at the moment. But once the comment period is complete and the policy is finalized, the department promises—in writing—to become much more aggressive.

Margot James, minister for digital and the creative industries, noted in the executive summary that if those in the IoT space don’t comply voluntarily, they will be forced to do so.

As the summary put it, “the Government’s preference would be for the market to solve this problem…. But if this does not happen, and quickly, then we will look to make these guidelines compulsory through law.”

If that happens, it would be beyond anything even in the works in the United States. While there are a couple of bills now pending in Congress—one titled the Internet of Things Cybersecurity Improvement Act of 2017 and another called the Cyber Shield Act of 2017, filed last October—that call for incentives to improve IoT security, neither would make its recommendations or standards mandatory. And neither bill has even made it out of committee so far.

Experts say, however, that even if the Code of Practice becomes law, it will not necessarily transform the industry quickly.

One problem is that it applies to what the government calls IoT stakeholders—device manufacturers, service providers, mobile application developers, and retailers—but end users aren’t included in that group.

The government says its goal is to “move away from placing the burden on consumers to securely configure their devices, and instead ensure that strong security is built in by design.”

But any attempt to eliminate the burden on consumers is itself a flaw, according to Sammy Migues, senior member, technical, with Synopsys SIG.

“The customer isn’t listed as a stakeholder,” he said. “The other stakeholders can’t take care of everything. Once the device is in the consumer’s hands, many—if not most or all—bets are off.”

Migues said there is plenty that could be added to the list:

  • Should each IoT device have its own definition of “fail secure”? For a plane, that would be “stay up,” but for an elevator, it’s “slowly go down.”
  • Should every IoT device have a “reset to factory settings” physical button so end users can regain control of the things they bought and own?
  • Should IoT devices have hidden law enforcement back doors? (In other words, will I know when law enforcement is watching me on my internal home video because they served ADT a warrant?)

Boiko said while the list is not perfect, if every point could be “implemented throughout the IoT space, it would be a drastic improvement.”

But then, implementation will have its own complications. “I don’t see it being implemented throughout the IoT space within 2018, which seems to be the expectation of the authors,” Boiko said. He cited several of what he said are many reasons:

  • The requirements to protect personal data and allow deletion of it are among the core requirements of the General Data Protection Regulation (GDPR), due to take effect in May. “GDPR will be the trial run that will show whether or not legislation can solve this extremely complex information-era problem,” he said.
  • The need for secure updates and the principle of least privilege “have been known for decades,” he said. “But this won’t magically get solved in 2018 by vendors.”
  • Many small businesses and startups, while they can use the guidance of the code, “won’t reorganize their businesses overnight to put security on top,” he said.
  • And finally: “It took the government more than a year to react to Mirai [botnet] and WannaCry [ransomware], resulting in a list of generic and already well-known points. We cannot expect the industry to interpret, figure out the details, and implement these recommendations in less than that amount of time.”

Migues said this effort, like any other to secure IoT, may have the best of intentions, but it is going to collide with reality.

“Within 10 years, these billions of devices will collectively know everything about every human living in the industrialized world, and we’ll have no idea where all the data resides or who’s doing what with it,” he said.

“GDPR allowing me to control my personal data will have zero impact on what manufacturers, advertisers, law enforcement, and everyone else knows about me, because they will have already turned it into metadata that isn’t directly covered by GDPR.”

Beyond that, he said there is nothing specific in the Secure by Design paper that mandates “some kind of software security initiative that can demonstrably produce reasonably secure code.

“This is all about features, and security software does not equal software security,” he said.


The specifics of the Code of Practice are listed in what the U.K. government says is their order of importance.

  1. No default passwords. They must be “unique and not resettable to any universal factory default value.”
  2. Implement a vulnerability disclosure policy. Provide a “public point of contact” for security researchers to report bugs, flaws, or any other software defects that would make devices or services vulnerable to hackers.
  3. Keep software updated. It must be possible to update connected devices securely, and an update should not have a negative impact on the functioning of a device. State clearly to users the “minimum length of time for which a device will receive software updates and the reasons why.”
  4. Securely store credentials and security-sensitive data. That means no hard-coded credentials in device software.
  5. Communicate securely. Encrypt security-sensitive data, including remote management and control traffic, and manage all keys securely.
  6. Minimize exposed attack surfaces through the principle of least privilege—close unused ports, make unused services unavailable, and minimize code to what is necessary for the service to operate.
  7. Ensure software integrity through secure boot mechanisms.
  8. Ensure that personal data is protected. Provide consumers with clear information about how their data is being used, by whom (including third parties), and for what purposes. Consumers who have given consent for their data to be used should be allowed to withdraw their consent at any time.
  9. Make systems resilient to outages, including power failures.
  10. Monitor system telemetry data for security anomalies.
  11. Make it easy for consumers to delete personal data.
  12. Make installation and maintenance of devices easy.
  13. Validate input data to ensure that systems are not easily subverted by incorrectly formatted data or code.
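Several of those points translate directly into code. As one hypothetical sketch, not drawn from the Code of Practice itself, here is how point 13’s input validation might look for an imagined smart thermostat that accepts JSON commands; every name and value here is an illustrative assumption:

```python
import json

# Hypothetical whitelist for an imagined smart thermostat's API.
ALLOWED_COMMANDS = {"set_temp", "get_temp", "reboot"}
TEMP_RANGE = (5.0, 35.0)  # plausible range in degrees Celsius

def validate_command(raw: bytes) -> dict:
    """Reject anything that is not well-formed, expected, and in range
    (Code of Practice point 13: validate input data)."""
    try:
        msg = json.loads(raw)
    except ValueError:
        # Malformed or non-JSON input is rejected outright.
        raise ValueError("malformed input rejected")
    if not isinstance(msg, dict) or msg.get("cmd") not in ALLOWED_COMMANDS:
        # Unknown commands never reach device logic.
        raise ValueError("unknown command rejected")
    if msg["cmd"] == "set_temp":
        temp = msg.get("value")
        if not isinstance(temp, (int, float)) or not TEMP_RANGE[0] <= temp <= TEMP_RANGE[1]:
            raise ValueError("temperature out of range")
    return msg

# A well-formed request passes; malformed or unexpected data does not.
validate_command(b'{"cmd": "set_temp", "value": 21.5}')
```

The design choice is a whitelist: the device accepts only the exact commands and value ranges it knows, rather than trying to enumerate malicious inputs.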


Posted by Taylor Armerding

Taylor Armerding is an award-winning journalist who left the declining field of mainstream newspapers in 2011 to write in the explosively expanding field of information security. He has previously written for CSO Online and the Sophos blog Naked Security. When he’s not writing he hikes, bikes, golfs, and plays bluegrass music.
