Posted by Jim Ivers on March 15, 2017
Originally posted on SecurityWeek.
The prediction business is a tricky thing. You can be right, but until you are proven right, you're either early or wrong. Being early feels just like being wrong, right up until the moment you are proven right.
When toymaker VTech announced in November 2015 that nearly five million customer records had been leaked (including pictures of and data about children), I predicted that the breach would be a tipping point for security and privacy issues with connected toys. My thesis was based on the notion that nothing stirs the emotions faster than concerns over the privacy and safety of children.
My prediction didn’t get any traction. Just as I was beginning to embrace the notion that I was wrong, a string of recent events may prove that I was just early.
In mid-February, it was reported that Germany’s Federal Network Agency issued a warning to parents about the “My Friend Cayla” doll. The agency, which oversees telecommunications in Germany, advised parents to destroy the doll because it collects and transmits conversations with children.
The data in the conversations was being parsed by speech recognition software that can turn dialogue into searchable queries. While the agency based its warning on the doll being a "concealed transmitting device" that ran afoul of the law, there was also much concern over regulations protecting the privacy and security of children. Agencies from multiple countries, including the United States' FTC, expressed concerns over these privacy and security issues.
In early March, it was reported that toymaker Spiral Toys had been hacked, exposing data from over 800,000 users. The data contained personalized voice messages, pictures, and other information collected via Internet-connected teddy bears and their associated smartphone apps. Researchers reported that the data was stored in a database that was unprotected and not behind a firewall. The same researchers believe the data was held for ransom before being exposed by multiple sources.
As a parent, I find these breaches of privacy and security reprehensible. As someone in the software security space, I find these breaches to be inevitable, yet easily preventable. As a citizen of the world, I view this as a continued warning about the dangers of IoT and connected everything.
As I have said repeatedly, the term “connected device” should immediately provoke questions such as “to what?”, “for what purpose?”, and “with what level of protection for the data?”
I do not believe that there is malicious intent on the part of the toy manufacturers. They are looking for an angle to sell toys, and IoT and connected devices are hot topics. They are also financially motivated to hold down production costs for profitability. Having a connected toy adds new cost items such as building the associated app and building the infrastructure (including data storage) to store the collected data. All their key business drivers (e.g., time to market and profitability) are diametrically opposed to notions of building security into the process.
Take note that this is not a set of issues unique to connected toys. Multiple stories came out in February on the analysis of the end user license agreements for smart televisions. Manufacturers are now warning us not to discuss sensitive subjects in front of our televisions as the conversation will be recorded and stored! This includes the voices of children in our homes.
On a personal note, I am in the market for a new home thermostat. Buying a smart thermostat gives me pause because I know such devices listen constantly, just like a smart TV or an Alexa. Fortunately, I also know that several manufacturers of smart thermostats are taking the security issues seriously.