Connected toys, a hot holiday seller, aren’t all fun and games. What questions should buyers be asking about the security and privacy of connected toys?
Once again, while you won’t hear it blaring from any shopping mall loudspeakers (and with apologies to the late crooner Andy Williams and songwriters Eddie Pola and George Wyle), the risks still lurking in many connected toys are enough to make yet another holiday season “the mooost dangerous tiiiime of the year.”
That is not the Grinch singing. It’s the compassionate, concerned Federal Trade Commission’s (FTC) Division of Consumer and Business Education, which, to upend another popular holiday slogan about being jolly, notes the reality that ’tis the season to be wary.
The agency issued its annual warning earlier this month to shoppers looking to give their children a “cuddly cool internet-connected smart toy to help them learn.”
But many connected toys come with both privacy and safety risks. So the FTC offered a “handy list of questions” that all parents and guardians ought to ask before buying a connected toy:
All these are good questions, but they leave at least a couple of gaping holes in ensuring that such toys are safe and secure—which ought to be especially important since they are “for the children.”
First, they imply that mainstream (as in, generally not tech-savvy) consumers are responsible for vetting the security and privacy capabilities of toys. These toys are among the billions of devices that comprise the Internet of Things (IoT)—otherwise known as the biggest “attack surface” in the world for hackers and cyber criminals. It’s a bit like implying that the consumer, not the manufacturer, is responsible for ensuring that vehicles and major appliances are safe to use before buying them.
The NCC Group, a cyber security and risk mitigation firm, tested a number of connected toys on the market this year for the consumer organization Which?. Its findings suggest that consumers are at least as responsible as vendors for knowing what safety and security provisions are, or are not, built into those devices.
“While the onus should never fully lie with parents or guardians, checking that the product literature has sufficient reference to security and privacy before purchasing should be the first step,” the company said in a report on its findings.
Second, while the questions cover the privacy component fairly well, security looks to be almost an afterthought for the FTC. It is the last one on the list and simply says: “Is the information secure?”
How is a consumer supposed to divine that? Especially when the “product literature” is likely to be the same kind of dense legalese as “terms of service” agreements? Most users click “agree” without reading a word. They bought the app or device, they want to use it now, not an hour later, and they know that if they “disagree,” they won’t be able to use it at all.
But even if they do read it, the answer to the “is the information secure?” question is this: Not so much.
According to NCC, which looked at seven popular connected toys, “The idea behind a connected toy is good—electronic and IT capabilities are used to take the concept of a toy away from an inanimate object, and to give it some capability to interact with a child.”
But the top priorities of manufacturers looking to create that interactive capability are “cost of production and time to market.” That leads to “the low levels of security commonly observed in connected toys.”
How low? Low as in not even doing the basics.
Among other weaknesses, NCC found:
Plaintext log-ins and authenticated sessions. Some of the toy websites had “no encryption on account creation and account logins,” meaning usernames, passwords, and all associated account and session information were “open to interception.”
Username and email address enumeration. “When creating new accounts, or using the ‘forgotten password’ function, the websites commonly returned messages that would indicate whether a given username or email address was already registered.”
Weak password policies. Actually, beyond weak: NCC found that none of the websites supporting the devices enforced a password policy. “On all of them, it was possible to set a password of ‘password,’ which is incredibly weak and highly guessable. This lack of a password policy, coupled with the username and email address enumeration issue, could be used by attackers to successfully brute-force valid username and password pairs to gain unauthorized access to user accounts,” the company said.
Creation of new accounts. And, of course, most of the toys “either required or suggested the creation of an online account,” so the kids can “download new capabilities, or to share aspects or experiences with the toy in online forums with other children.”
NCC “presumed” (probably correctly) that the toy manufacturers could collect that user data “for data analysis and marketing purposes.”
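The enumeration and password-policy findings above have well-understood fixes on the vendor side. As a rough illustration only (this is not NCC’s code, and all names here are hypothetical), a minimal Python sketch of an account-creation handler that enforces a basic password policy and returns a uniform response so an attacker cannot tell whether an email address is already registered:

```python
import re

# Hypothetical deny-list of common passwords; real services use much larger lists.
COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein"}

def password_ok(pw: str) -> bool:
    """Minimal policy: minimum length, three character classes, not a common password."""
    if len(pw) < 10 or pw.lower() in COMMON_PASSWORDS:
        return False
    classes = [r"[a-z]", r"[A-Z]", r"[0-9]"]
    return sum(bool(re.search(c, pw)) for c in classes) >= 3

# Hypothetical store of existing accounts.
REGISTERED = {"kidaccount@example.com"}

def create_account(email: str, pw: str) -> str:
    if not password_ok(pw):
        return "Password does not meet the policy."
    # Uniform message whether or not the address is registered: this is what
    # blocks the enumeration weakness NCC described. A real service would send
    # a confirmation email in the new-account case and a notice otherwise.
    return "If this address is valid, a confirmation email has been sent."
```

Note that the weak password NCC was able to set everywhere, “password,” is rejected by even this toy policy, and the registration response is identical for known and unknown addresses, which removes the signal a brute-force attacker would use to build a list of valid accounts.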
These warnings come around each holiday season with the regularity of Groundhog Day. So the obvious question is, why haven’t security and privacy in children’s connected toys improved?
Larry Trowell, principal consultant at Synopsys, believes some of it is simple ignorance, willful or otherwise. “Toys are still considered by developers simply to be toys,” he said. “They’re not designed to be as hardened as phones or other ‘serious’ devices. So, often they don’t see it as a problem.”
Diligent parents can try to follow all the directives to make sure the device is secure. But in some cases, if the manufacturer doesn’t provide better security options, “there’s not much they can do,” he said.
“A plaintext password is a plaintext password regardless of how complex a customer makes it. An outdated Bluetooth driver isn’t going to develop better encryption, because it was only paired to one device.”
But no matter the reason, for the foreseeable future, protecting children from the privacy and security risks of connected toys is up to those who are buying the toys.
Given that, the somewhat good news is that there is considerable advice available for those buyers.
Besides the FTC’s list of questions, Theresa Payton, CEO and president of Fortalice Solutions, who focuses on child safety and privacy, says that parents should ask three “critical questions” before buying anything that talks to the internet:
Trowell added that nobody should buy a toy that doesn’t provide for firmware updates. “If there are no firmware updates, then the device can’t be fixed if a vulnerability is found,” he said. Also, prospective buyers should check which version of Bluetooth the toy supports. “You want as close to Bluetooth 5 as possible,” he said.
Payton said a new report from Mozilla on the privacy of connected gadgets is worth researching before buying. Keep in mind, she said, that if a toy connects to the internet and can record or play video and audio, “then it can track you, spy on you, be hacked to spy on you.”
Then, once the toy is in the house, Payton recommends installing all updates (the toy may not have shipped to the store with the latest ones). Also, change the default password, create a new email account for it, set up two-factor authentication, and look up how to install manufacturer updates.
Adding to that, Trowell recommends against connecting the toy to the internet “if possible, but if unavoidable, attach it to a guest Wi-Fi account. And check if the service that operates the toy has had any published security issues.”
And Payton recommends that parents of younger children “play internet safety games with them to make sure everyone is up to speed on the latest threats. Some of my favorites are at NSTeens.org and OnGuardOnline.gov,” she said.
Trowell said if he had children, he might buy them a connected toy, but would probably then “take the entire thing apart and reverse-engineer as much as I could before allowing anyone to play with it. But that is something that most parents could not do properly.”
Finally, after addressing all the technical security and privacy issues, Payton said parents “need to be where their kids are.”
“Here’s the thing,” she said. “You would never throw them the keys to the car before showing them how to drive, right? You would never let them go on a vacation with another family you haven’t met or run a background check on.”
“But remember that ‘be where they are’ means using Snapchat or WhatsApp. It means playing on their gaming platform, and wherever else they interact online.”
“Make them tell you their accounts and passwords,” she said. “Make a commitment that you will not spy, but you will trust but verify.”
Taylor Armerding is an award-winning journalist who left the declining field of mainstream newspapers in 2011 to write in the explosively expanding field of information security. He has previously written for CSO Online and the Sophos blog Naked Security. When he’s not writing he hikes, bikes, golfs, and plays bluegrass music.