
Smart Toys: Don’t tell Teddy!

By Jean-Benoît Nadeau

There will be more smart toys than ever under the Christmas tree this year. Youngsters will be delighted, but so will the companies that are collecting their personal information.

Ho! Ho! Ho! Ho! Christmas is coming, and smart toys (there are more than 300 on sale in Canada right now) are taking up more and more room in Santa’s sack. There’s the smart teddy bear that can remember your child’s name and tell her stories, and the robot dog Chip, which can recognize your child’s voice and fetch a ball. The wise dinosaur Dino comes up with clever answers to everything while transmitting junior’s questions via the Internet to a database in the cloud.

They’re just like little elves! Some even claim to be educational. Except that with these gadgets, your child may end up being toyed with, and by some seriously ill-intentioned people. As the study Children under Surveillance: Privacy Protection in the Age of Smart Toys reveals, these electronic devices, whether connected to a tablet, a smart phone or directly to WiFi, all pose serious threats to privacy and open the possibility of security breaches.

One of the cases the study focuses on is VTech, a manufacturer of tablets for kids. In 2015, hackers found their way into the company’s servers and accessed the names, birthdays, photos, voice recordings and text messages of millions of youngsters, including 316,000 in Canada. And in 2017, the German government told parents to destroy every “Cayla” doll because of the risk of spying and child manipulation.

Some predict that by 2020, there will be more than 20 billion connected objects in the world. By 2025, there will be three times as many, and these will include countless smart toys. In the nebulous world of connected objects, smart toys are a special case because of their particularly vulnerable target clientele: very young people who are in no position to judge the situation or its consequences. “Even the parents we interviewed are mostly unaware of the risks,” says Alexandre Plourde, a lawyer at Option consommateurs who co-authored the study. “The two main issues are transparency and cyber security,” he says.

A Present Danger

Smart toys can conceal a host of microprocessors and gadgets, like an optical reader, a global positioning system, a WiFi antenna, a gyroscope, a camera, or a microphone. In some cases, the microphone and camera are found inside a mobile device to which the child is permanently attached. “This gives companies huge data collection potential,” Plourde adds. “Where is this data stored? In which country? Who can gain access to it? Will it be sold? What will be done with it? There’s no way of knowing.”

The study’s three co-authors, all lawyers, combed through the privacy policies of ten of these toys. In many cases, the policies can be summed up as saying that if you install the application or software, you are agreeing to a privacy policy that can be changed unilaterally and without notice. “At the very least,” Plourde says, “the packaging should list standardized information, like the nutritional table on food products does.”

Naturally, the authors wanted to find out how any of the data collected would be used. A few of the policies stated that the data would be rendered anonymous, but they offered no further clarification of how this would happen. In the case of Cozmo, a camera-equipped robot in the form of a bulldozer, the manufacturer Anki goes to the trouble of stating that “the images captured by the Cozmo robot camera and the biometric data of the facial image will not be transferred to the Internet.” As for the rest, who knows?

Several companies claim no personal data is being recorded. Since there is no regulation of, and no legal means of verifying, these claims, consumers have no choice but to accept what manufacturers say. The consumer is offered no option to refuse to allow the company to collect, use or resell the information gathered, no matter how confidential it may be. In the case of the Furby Connect toy, manufacturer Hasbro states that it will keep the personal information only for as long as the account remains active. But who’s going to think about “deleting an account” when they dispose of a toy that’s no longer being used?

In addition to the totally unresolved issue of consent, cyber security is another concern. “The big surprise for me was in the interviews with cyber security specialists. They made me realize just how little security there is in this environment,” Plourde says. Among the most glaring vulnerabilities is the absence of any encryption of communications. Even when they do use encryption, many companies don’t bother to require a password to block access.

Another problem is the presence of “back doors,” a feature built in by programmers that lets them perform tests and maintenance without the user’s knowledge. But if hackers find a way in, the device can become a “Trojan horse.” The hacker can then infiltrate the rest of the home’s computer system, or use the toy’s voice to persuade the child to do certain things.

“In a context where security is very weak, allowing such access is a terrible practice. Just think about the fact that on many platforms, the safety questions you are asked are about childhood memories, such as your first pet, or the name of your mother or your street. If this kind of information collected by a toy is stored somewhere, anyone could find it and use it.”

Alexandre Plourde, lawyer at Option consommateurs and co-author of the study

Send for Robocop?

The current law provides only a modicum of recourse. The spirit of the federal Personal Information Protection and Electronic Documents Act (PIPEDA) requires that companies implement security measures proportionate to the sensitivity of the data. Everything is based on the notion of consent, which is supposed to be free and informed. But can anyone reasonably maintain that children, every time they are given a present, are consenting to having their personal information used by others for the rest of their lives? Does the fact that their parents consented on their behalf, perhaps without reading the privacy policy, make this consent valid?

Alexandre Plourde is pleased that Quebec Justice Minister Sonia LeBel wants to beef up the applicable provincial law (the last update was in the 1990s, during the prehistory of the Web). “The fundamental problem is that the law is not a sufficient deterrent. It takes investment and expertise to ensure effective cybersecurity, and sanctions in Canada are not tough enough; we have to make it very expensive not to do things right.”

Most of the study’s recommendations are in line with recent European directives. Substantial fines need to be imposed, and privacy protection has to be integrated into the design of the objects themselves. The toys also have to be tested.

 

“For physical goods, the Canadian government requires compliance with several standards relating to the toxicity or physical safety of everything sold. But when it comes to the digital security of smart toys and connected objects in general, the government has nothing to say, not even about passwords.”

Alexandre Plourde

One bright spot at the moment is that connected toys rarely live up to their promises. The authors of the study found that most children get tired of them very quickly. Some toys are just animal-shaped walkie-talkies, while others churn out ready-made answers that don’t apply, so a child is not likely to stay interested for long. The talking doggie ends up being used as an ordinary plush toy. Moreover, since all these gadgets “speak” English only, interaction with young unilingual Francophones is far from optimal. Yet interviews showed that some children do indeed develop a connection with their smart toy. This has happened with several children and the Cozmo bulldozer robot, which has a wide range of emotions and behaviour. For one little boy, the attachment became so strong he called Cozmo his best friend.

“Thanks to artificial intelligence, these toys will improve and become more and more convincing; something like personal voice assistants, which are evolving quickly,” Plourde says. If children start accepting them as actual companions to interact with, this could raise major ethical and psychological issues, not to mention seriously compromising privacy of information.

The Study

Children under Surveillance: Privacy Protection in the Age of Smart Toys describes the risks posed by connected objects that specifically target children. The authors, lawyers Valérie Montcalm, Élise Thériault and Alexandre Plourde—the latter two still members of the team at Option consommateurs—consulted a dozen specialists, including engineers, experts in cryptology and security, and child psychologists. They compared the Canadian regulatory framework with those of the United States and the European Union and produced a detailed analysis of the privacy policies of ten toys. The authors also tested the reactions of 20 children aged 5 to 10 to four selected toys, as well as those of their parents, gathered through a series of interviews.

The study identifies serious cyber security vulnerabilities to which these toys expose families and children. In addition, the existence of such connected toys, even the least sophisticated among them, raises serious questions about the confidentiality of children’s personal information, which may well be of a particularly sensitive nature.

The authors’ recommendations are based on the European Union’s General Data Protection Regulation. They call for more preventive measures in the design and marketing of toys, better information for consumers and more robust regulation backed up by tougher penalties.