Call for smart home devices to bake in privacy safeguards for kids

A new research report has raised concerns about how in-home smart devices such as AI virtual voice assistants, smart appliances, and security and monitoring technologies could be gathering and sharing children’s data.

It calls for new privacy measures to safeguard kids and to ensure age-appropriate design code requirements are applied to home automation technologies.

The report, entitled Home Life Data and Children’s Privacy, is the work of Dr Veronica Barassi of Goldsmiths, University of London, who leads a research project at the university investigating the impact of big data and AI on family life.

Barassi wants the UK’s data protection agency to launch a review of what she terms “home life data” — meaning the information harvested by smart in-home devices that can end up messily mixing adult data with kids’ information — to consider its impact on children’s privacy, and “put this concept at the heart of future debates about children’s data protection”.

“Debates about the privacy implications of AI home assistants and Internet of Things focus a lot on the collection and use of personal data. Yet these debates lack a nuanced understanding of the different data flows that emerge from everyday digital practices and interactions in the home and that include the data of children,” she writes in the report.

“When we think about home automation therefore, we need to recognise that much of the data that is being collected by home automation technologies is not only personal (individual) data but home life data… and we need to critically consider the multiple ways in which children’s data traces become intertwined with adult profiles.”

The report gives examples of multi-user functions and aggregated profiles (such as Amazon’s Household Profiles feature) as constituting a potential risk to children’s privacy.

Another example cited is biometric data, a type of information frequently gathered by in-home ‘smart’ technologies (such as via voice or facial recognition tech). The report asserts that generic privacy policies often do not differentiate between adults’ and children’s biometric data, so that’s another grey area being critically flagged by Barassi.

She’s submitted the report to the ICO in response to its call for evidence and views on an Age Appropriate Design Code it will be drafting. This code is a component of the UK’s new data protection legislation intended to support and supplement rules on the handling of children’s data contained within pan-EU privacy regulation — by providing additional guidance on design standards for online information services that process personal data and are “likely to be accessed by children”.

And devices like smart speakers, intended to be installed in homes where families live, are clearly very likely to be accessed by children.

The report concludes:

There is no acknowledgement so far of the complexity of home life data, and much of the privacy debates seem to be evolving around personal (individual) data. It seems that companies are not recognizing the privacy implications involved in children’s daily interactions with home automation technologies that are not designed for or targeted at them. Yet they make sure to include children in the advertising of their home technologies. Much of the responsibility of protecting children is in the hands of parents, who struggle to navigate Terms and Conditions even after changes such as GDPR [the European Union’s new privacy framework]. It is for this reason that we need to find new measures and solutions to safeguard children and to make sure that age appropriate design code is included within home automation technologies.

“We’ve seen privacy concerns raised about smart toys and AI virtual assistants aimed at children, but so far there has been very little debate about home hubs and smart technologies aimed at adults that children encounter and that collect their personal data,” adds Barassi in a statement.

“The very newness of the home automation environment means we do not know what algorithms are doing with this ‘messy’ data that includes children’s data. Firms currently fail to recognise the privacy implications of children’s daily interactions with home automation technologies that are not designed or targeted at them.

“Despite GDPR, it’s left up to parents to protect their children’s privacy and navigate a confusing array of terms and conditions.”

The report also includes a critical case study of Amazon’s Household Profiles — a feature that allows Amazon services to be shared by members of a family — with Barassi saying she was unable to locate any information in Amazon’s US or UK privacy policies about how the company uses children’s “home life data” (e.g. information that might have been passively recorded about kids via products such as Amazon’s Alexa AI virtual assistant).

“It is clear that the company recognizes that children interact with the virtual assistants or can create their own profiles connected to the adults. Yet I can’t find an exhaustive description or explanation of the ways in which their data is used,” she writes in the report. “I can’t tell at all how this company archives and sells my home life data, and the data of my children.”

Amazon does make this disclosure on children’s privacy — though it does not specifically state what it does in instances where children’s data might have been passively recorded (i.e. as a result of one of its smart devices operating inside a family home).

Barassi also points out that there’s no link to Amazon’s children’s data privacy policy on the ‘Create your Amazon Household Profile’ page, where the company informs users they can add up to four children to a profile; there is only a tiny generic link to its privacy policy at the very bottom of the page.

We asked Amazon to clarify its handling of children’s data, but at the time of writing the company had not responded to multiple requests for comment. Update: A company spokesperson has now sent us the following statement:

Amazon takes privacy and security seriously, and FreeTime on Alexa is no different. During set up, the Alexa app asks for parental approval and provides information on the privacy and security of their children’s voice recordings. FreeTime on Alexa voice recordings are only used for delivering and improving the Alexa voice service and FreeTime service—they are not used for advertising or Amazon.com product recommendations to children. None of the skills included with FreeTime Unlimited have access to or collect personal information, and we do not share audio recordings with skill developers. Parents can access all voice recordings associated with their child’s device in the Alexa app, and delete them individually or all at once, which also deletes them from Amazon’s servers.

We’ve added detailed information about Amazon’s privacy practices to both the Alexa Help FAQ page and Echo Dot Kids Edition product page. FreeTime on Alexa and Echo Dot Kids Edition is compliant with the Children’s Online Privacy Protection Act (COPPA).

The EU’s new GDPR framework does require data processors to take special care in handling children’s data.

In its guidance on this aspect of the regulation the ICO writes: “You should write clear privacy notices for children so that they are able to understand what will happen to their personal data, and what rights they have.”

The ICO also warns: “The GDPR also states explicitly that specific protection is required where children’s personal data is used for marketing purposes or creating personality or user profiles. So you need to take particular care in these circumstances.”