Apple, Google questioned by ICO over app age ratings after UK child safety charity raises concerns

The UK’s data protection watchdog has written to Apple and Google seeking details of how they assess apps to determine the age ratings they apply, following concerns raised by an online child safety charity.

The move follows the UK’s Age Appropriate Design Code coming into force this September. The Code requires digital services that are likely to be accessed by children to prioritize protecting their privacy and safety.

In a statement today, the information commissioner, Elizabeth Denham, said her office is currently conducting an “evidence gathering process to identify conformance with the code, and thus compliance with the underlying data protection law”.

The information commissioner was responding to a letter from the 5Rights Foundation, a digital child safety charity that conducted research over the summer to investigate compliance with the Code and says it found 12 “systemic” breaches, including: insufficient age assurance; misadvertising of minimum ages for games on app stores; the use of dark patterns and nudges; data-driven recommendations that create risks for children; a routine failure to enforce community standards; low default privacy settings; and plenty more besides.

“In this process, the ICO is taking a systemic approach; we are focusing our interventions on operators of online services where there is information which indicates potential poor compliance with privacy requirements, and where there is a high risk of potential harm to children,” Denham also wrote in reply to the 5Rights Foundation, adding that as part of this work the ICO has contacted Apple and Google — “to enquire about the extent to which the risks associated with the processing of personal data are a factor when determining the age rating for an app”.

The tech giants have been contacted for comment on the development.

Both operate app stores which apply age ratings to apps that are made available for download — potentially by children — meaning their platforms come under the scope of the Code.

The ICO is not singling out Apple and Google, however — saying today that it’s written to a total of 40 organizations across the three tech sectors it considers highest risk for kids — namely social media/messaging; gaming; and video/music streaming — “to determine their standards of conformance individually”.

It adds that it will write to a further nine companies after the charity highlighted a raft of concerns, bringing the total number of digital services under regulatory review to almost 50.

The ICO has not published the full list of tech companies it has targeted for Code compliance questions.

Nor does the 5Rights Foundation appear to have published the list of companies it’s raising concerns about (but it has passed their names to the regulator).

Its correspondence to the ICO also refers directly to Facebook whistleblower Frances Haugen’s recent testimony to lawmakers, which included warnings over the toxicity of platforms like Instagram for teens’ developing brains, suggesting the social media giant (now known as Meta) is likely on its list.

The charity’s chair, Baroness Kidron, was instrumental in pushing for the Code’s set of headline standards to be established by the ICO in the first place, as part of the UK’s existing data protection legislation.

Although the Code only came into force at the start of September, the standards were published at the start of last year, with the ICO opting to give businesses a long grace period to come into compliance.

So, in one sense, it’ll be difficult for established businesses to argue that they haven’t had enough time to make the necessary changes.

Moreover, while the ICO doesn’t exactly have a reputation as a proactive enforcer of digital rights, child safety groups like the 5Rights Foundation don’t look like they’ll be content to let enforcement sit still for long.

“There is a danger that the Code is being interpreted as introducing a handful of safety measures, rather than requiring a holistic re-design of the systems and processes of services to ensure their data collection practices are in the best interests of children,” the 5Rights Foundation warns in its letter. “If the Code is to have real value in protecting children’s safety and rights in the digital environment, the ICO must make sure that it is respected in practice.”

Kidron also argues that the “systemic” nature of the problems unearthed by the 5Rights Foundation’s research means that, while the issues were identified ahead of the Code coming into force this fall, “many” will likely persist; hence the charity is urging the ICO to investigate “apparent breaches” and publish guidance.

In her response, Denham suggests a timeline of next spring for the ICO to take some sort of action, writing: “In terms of timescales, we need to take the time to understand what the information gathered is telling us systemically and individually. Our regulatory options will be based on that careful understanding and as such I expect that we will progress to next steps in spring 2022.”

But it’s notable that she has written “next steps” — rather than actual enforcement.

This suggests there may be a far longer dance in which the ICO tries to ‘encourage’ improvements from the tech industry rather than bringing muscular enforcement against platform giants, as we’ve seen in the case of systemic breaches of data protection law by the adtech industry (which the ICO has been ‘investigating’ for years without taking any enforcement action).

See, for example, the discussion later in Denham’s letter to Kidron of “stakeholder roundtable events”, where she says the ICO plans to gather evidence specifically on the use of age assurance tech, which will be “used to inform the scope of any further regulatory action in relation to age assurance”.

On age assurance (technologies and techniques used to try to determine whether a user is underage), the 5Rights Foundation’s research found “many” services with age restrictions that it said can, nonetheless, “be easily accessed by children under the minimum age of use, including adult-only services”.

It also reports finding some services stating that they do not collect any personal data from children — yet says “many” of these lacked age assurance or else used age assurance that can be easily bypassed (such as asking children to input a birth date, which they can just lie about).

“If these services do not identify child users, it is unclear how they are upholding their own privacy policies or are able to implement the Code,” the charity goes on to warn in the letter.

However, the question of how platforms can comply with elements of the Code like its ‘age assurance’ requirements is not exactly a straightforward one.

Last month the ICO put out a call for evidence on age assurance. And in an opinion on the topic, published simultaneously, the regulator offered some tentative guidance for digital service providers — recommending a risk-based approach and suggesting those platforms and apps that pose a “high” risk to children should either apply all relevant code standards to all users to ensure risks are “mitigated”; or introduce age assurance measures that “give the highest possible level of certainty on age of users” (likely to mean age verification rather than age estimation).

For low or medium risk services, the ICO suggested either applying all relevant code standards to all users to ensure risks are “low”; or “introduce age assurance measures that give a level of certainty on the age of child users that is proportionate to the potential risks to children”.

So there’s inherent subjectivity in how platforms assess risks and choose which mitigating measures to apply.

The opinion also highlighted the challenge for digital services in balancing the requirement to protect privacy with applying (potentially) intrusive age assurance techniques, with the ICO writing: “While the Commissioner appreciates the developments in age assurance techniques, technology and policy, more needs to be done to ensure these respect and comply with data protection law.”

The regulator also committed to revisiting the opinion in line with a planned review of the Code in 2022 — “due to the rapidly evolving state of the age assurance market, wider legislative proposals and developing policy landscape”.

Expecting muscular enforcement to rain down in an area where privacy and safety intersect, and even collide, and where there is no simple one-size-fits-all solution appropriate for every type of service and user seems, well, unlikely.

Denham’s lengthy letter is not only packed with caveats and qualifications; it begins by managing expectations, framing the Code as seeking to drive “proportionate protections that enhance society’s engagement with the digital world”. All of which suggests her office will favor a more pragmatic, tweak-by-tweak approach to compliance with the Code than some child safety charities (and age assurance tech providers) might prefer.

The UK government is also in the process of consulting on ‘reforming’ the domestic data protection regime — potentially interfering with the independence of the ICO in favor of prioritizing data-fuelled ‘innovation’. So the ICO may get defanged and domestic privacy rights gutted in the not too distant future.

Time will tell how this one plays out. But the UK’s child safety campaigners may end up finding themselves feeling as frustrated as UK privacy advocates — who continue to whistle for regulatory enforcement against systemic breaches (that the ICO has itself identified) years after the country updated its data protection regime to add (at least on paper) a set of teeth…

“I hope you will recognise that as a regulator, the ICO will always face tough choices on how to deploy our limited resources,” Denham cautions Kidron. “As such, this is why our initial focus is on those cases of greatest potential harm with non-conformance across multiple standards.”

She also warns that the ICO will be staying in its lane — such as by not applying the Code’s standards retrospectively.

Denham’s letter is also careful to emphasize that the regulator cannot tackle certain concerns raised by the charity, as they don’t fall under its remit or else fall outside the scope of the Code (such as children accessing adult-only websites), pointing instead to the forthcoming Online Safety Bill as the appropriate legislation for that: a content-focused regulation that will be overseen by Ofcom, not the ICO.

“The ICO will continue to work with DCMS [the Department for Digital, Culture, Media and Sport], Ofcom as the intended online safety regulator, and others to ensure that where we can act under current regulation, we do try to prevent underage access. However the solution to the problem is not one that sits squarely within the code or within data protection so is not one the ICO can commit to address entirely,” she adds.
