Information about Children is being misused by popular children’s apps

July 22, 2022

There is increasing concern about the personal information of children being scraped from the internet or collected through websites and apps. In May the US Federal Trade Commission announced that it was cracking down on companies that illegally surveil children online. Earlier this year the FTC took action against Weight Watchers for illegally collecting children’s health information. In March the US District Court for the Northern District of Illinois approved a $1.1 million settlement to resolve an action in which TikTok was alleged to have collected children’s data and sold it to third parties. In the United States it has been estimated that by the time a child is 13, online advertising firms have collected an average of 72 million data points about that person. In the USA, gaps and loopholes in the privacy laws have allowed apps to track kids’ phones.

The organisation Children and Media Australia has released a report highlighting how many game apps collect children’s data. The report has been covered by the ABC in How some of the most-popular children’s apps are sharing data in ‘troubling’ ways.

The ABC article provides:

Parents are being warned that some of Australia’s most popular entertainment apps for children contain risky code that allows companies to gather data on children and build profiles that can follow them for life.

The findings come from an audit of 186 Android entertainment apps by Children and Media Australia (CMA), which liaised with a leading data and security expert in San Francisco to test the apps.

Despite international treaties on children’s privacy, it is now estimated that 72 million pieces of personal data will be collected on every child around the world before their 13th birthday.

CMA engaged Serge Egelman — research director of Usable Security and Privacy Research at the prestigious University of California, Berkeley — to test the most-popular children’s entertainment products over a nine-month period concluding in March this year.

Mr Egelman found 59 per cent — or 101 of the 186 tested — contained concerning code.

He said the audit revealed a number of practices that went beyond advertising, and was concerned the sensitive information of young Australians was in the hands of corporate entities with limited independent oversight.

“This data can be used for many other purposes, too, many of which probably haven’t been invented yet,” he said.

“That’s problematic when it’s coming from children who probably are even less aware that this is occurring than adults [would be].”

The burgeoning understanding of the risks — which the ABC has previously revealed can include targeting children with age-inappropriate advertisements for tobacco or alcohol — led CMA to expand its parameters to include mobile apps. 

CMA broke these apps down into three categories, from those of “concern” right through to “very risky”.

Seven apps were designated “very risky” because they collected multiple “identifiers” about children, making it very difficult for their parents to opt out. The apps were also detected transmitting the children’s data insecurely, in ways that made the data susceptible to hacking.

“This is transmitting various identifiers without taking proper security precautions, such as using encryption,” Mr Egelman said.

These included popular games that have clocked as many as 100 million downloads, such as Star Wars: Pinball 7 and Dr Panda’s Swimming Pool, which its parent company says is aimed at children younger than five years.

The parent companies of both Dr Panda’s Swimming Pool, based in China, and Star Wars: Pinball 7, based in Hungary, did not respond to requests for comment.
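Egelman’s point about missing encryption can be illustrated with a small sketch. The snippet below is purely illustrative (the tracker hostname and the AAID value are made up, and no request is actually sent): it shows why an identifier placed in a plain-HTTP URL travels as readable bytes that any on-path observer can log, whereas HTTPS wraps the same request in TLS.

```python
# Illustrative sketch only: the hostname and identifier below are invented.
# It contrasts a plain-HTTP collection URL with its HTTPS equivalent;
# no network request is made.
from urllib.parse import urlencode, urlparse

device_id = "38400000-8cf0-11bd-b23e-10b96e40000d"  # made-up example AAID

insecure = "http://tracker.example.com/collect?" + urlencode({"aaid": device_id})
secure = "https://tracker.example.com/collect?" + urlencode({"aaid": device_id})

# Over plain HTTP the identifier is visible on the wire exactly as written:
assert device_id in insecure
assert urlparse(insecure).scheme == "http"   # no transport encryption
assert urlparse(secure).scheme == "https"    # TLS encrypts the request in transit
```

The fix is trivial for a developer (use the `https` scheme and reject cleartext traffic), which is why auditors treat unencrypted transmission of identifiers as a basic security failure rather than a hard engineering problem.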

‘Risky apps’

The next category is apps that collect multiple “identifiers”, a practice Google and Apple have discouraged since 2014 because it means the limited privacy protections currently available to consumers can be circumvented.

In 2014, Google introduced an Advertising ID (AAID) for all smartphone users which is unique to the device.

The ID helped companies create profiles of users for advertising, but the consumer could reset it if they were concerned about privacy. The reset was supposed to mean they appeared as a new user the next time they returned to the website or app.

However, if a company uses a second identifier, such as an Android ID (AID), the practice is called “ID bridging”, and it means that, even after a reset, the company can still identify the app user.

Experts say this is particularly risky because it gives companies the ability to build a lifelong advertising profile, with no way to opt out.

“We see this kind of behaviour a lot at the moment, despite it being prohibited. That’s not to minimise its importance, however,” Mr Egelman said.

“Ultimately, companies that are collecting this data need to make sure these identifiers are not combined in such a way that they can de-anonymise people.

“For that reason, collecting them together is bad practice, since they are likely to wind up being stored in the same place.”
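The mechanism described above can be sketched in a few lines of Python. Everything here is hypothetical (the function and variable names are invented, not taken from any real SDK): it shows how keying profiles by a stable Android ID means that resetting the resettable Advertising ID never produces a “new user”.

```python
# Hypothetical sketch of "ID bridging". All names are illustrative, not
# from any real SDK. Profiles are keyed by the stable Android ID, so a
# reset of the resettable Advertising ID (AAID) does not break the link.

profiles = {}    # maps Android ID -> profile record
aaid_index = {}  # maps each AAID ever seen -> Android ID (the "bridge")

def record_event(aaid, android_id, event):
    # Look the user up by the stable Android ID, not the resettable AAID.
    profile = profiles.setdefault(android_id, {"aaids_seen": set(), "events": []})
    profile["aaids_seen"].add(aaid)
    profile["events"].append(event)
    aaid_index[aaid] = android_id  # old and new AAIDs both map to one device
    return profile

# A user plays a game, resets their Advertising ID, then plays again.
record_event("aaid-old", "device-123", "played_level_1")
after_reset = record_event("aaid-new", "device-123", "played_level_2")

# The reset failed: both AAIDs resolve to the same lifelong profile.
assert aaid_index["aaid-old"] == aaid_index["aaid-new"]
assert len(after_reset["events"]) == 2
```

This is why Egelman warns against storing the identifiers together: once a stable identifier sits alongside the resettable one, the reset button offers no real opt-out.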

ABC Kids was among the apps detected collecting users’ AAID and AID. Unlike others, though, the identifiers were not used for advertising or shared with third parties.

The ABC — which has been under pressure over its privacy policies since mandating iview logins — said it stopped collecting both identifiers earlier this year, and disputed the CMA’s classification of ABC Kids as a “risky” app for parents.

“Prior to April 2022, the ABC Kids app used Google Android Advertising Id (AdID) as an identifier for devices only, not as an identifier for users. This information was held for internal analytical use and only the ABC had access to the data,” an ABC spokesman said.

“In March 2022, Google AAID was removed from the app and, in April 2022, AID was removed, preventing any access on the ABC Kids app to these identifiers.

“The way this information was collected, stored and used did not identify any individual user and with the IDs no longer accessed by the app at all, any risk of identification using the IDs was completely removed.”

The data that was collected is now scheduled to be deleted and the ABC insists it is a world leader in privacy protection.

Most of the other apps identified in the “risky” category contain Software Development Kits (SDKs) that allow third-party advertisers the same access to multiple identifiers.

They include TutoToons, which owns the popular Animal Hair Salon Australia as well as its hit global brands.

The company stopped collecting multiple identifiers the same day it was approached for comment by the ABC.

“Being a company that makes digital products specifically for kids, compliance and child safety online is our main priority. We are aware of the constantly changing regulations and are currently updating our products to adhere to them,” a spokesperson said.

The company said it also vetted companies that received kids’ data and would not do business with those creating profiles.

“There are plenty of bad actors in the kids’ space, but companies that are assessed and certified are the ones that want to protect children and have put time, effort and resource into doing so,” a spokesperson said.

Apps of concern

The final category relates to apps with SDKs in their code that often allow multiple tech giants access to users for advertising.

This includes the popular Pepi Play and My Town gaming franchises.

When approached by the ABC, Pepi Play said it would immediately endeavour to remove the offending SDKs from its code and called for a broader, industry-wide response.

“Tracking permission requests on Android devices, similar to those seen on Apple iOS devices, would be a welcome change in light of this report,” a spokesperson said.

My Town said it only used SDKs that Google had approved for use in children’s products and would engage with CMA when its report is publicly released.

“If there are any issues then, for sure, we would address them and report the matter to Google so they can disallow the SDK and the company behind it from entering our app space,” a spokesperson said.

The bigger picture

Australia’s protections for the trade and use of children’s data have not kept pace with technology.

A review of the Privacy Act began under the previous government and CMA wants explicit regulation so that parents can be confident that children’s information is protected.

“It’s all the more troubling when it comes to children because children don’t have the capacity to defend themselves against that kind of thing,” media law specialist and the president of CMA, Elizabeth Handsley, said.

CMA is also seeking funding to pay for an annual audit, arguing parents need information about which apps to trust in a world where data is increasingly valuable.

“We just don’t know what the world is going to look like in 10 or 20 or 30 years and how this kind of information could be used,” Professor Handsley said.
