Children have their data collected without their knowledge or consent. Photo: Shutterstock

Unscrupulous mobile app developers are collecting personal data from Australian children under the age of five, a privacy watchdog group has warned, after a nine-month analysis of 186 popular gaming apps found that 59% of them exhibited “problematic data collection behavior”.

As part of a campaign called Apps Can Track, the technical analysis – commissioned by Children and Media Australia (CMA) with the assistance of Dr Serge Egelman, Research Director of the Usable Security and Privacy Group at the International Computer Science Institute (ICSI), affiliated with the University of California, Berkeley – used ICSI’s AppCensus tool to examine the data collection practices of 208 Android apps.

Of these, 22 were specifically designed to facilitate school-home or parent-child communication, and the remaining 186 were popular games – 54 of which were classified as engaging in “risky” privacy practices.

This meant that despite being targeted at very young children – think Dr. Panda’s Swimming Pool – the apps collected and sent unique Android ID (AID) and Android Advertising ID (AAID) codes to as many as six advertising-related companies.

Forty other applications, including Ice Cream Cone Cupcake Maker, were flagged as warranting caution because they passed the AAID on to at least one advertising-related company.

And seven apps, including Star Wars Pinball 7, were classified as “very risky” because they transmitted personal data to advertising-related companies “in an insecure manner”.

AID and AAID codes allow advertisers to identify individual mobile handsets and track user behavior across applications – activity that has been strictly regulated for children under 13 since the United States passed the Children’s Online Privacy Protection Act (COPPA) in 1998.
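The cross-app tracking that these identifiers enable can be illustrated with a minimal sketch. The code below is purely hypothetical – the app names, event logs, and `build_profiles` helper are invented for illustration and do not come from the AppCensus analysis – but it shows the basic mechanism: because two unrelated apps report events tagged with the same device-level advertising ID, a third party receiving both streams can join them into a single behavioral profile.

```python
# Hypothetical illustration of cross-app tracking via a shared
# advertising ID. All app names and event data are invented.
from collections import defaultdict


def build_profiles(event_streams):
    """Merge event logs from different apps into per-device profiles,
    keyed by the advertising ID (AAID) each event carries."""
    profiles = defaultdict(list)
    for app_name, events in event_streams.items():
        for event in events:
            # The AAID is the join key: the same value appears in
            # events sent by otherwise unrelated apps on one handset.
            profiles[event["aaid"]].append((app_name, event["action"]))
    return dict(profiles)


# Two apps independently report events carrying the same AAID.
streams = {
    "kids_game": [{"aaid": "38400000-8cf0-11bd", "action": "level_up"}],
    "drawing_app": [{"aaid": "38400000-8cf0-11bd", "action": "opened"}],
}
profiles = build_profiles(streams)
# A single AAID now links behavior observed in both apps,
# which is exactly what per-app consent cannot prevent.
```

The point of the sketch is that no single app needs to know about the others: the shared identifier alone is enough for an ad network on the receiving end to correlate the streams.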

The lack of similar legislation protecting Australian children prompted the CMA – which has separately reviewed the content of more than 1,500 movies and 990 apps since 2002 – to expand its mandate to include privacy protection.

The results of the project are being folded into CMA’s growing database of mobile app privacy reviews, which the organization has been building since June 2021 and which now includes 208 app reviews.

CMA’s goal is “to help parents who may not be aware of the risks associated with the apps their children are playing, nor be able to detect or prevent them,” the organization said, warning that commonly used privacy practices create “an extensive digital footprint that can be exploited by predators and marketers, and can compromise later life prospects such as jobs, college places, health insurance and more”.

“Big Tech’s business model is to collect user data to sell them things, and it won’t be easy to change their minds.”

The clock is TikTok-ing

Apple’s and Google’s detailed policies set clear expectations about what data apps will and won’t collect, but developers have repeatedly been caught ignoring these restrictions and collecting far more user data than they are allowed – or need.

Earlier this year, for example, Google banned more than a dozen apps after AppCensus analysis identified secret data collection practices.

In March, the United States Federal Trade Commission fined weight-loss company WW International (formerly Weight Watchers) and its subsidiary Kurbo, Inc. $2.2 million (US$1.5 million) for illegally collecting data on children as young as eight without their parents’ consent.

Pervasive remote learning in the pandemic era has upped the ante, with Human Rights Watch (HRW) reporting last year that 89% of the 163 EdTech products it analyzed “directly endanger or violate children’s privacy and other rights, for purposes unrelated to their education”.

The products “monitored or had the ability to monitor children, in most cases covertly and without the consent of the children or their parents”, HRW found – with most online learning platforms using tracking technologies that “followed children outside their virtual classrooms and across the internet, over time”.

Almost all of the apps shared children’s data with third parties, HRW found, warning that “in doing so, they appear to have enabled sophisticated algorithms from AdTech companies to stitch together and analyze this data to guess at a child’s personal characteristics and interests, and to predict what a child might do next and how they might be influenced”.

The yawning gap between the theoretical protection of children and the actual practices of app developers has set off alarms at the highest levels.

Earlier this month, US Federal Communications Commission (FCC) Commissioner Brendan Carr called on Apple and Google to remove the ubiquitous TikTok app from their app stores after investigations suggested that it was a “sophisticated surveillance tool that collects large amounts of personal and sensitive data”.

For now, the CMA is focused on advocating the “urgent need for effective regulatory protections”, pushing developers to adopt “child-centric safety by design”, and empowering parents – including by securing additional funding to make AppCensus results more accessible to non-technical parents.