Call for privacy controls on TikTok
July 27, 2021
In today’s Age the National Children’s Commissioner, Anne Hollonds, in “TikTok: time’s up to protect children’s privacy” highlights the alarming privacy-invasive practices of TikTok, as well as the cumulative collection of data on children through social media and other sources. While the impetus of the story was TikTok’s focus on children, there is not much that is new in Hollonds’ piece. Social media sites have been in the business of collecting personal information since their inception. Google’s business model is predicated on collecting and aggregating data through algorithms so as to sell targeted advertising.
Hollonds’ concern about TikTok and other sites collecting personal information without proper consent is well placed. The ACCC has similar concerns. The potential problem lies in part of her solution: to have provisions in the Privacy Act requiring anyone collecting children’s data to comply with some form of “best interests of the child” requirement relating to the collection and use of that data. The problem with this approach is that it creates additional protections for specific types of data. The resulting danger is that there will be silos of strong protection amidst weak protection overall. That is what has happened in the United States of America, where the Children’s Online Privacy Protection Act (“COPPA”) sets stringent requirements on websites and services directed at children, the Health Insurance Portability and Accountability Act of 1996 (HIPAA) provides strong protections for health records, and the Video Privacy Protection Act of 1988 even protects records of video rentals. But many other areas of activity in the USA have weak privacy protections at the Federal level.
The chronic problem is weak privacy protections across the board, irrespective of age. As a society we instinctively want to protect our children from all harms. But there is a societal need for everyone’s personal information to be better protected, in terms of both legislation and enforcement. At the moment the Privacy Act has numerous exemptions in terms of the organisations covered, and carve-outs in the Australian Privacy Principles for law enforcement and other agencies, as well as for businesses in vaguely defined circumstances.
Hollonds refers to the Attorney-General’s Department’s review of the Privacy Act. It has been a lackadaisical exercise, and there is little prospect of it being completed by May next year, probably the latest date for a Federal election campaign. Even then an exposure draft of a Bill will need to be circulated, and the Bill then introduced, debated and enacted. Assuming a relatively smooth passage through each stage, only a starry-eyed optimist would expect amendments to the Privacy Act to be enacted and come into effect in 2022. More likely 2023. And all of that presupposes that the review will produce amendments that turn the Privacy Act into effective legislation which properly regulates the collection of personal information. None of that even comes close to improving the operation of the Information Commissioner’s office.
So while Hollonds makes good points and her call to arms is to be congratulated, this is a familiar scene, and her proposed solution will not be particularly effective unless it forms a small part of a comprehensive overhaul which results in real protections.
The Age article provides:
Even before birth, Australian children are the targets of technology that collects their data and threatens their privacy – but right now we have an opportunity to protect them and future generations.
Parents who use pregnancy apps or share ultrasounds on social media can expect information about their children to be collected and sold to advertisers for profit. Once a child is born, baby monitors enabled by artificial intelligence (AI) and web-connected toys collect data from the cot. One leading expert, Donell Holloway, estimates that by a child’s 13th birthday, advertisers will have gathered on average more than 72 million data points about them.
This data powers digital advertising that capitalises on information about people’s lives, habits and interests. When much of this information is collected by devices in the seclusion of bedrooms or living rooms, our children’s right to safety and privacy is severely threatened.
The impact of this surveillance becomes sharper as children enter adolescence and their data is used to create personalised content recommendations and advertising profiles. Young people who display curiosity about alcohol, gambling or pornography, for instance, are served content designed to fuel those interests. And algorithms can reinforce harmful racial stereotypes or perpetuate troubling views about women.
Last night’s Four Corners program investigated how the video sharing app TikTok preys on young users. TikTok presents an endless stream of short videos that viewers do not select, but which appear automatically as they scroll.
It means that without any active selection, young people may be shown videos that are highly sexualised, endorse drug use, or are otherwise inappropriate. Four Corners interviewed one young woman whose eating disorder was exacerbated after being shown videos about dieting and weight loss.
Although the app ostensibly has a minimum user age of 13, children under the age of 12 are one of its two largest audiences – the other being young people in their teens and early 20s.
Like other social media platforms, TikTok collects a great deal of personal information, including phone numbers, videos, exact locations and biometric data. This is done without sufficient warning, transparency or meaningful consent – and without children or parents knowing how the information is used.
The former children’s commissioner for England, Anne Longfield, is suing TikTok on behalf of millions of children in Britain who have downloaded the app, alleging their data was collected and used illegally.

The Australian government is currently reviewing the Privacy Act (1988), which governs the collection and storage of personal information. There is also legislation currently being drafted, which will soon be available for public consultation, and which will focus specifically on social media platforms. We must grasp these opportunities to tighten protections for the collection and use of personal data, particularly of children.
Australia should follow the examples of the UK and Ireland. Both countries are implementing a “best interests” by default principle, which requires anyone collecting or using children’s data to do so in ways that benefit the child. This principle already exists in Australian family law and other policy areas. Reforming privacy legislation to require upfront protection of the “best interests of children” in the collection and use of data would help keep all children safe.
In 2019 Christian Porter, who was Attorney General at the time, announced the Government’s amendments to the Privacy Act would result in a code for tech companies that trade in personal information. He said, “The code will require these companies to be more transparent about any data sharing and require more specific consent of users when they collect, use and disclose personal information.”
Such a code has not yet been developed – but it could help protect children by ensuring that companies only collect data they need to run their service, and that data must not be used for other purposes. It could require companies to turn off personalised advertising to children as a default and display terms and conditions in simple, child-friendly language. The code could also mandate an eraser button that enables children to easily delete any data that has been collected about them.
As well as amending the Privacy Act, governments at all levels must also implement recommendations from the Australian Human Rights Commission’s recent Human Rights and Technology Report, including tighter regulation and oversight of corporate AI processes to ensure they do not impact human rights.
Big Data has the potential to benefit children, but the reality is that it can also create serious harm throughout their lifetimes. Australian governments must take responsibility for ensuring data is used ethically for all citizens. They must act to protect children’s safety and privacy, and ensure young people are not exploited by companies that profit from information about their lives, habits and interests.