FTC commences an action against TikTok and ByteDance for violating children’s privacy law and against TikTok for infringing an existing consent order
August 6, 2024
The FTC, through the Department of Justice, has commenced an action against the video-sharing platform TikTok and its parent company ByteDance, alleging that they flagrantly violated the Children’s Online Privacy Protection Act (COPPA). The FTC also alleges that TikTok infringed an existing 2019 FTC consent order, entered against TikTok for violating COPPA, shortly after that order went into effect. Two TikTok entities (previously Musical.ly and Musical.ly Inc., which ByteDance acquired in 2017 and renamed) had agreed to the terms of that order to settle allegations that they violated the COPPA Rule by unlawfully collecting personal information from children under the age of 13.
The complaint alleges that the defendants failed to comply with COPPA’s requirement to notify parents and obtain their consent before collecting and using personal information from children under the age of 13.
The Press Release provides:
On behalf of the Federal Trade Commission, the Department of Justice sued video-sharing platform TikTok, its parent company ByteDance, and their affiliated companies for flagrantly violating a children’s privacy law—the Children’s Online Privacy Protection Act—and also alleged that they infringed an existing 2019 FTC consent order against TikTok for violating COPPA.
The complaint alleges defendants failed to comply with the COPPA requirement to notify and obtain parental consent before collecting and using personal information from children under the age of 13.
“TikTok knowingly and repeatedly violated kids’ privacy, threatening the safety of millions of children across the country,” said FTC Chair Lina M. Khan. “The FTC will continue to use the full scope of its authorities to protect children online—especially as firms deploy increasingly sophisticated digital tools to surveil kids and profit from their data.”
“The Justice Department is committed to upholding parents’ ability to protect their children’s privacy,” said Principal Deputy Assistant Attorney General Brian Boynton. “This action is necessary to prevent the defendants, who are repeat offenders and operate on a massive scale, from collecting and using young children’s private information without any parental consent or control.”
ByteDance and its related companies allegedly were aware of the need to comply with the COPPA Rule and the 2019 consent order and knew about TikTok’s compliance failures that put children’s data and privacy at risk. Instead of complying, ByteDance and TikTok spent years knowingly allowing millions of children under 13 on their platform designated for users 13 years and older in violation of COPPA, according to the complaint.
As of 2020, TikTok had a policy of maintaining accounts of children that it knew were under 13 unless the child made an explicit admission of age and other rigid conditions were met, according to the complaint. TikTok human reviewers allegedly spent an average of only five to seven seconds reviewing each account to make their determination of whether the account belonged to a child.
The company allegedly continued to collect personal data from these underage users, including data that enabled TikTok to target advertising to them—without notifying their parents and obtaining their consent as required by the COPPA Rule. Even after it reportedly changed its policy not to require an explicit admission of age, TikTok still continued to unlawfully maintain and use personal information of children, according to the complaint.
TikTok’s practices prompted its own employees to raise concerns. As alleged, after failing to delete numerous underage child accounts, one compliance employee noted, “We can get in trouble … because of COPPA.”
In addition, the complaint alleges that TikTok built back doors into its platform that allowed children to bypass the age gate aimed at screening children under 13. TikTok allegedly allowed children to create accounts without having to provide their age or obtain parental consent to use TikTok by using credentials from third-party services like Google and Instagram. TikTok classified such accounts as “age unknown” accounts, which grew to millions of accounts, according to the complaint.
The complaint also charges that even when TikTok directed children to the TikTok Kids Mode service, a more protected version for kids, it collected and used their personal information in violation of COPPA. It further alleges that TikTok collected numerous categories of information and far more data than it needed, such as information about children’s activities on the app and multiple types of persistent identifiers, which it used to build profiles on children, while failing to notify parents about the full extent of its data collection and use practices. For example, TikTok shared this personal data with third parties such as Facebook and AppsFlyer to persuade existing Kids Mode users to use the service more after their use had declined or ceased, through a practice TikTok called “retargeting less active users,” according to the complaint.
TikTok also allegedly made it difficult for parents to request that their children’s accounts be deleted. When parents managed to navigate the multiple steps required to submit a deletion request, TikTok often failed to comply with those requests. TikTok also imposed unnecessary and duplicative hurdles for parents seeking to have their children’s data deleted. That practice allegedly continued even after the executive responsible for child safety issues told TikTok’s then-CEO, “we already have all the info that’s needed” to delete a child’s data when a parent requests it, yet TikTok would not delete it unless the parent filled out a second, duplicative form. If the parent did not do that, the executive allegedly added, “then we have actual knowledge of underage user[s] and took no action!”
The complaint also claimed that TikTok began violating the terms of the 2019 FTC order shortly after it went into effect. Two TikTok entities (previously Musical.ly and Musical.ly Inc., which ByteDance acquired in 2017 and renamed) agreed to the terms of the order to settle allegations that they violated the COPPA Rule by unlawfully collecting personal information from children under the age of 13.
Additionally, the complaint alleges that TikTok failed to:
- notify parents about all of the personal data they were collecting from children;
- obtain parental consent for the collection and use of that data;
- limit the collection, use, and disclosure of children’s personal information; and
- delete children’s personal information when requested by parents or when it was no longer needed.
The complaint asks the court to impose civil penalties against ByteDance and TikTok and to enter a permanent injunction against them to prevent future violations of COPPA. The FTC Act allows civil penalties up to $51,744 per violation, per day.
The 2019 Agreement relevantly provides:
I. INJUNCTION CONCERNING THE COLLECTION OF PERSONAL INFORMATION FROM CHILDREN
IT IS ORDERED that Defendants and Defendants’ officers, agents, employees, and attorneys, and all other persons in active concert or participation with any of them, who receive actual notice of this Order, whether acting directly or indirectly, in connection with being an operator of any Web site or online service directed to children or of any Web site or online service with actual knowledge that it is collecting or maintaining personal information from a child, are hereby permanently restrained and enjoined from violating the Children’s Online Privacy Protection Rule, 16 C.F.R. Part 312, including, but not limited to:
A. Failing to make reasonable efforts, taking into account available technology, to ensure that a parent of a child receives direct notice of Defendants’ practices with regard to the collection, use, or disclosure of personal information from children, including notice of any material change in the collection, use, or disclosure practices to which the parent has previously consented;
B. Failing to post a prominent and clearly labeled link to an online notice of its information practices with regard to children, if any, on the home or landing page or screen of its Web site or online service, and at each area of the Web site or online service where personal information is collected from children;
C. Failing to obtain verifiable parental consent before any collection, use, or disclosure of personal information from children, including consent to any material change in the collection, use, or disclosure practices to which the parent has previously consented;
D. Failing to delete a child’s personal information at the request of a parent; and
E. Retaining personal information for longer than is reasonably necessary to fulfill the purpose for which the information was collected.
II. DELETION OF CHILDREN’S PERSONAL INFORMATION AND TREATMENT OF ACCOUNTS EXISTING AT TIME OF ENTRY OF THIS ORDER
IT IS FURTHER ORDERED that Defendants and Defendants’ officers, agents, employees, and attorneys, and all other persons in active concert or participation with any of them, who receive actual notice of this Order, shall:
A. Destroy all personal information, in all forms in their possession, custody, or control, that is associated with user accounts existing at the time of entry of this Order; or
B. If, at the time of entry of this Order, Defendants operate any Web site or online service that is directed to children but that does not target children as the primary audience,
1. For users of accounts existing at the time of entry of this Order who identify as under age 13, Defendants shall destroy such users’ personal information, except that Defendants may, with the affirmative consent of the user, transfer the user’s videos to the user’s device and allow such user to retain their username, so long as that username does not function in the same manner as online contact information.
2. For users of accounts existing at the time of entry of this Order who identify as age 13 or over who were under age 13 at the time Defendants collected personal information, Defendants shall destroy such personal information, except that Defendants may, with the affirmative consent of the user, transfer the user’s videos to the user’s device and retain the user’s registration information.
3. If the age of a particular user of an existing account is not identified within forty-five (45) days after entry of the Order, Defendants shall, within forty-five (45) days of entry of the Order:
(a) remove such user’s personal information from Defendants’ Web sites and online services;
(b) refrain from disclosing or using personal information that has been removed from their Web sites and online services; and
(c) destroy such personal information within 12 months after entry of the Order. If the age of a particular user whose information has been removed is identified within 12 months after entry of the Order, Defendants shall comply with Section B(1) and B(2).
4. Personal information need not be destroyed, and may be collected, used, and disclosed, to the extent Defendants obtain verifiable parental consent for its collection, use, and disclosure.
5. Personal information need not be destroyed, and may be disclosed, to the extent requested by a government agency or as required by a law, regulation, or court order.
C. To the extent not covered in its compliance report, Defendants must submit a supplemental report, sworn under penalty of perjury, discussing whether and how Defendants are in compliance with this provision, within 15 months of entry of this Order.
Provided further, that personal information need not be destroyed, and may be collected, used, and disclosed, to the extent Defendants obtain verifiable parental consent for its collection, use, and disclosure, and may be disclosed as requested by a government agency or as required by a law, regulation or court order.
III. MONETARY JUDGMENT FOR CIVIL PENALTY
IT IS FURTHER ORDERED that:
A. Judgment in the amount of $5,700,000.00 is entered in favor of Plaintiff against Defendants, jointly and severally, as a civil penalty.
B. Defendants are ordered to pay to Plaintiff, by making payment to the Treasurer of the United States, $5,700,000.00, which, as Defendants stipulate, their undersigned counsel holds in escrow for no purpose other than payment to Plaintiff. Such payment must be made within 7 days of entry of this Order by electronic fund transfer in accordance with instructions previously provided by a representative of Plaintiff.
IV. ADDITIONAL MONETARY PROVISIONS
IT IS FURTHER ORDERED that:
A. Defendants relinquish dominion and all legal and equitable right, title, and interest in all assets transferred pursuant to this Order and may not seek the return of any assets.
B. The facts alleged in the Complaint will be taken as true, without further proof, in any subsequent civil litigation by or on behalf of the Commission, including in a proceeding to enforce its rights to any payment or monetary judgment pursuant to this Order.
C. The facts alleged in the Complaint establish all elements necessary to sustain an action by the Commission pursuant to Section 523(a)(2)(A) of the Bankruptcy Code, 11 U.S.C. § 523(a)(2)(A), and this Order will have collateral estoppel effect for such purposes.
D. Defendants acknowledge that their Taxpayer Identification Numbers, which Defendants must submit to the Commission, may be used for collecting and reporting on any delinquent amount arising out of this Order, in accordance with 31 U.S.C. § 7701.
V. INFORMATIONAL RELIEF
A. Defendants must report on their deletion obligations under penalty of perjury:
1. Defendants must submit a report within ninety (90) days of the entry of this Order summarizing their compliance with Section II of this Order; and
2. If Defendants elect to operate any Web site or online service that is directed to children but that does not target children as the primary audience, Defendants must submit a report within fifteen (15) months of the entry of this Order summarizing their compliance with Section II.B. of this Order.
VI. ORDER ACKNOWLEDGMENTS
IT IS FURTHER ORDERED that Defendants obtain acknowledgments of receipt of this Order:
A. Each Defendant, within 7 days of entry of this Order, must submit to the Commission an acknowledgment of receipt of this Order sworn under penalty of perjury.
B. For five (5) years after entry of this Order, Defendants must deliver a copy of this Order to: (1) all principals, officers, directors, and LLC managers and members; (2) all employees, agents, and representatives having managerial responsibilities for the collection, use, maintenance, or disclosure of personal information or the operation of any of Defendants’ Web sites or online services; and (3) any business entity resulting from any change in structure as set forth in the Part titled Compliance Reporting. Delivery must occur within seven (7) days of entry of this Order for current personnel. For all others, delivery must occur before they assume their responsibilities.
C. From each individual or entity to which a Defendant delivered a copy of this Order, that Defendant must obtain, within 30 days, a signed and dated acknowledgment of receipt of this Order.
VII. COMPLIANCE REPORTING
IT IS FURTHER ORDERED that Defendants make timely submissions to the Commission:
A. One year after entry of this Order, each Defendant must submit a compliance report, sworn under penalty of perjury. Each Defendant must:
1. Identify the primary physical, postal, and email address and telephone number, as designated points of contact, which representatives of the Commission and Plaintiff may use to communicate with Defendant;
2. Identify all of that Defendant’s businesses by all of their names, telephone numbers, and physical, postal, email, and Internet addresses;
3. Describe the activities of each business, including the goods and services offered, the means of advertising, marketing, and sales, and involvement of any other Defendant;
4. Describe in detail whether and how that Defendant is in compliance with each section of this Order;
5. Provide a copy of each different version of any privacy notice posted on each Web site or online service operated by that Defendant or otherwise communicated to parents of children from whom that Defendant collects personal information;
6. Provide a statement setting forth in detail any methods used to obtain verifiable parental consent prior to any collection, use, and/or disclosure of personal information from children or the methods used to avoid collecting, using, and/or disclosing personal information from children;
7. Provide a statement setting forth in detail the means provided for parents to review any personal information collected from their children and to refuse to permit its further use or maintenance;
8. Provide a statement setting forth in detail why each type of information collected from a child is reasonably necessary for the provision of the particular related activity;
9. Provide a statement setting forth in detail the procedures used to protect the confidentiality, security, and integrity of personal information collected from children; and
10. Provide a copy of each Order Acknowledgment obtained pursuant to this Order, unless previously submitted to the Commission.
B. For ten (10) years after entry of this Order, each Defendant must submit a compliance notice, sworn under penalty of perjury, within 14 days of any change in: (a) any designated point of contact; or (b) the structure of any Defendant or any entity that Defendant has any ownership interest in or controls directly or indirectly that may affect compliance obligations arising under this Order, including: creation, merger, sale, or dissolution of the entity or any subsidiary, parent, or affiliate that engages in any acts or practices subject to this Order.
C. Each Defendant must submit to the Commission notice of the filing of any bankruptcy petition, insolvency proceeding, or similar proceeding by or against such Defendant within 14 days of its filing.
D. Any submission to the Commission required by this Order to be sworn under penalty of perjury must be true and accurate and comply with 28 U.S.C. § 1746, such as by concluding: “I declare under penalty of perjury under the laws of the United States of America that the foregoing is true and correct.”
The complaint filed in the United States District Court relevantly provides:
NATURE OF THE CASE
1. Defendants operate TikTok, one of the world’s largest online social media platforms. TikTok collects, stores, and processes vast amounts of data from its users, who include millions of American children younger than 13.
2. For years, Defendants have knowingly allowed children under 13 to create and use TikTok accounts without their parents’ knowledge or consent, have collected extensive data from those children, and have failed to comply with parents’ requests to delete their children’s accounts and personal information.
3. Defendants’ conduct violates the Children’s Online Privacy Protection Act of 1998 (“COPPA”) and the Children’s Online Privacy Protection Rule (“Rule” or “COPPA Rule”), a federal statute and regulations that protect children’s privacy and safety online. It also defies an order that this Court entered in 2019 to resolve a lawsuit in which the United States alleged that TikTok Inc.’s and TikTok Ltd.’s predecessor companies similarly violated COPPA and the COPPA Rule by allowing children to create and access accounts without their parents’ knowledge or consent, collecting data from those children, and failing to comply with parents’ requests to delete their children’s accounts and information.
4. To put an end to TikTok’s unlawful massive-scale invasions of children’s privacy, the United States brings this lawsuit seeking injunctive relief, civil penalties, and other relief.
JURISDICTION AND VENUE
5. This Court has subject matter jurisdiction pursuant to 28 U.S.C. §§ 1331, 1337(a), 1345, and 1355.
6. Venue is proper in this District under 28 U.S.C. §§ 1391(b)(2), (b)(3), (c)(1), (c)(2), (c)(3), and (d), 1395(a), and 15 U.S.C. § 53(b).
PLAINTIFF
7. Plaintiff is the United States of America. Plaintiff brings this action for violations of Section 5(a) of the FTC Act, 15 U.S.C. § 45(a), Section 1303(a) of COPPA, 15 U.S.C. § 6502(a), and the COPPA Rule, 16 C.F.R. pt. 312 (effective July 1, 2013). For these violations, Plaintiff seeks a permanent injunction, civil penalties, and other relief, pursuant to Sections 5(m)(1)(A) and 13(b) of the FTC Act, 15 U.S.C. §§ 45(m)(1)(A) and 53(b), Sections 1303(c) and 1306(d) of COPPA, 15 U.S.C. §§ 6502(c), 6505(d), and the COPPA Rule, 16 C.F.R. § 312.9.
DEFENDANTS
8. Defendant TikTok Inc. is a California corporation with its principal place of business at 5800 Bristol Parkway, Suite 100, Culver City, California 90230. TikTok Inc. transacts or has transacted business in this District and throughout the United States.
9. Defendant TikTok U.S. Data Security Inc. is a Delaware corporation with its principal place of business shared with TikTok Inc. TikTok U.S. Data Security Inc. transacts or has transacted business in this District and throughout the United States.
10. Defendant ByteDance Ltd. is a Cayman Islands company. It has had offices in the United States and in other countries. ByteDance Ltd. transacts or has transacted business in this District and throughout the United States.
11. Defendant ByteDance Inc. is a Delaware corporation with its principal place of business at 250 Bryant Street, Mountain View, California, 94041. ByteDance Inc. transacts or has transacted business in this District and throughout the United States.
12. Defendant TikTok Pte. Ltd. is a Singapore company with its principal place of business at 8 Marina View, Level 43, Asia Square Tower 1, Singapore 018960. TikTok Pte. Ltd. transacts or has transacted business in this District and throughout the United States.
13. Defendant TikTok Ltd. is a Cayman Islands company with its principal place of business in Singapore or Beijing, China. TikTok Ltd. transacts or has transacted business in this District and throughout the United States.
COMMON ENTERPRISE
14. Defendants are a series of interconnected companies that operate the TikTok social media platform. Defendant ByteDance Ltd. is the parent and owner of Defendants ByteDance, Inc. and TikTok Ltd. TikTok Ltd. owns Defendants TikTok LLC and TikTok Pte. Ltd. TikTok LLC in turn owns Defendant TikTok Inc., which owns Defendant TikTok U.S. Data Security Inc.
15. Upon information and belief, a group of ByteDance Ltd. and TikTok Inc. executives, including Zhang Yiming, Liang Rubo, Zhao Penyuan, and Zhu Wenjia, direct and control TikTok’s core features and development. Since 2019, ByteDance Ltd. and TikTok Inc. have promoted TikTok in the United States, spending hundreds of millions of dollars on advertising, employing U.S.-based staff and executives, and developing and distributing TikTok to run on Apple and Android devices.
16. ByteDance Inc. and TikTok Inc. have responsibilities for developing, providing, and supporting TikTok in the United States.
17. TikTok Pte. Ltd. serves as the U.S. distributor of TikTok through the Apple App Store and Google Play Store.
18. TikTok Ltd. identifies itself as the developer of TikTok in the Apple App Store, and TikTok Pte. Ltd. identifies itself as the developer of TikTok in the Google Play Store. The tiktok.com domain is registered to TikTok Ltd.
19. Beginning in 2023, TikTok Inc. transferred personal information of children to TikTok U.S. Data Security Inc., which has maintained that data without notice to those children’s parents or parental consent.
20. Defendants share officers and directors. For example, TikTok Inc.’s chief executive officers between 2020 and the present (Kevin Mayer, V Pappas, and Shou Zi Chew) have simultaneously held senior positions at ByteDance Ltd., and ByteDance Ltd.’s chief executive officers (Zhang Yiming and Liang Rubo) have simultaneously served as directors of TikTok Ltd. TikTok Inc.’s Global Chief Security Officer, Roland Cloutier, also served as cyber risk and data security support for ByteDance Ltd. ByteDance Inc. and TikTok Pte. Ltd.’s officers and directors have also overlapped with each other, and with officers and directors of TikTok Inc. Defendants intertwine their finances; for example, ByteDance Ltd. provides compensation and benefits to TikTok Inc.’s CEO, and TikTok Inc. employees participate in ByteDance Ltd.’s stock option plan.
21. Defendants have one centralized bank account for ByteDance Ltd.’s more than a dozen products, including TikTok. Defendants operate on a “shared services” model in which ByteDance Ltd. provides legal, safety, and privacy resources, including personnel. ByteDance’s largest shareholder, Zhang Yiming, signed the 2019 consent order with the United States on behalf of Musical.ly, TikTok Ltd.’s predecessor company.
22. Defendants have operated as a common enterprise while engaging in the unlawful acts and practices alleged below.
COMMERCE
23. At all times relevant to this Complaint, Defendants have maintained a substantial course of trade in or affecting commerce, as “commerce” is defined in Section 4 of the FTC Act, 15 U.S.C. § 44.
THE CHILDREN’S ONLINE PRIVACY PROTECTION ACT AND RULE
24. Congress enacted COPPA in 1998 to protect the safety and privacy of children online by prohibiting operators of Internet websites and online services from the unauthorized or unnecessary collection of information of children younger than 13 years old. COPPA directed the FTC to promulgate a rule implementing COPPA. The FTC promulgated the COPPA Rule on November 3, 1999, under Section 1303(b) of COPPA, 15 U.S.C. § 6502(b), and Section 553 of the Administrative Procedure Act, 5 U.S.C. § 553. The Rule went into effect on April 21, 2000. The FTC promulgated revisions to the Rule that went into effect on July 1, 2013. Pursuant to COPPA Section 1303(c), 15 U.S.C. § 6502(c), and Section 18(d)(3) of the FTC Act, 15 U.S.C. § 57a(d)(3), a violation of the Rule constitutes an unfair or deceptive act or practice in or affecting commerce, in violation of Section 5(a) of the FTC Act, 15 U.S.C. § 45(a).
25. The COPPA Rule applies to any operator of a commercial website or online service directed to children. It also applies to any operator of a commercial website or online service that has actual knowledge that it collects, uses, and/or discloses personal information from children. The Rule requires an operator to meet specific requirements prior to collecting, using, or disclosing children’s personal information online. These requirements include:
a) Posting a privacy policy on its website or online service providing clear, understandable, and complete notice of its information practices, including what information the operator collects from children online, how it uses such information, its disclosure practices for such information, and other specific disclosures set forth in the Rule;
b) Providing clear, understandable, and complete notice of its information practices, including specific disclosures, directly to parents;
c) Obtaining verifiable parental consent prior to collecting, using, and/or disclosing children’s personal information;
d) Providing reasonable means for parents to review personal information collected from children online, at a parent’s request; and
e) Deleting personal information collected from children online, at a parent’s request.
THE 2019 PERMANENT INJUNCTION
26. Musical.ly was a video-based platform with millions of U.S. child users. In February 2019, the United States filed a complaint against Musical.ly and Musical.ly, Inc. alleging violations of the COPPA Rule, 16 C.F.R. pt. 312, and Section 5 of the FTC Act, 15 U.S.C. § 45. See United States v. Musical.ly, et al., No. 2:19-cv-01439-ODW-RAO (C.D. Cal. Feb. 27, 2019) (Dkt. No. 1).
27. On March 27, 2019, this Court entered a Stipulated Order for Civil Penalties, Permanent Injunction, and Other Relief against Musical.ly and Musical.ly, Inc. United States v. Musical.ly, et al., No. 2:19-cv-01439-ODW-RAO (C.D. Cal. Mar. 27, 2019) (Dkt. No. 10) (the 2019 Permanent Injunction). The order imposed a $5.7 million civil penalty; required Defendants to destroy personal information of users under the age of 13 and, by May 2019, remove accounts of users whose age could not be identified; enjoined Defendants from violating the COPPA Rule; and required Defendants to retain certain records related to compliance with the COPPA Rule and the 2019 Permanent Injunction.
28. In April 2019, Musical.ly was renamed TikTok Ltd., and in May 2019, Musical.ly Inc. was renamed TikTok Inc. The renaming did not alter the companies’ compliance obligations under the 2019 Permanent Injunction.
DEFENDANTS’ BUSINESS ACTIVITIES
29. Since before 2019, Defendants have operated TikTok, a video-based social media platform that consumers may access via the Internet or through a downloadable software application or “app.” In November 2017, ByteDance Ltd. purchased Musical.ly and, in 2018, merged it into TikTok.
30. The TikTok platform allows users to create, upload, and share short-form videos. The TikTok app is free to download. It generates revenue for Defendants through advertising and eCommerce, including through the TikTok for Business platform, as well as in-app purchases of TikTok “coin” through the TikTok Shop.
31. TikTok features a “For You” feed in which an algorithm subject to Defendants’ control selects videos for each user based on its determination of their interests, pushes those videos to the user, and plays them.
32. TikTok’s algorithms are trained on data collected from users via the TikTok platform and from third-party sources. Such data include videos viewed, “liked,” or shared, accounts followed, comments, content created, video captions, sounds, and hashtags, as well as device and account settings such as language preference, country setting, and device type.
33. As of 2024, there are more than 170 million TikTok users in the United States, including many children and teens. In 2022, two-thirds of U.S. teens reported using TikTok, including about 61% of teens aged 13 or 14. By late 2023, nearly half of U.S. teens reported using TikTok multiple times a day.
DEFENDANTS’ UNLAWFUL CONDUCT
34. Defendants have known of COPPA, the COPPA Rule, and their requirements since at least 2017, directly or through their predecessors and affiliates, including through Musical.ly’s and Musical.ly, Inc.’s agreement to the 2019 Permanent Injunction, which requires compliance with COPPA and the COPPA Rule.
35. TikTok is directed to children (i.e., individuals under age 13, as used herein and in COPPA and the Rule). An online service that does not target children as its primary audience is not deemed directed to children under the COPPA Rule if it satisfies certain criteria. Defendants purport to satisfy these criteria by requiring users creating accounts to report their birthdates. As described in this Complaint, however, Defendants have allowed children to bypass or evade this “age gate” and collected personal information even from individuals who identify themselves as children. Further, as described in this Complaint, Defendants have actual knowledge that they are collecting personal information from children.
36. Defendants have violated COPPA and the COPPA Rule through the conduct described in this Complaint, including by (1) knowingly creating accounts for children and collecting data from those children without first notifying their parents and obtaining verifiable parental consent; (2) failing to honor parents’ requests to delete their children’s accounts and information; and (3) failing to delete the accounts and information of users they know are children.
37. Each time Defendants have collected a child’s personal information without parental notice or verifiable consent, or have failed to delete that information at the request of the child’s parents or upon learning it was collected from a child whose parents were not notified or did not provide verifiable consent, Defendants violated COPPA and the COPPA Rule.
38. Defendants’ conduct has resulted in millions of children using TikTok, but the precise magnitude of Defendants’ violations is difficult to determine due to their failure to comply with the 2019 Permanent Injunction’s requirement that they keep records demonstrating their COPPA compliance.
I. Defendants Have Knowingly Created Accounts for Children and Collected Those Children’s Data Without Parental Notice or Consent.
39. Since at least March 2019, Defendants have offered in the United States what they refer to as TikTok for Younger Users or “Kids Mode” (hereinafter “Kids Mode”) to children who identify themselves as being under 13 when they create an account, and a regular TikTok experience to other users. However, Defendants have knowingly allowed children under 13 to create accounts in the regular TikTok experience and collected extensive personal information from those children without first providing parental notice or obtaining verifiable parental consent, as required by the COPPA Rule. Defendants have also violated the COPPA Rule by collecting, without parental notice and consent, several varieties of personal information from children with Kids Mode accounts, and by using children’s information in ways that the COPPA Rule prohibits.
A. Defendants Allowed Children to Evade or Bypass TikTok’s Age Gate
40. Since at least March 2019, when consumers in the United States attempt to create a TikTok account, they generally have had to go through the platform’s “age gate” by providing a birthday (day, month, and year). If a consumer indicates that they are 13 or older, they are prompted for a username, password, and email address or phone number. Defendants then create a regular account for the user, and the user can view, create, post, and share videos, as well as message other TikTok users.
41. For TikTok users who self-identify as 13 or older at the age gate, Defendants collect a wide variety of personal information, such as first and last name, age, email address, phone number, persistent identifiers for the device(s) used to access TikTok, social media account information, and profile image(s), as well as photographs, videos, and audio files containing the user’s image and voice and the metadata associated with such media (such as when, where, and by whom the content was created).
42. Over time, Defendants collect increasingly more information from these users, including usage information, device information, location data, image and audio information, metadata, and data from cookies and similar technologies that track users across different websites and platforms.
43. Since at least March 2019, if a U.S. consumer inputs into the age gate a birthday indicating they are a child under 13 years old, the child generally is prompted to provide a username (that does not include any personal information) and a password. The TikTok platform then creates an account for that child in Kids Mode. Defendants do not notify parents or obtain parental consent for Kids Mode accounts.
44. In Kids Mode, a user can view videos but cannot create or upload videos, post information publicly, or message other users. Defendants still collect and use certain personal information from children in Kids Mode.
45. Defendants’ methodologies for screening out child users are deficient in multiple ways. Until at least late 2020, if a child in the U.S. submitted a birthday reflecting that they were under 13 years old, the TikTok platform did not prevent the child from evading the age gate by trying again: i.e., restarting the account creation process and giving the age gate a birthday indicating they were 13 or older, even though by that point Defendants knew from the birthday the user had previously provided that the user was a child.
46. Until at least May 2022, Defendants offered consumers a way to avoid the TikTok age gate altogether when creating a TikTok account, by allowing them to use login credentials from certain third-party online services, including Instagram and Google. Defendants internally identified these TikTok accounts as “age unknown” accounts.
47. For example, Defendants allowed children to create TikTok accounts without age gating them by letting children use login credentials from Instagram, even though Instagram did not itself require users to disclose their age or date of birth to create an Instagram account until at least December 2019.
48. Defendants also allowed children to create TikTok accounts without age gating by letting children use login credentials from Google, even though Google allowed children under the age of 13 to create Google accounts with parental consent; a Google login therefore did not establish that a user was 13 or older.
49. Defendants’ insufficient policies and practices thus allowed children to create a non-Kids Mode TikTok account, gaining access to adult content and features of the general TikTok platform without providing age information. Without parental notice or consent, Defendants then collected and maintained vast amounts of personal information from the children who created and used these regular TikTok accounts.
50. These policies and practices led to the creation of millions of accounts for which Defendants did not know the age of the user.
51. Defendants did not start requiring all users to go through a TikTok age gate until at least 2022, closing what employees internally described in early 2021 as an age gate “loophole.”
B. Defendants Failed to Comply with COPPA and the COPPA Rule Even for Accounts in “Kids Mode”
52. In Kids Mode, Defendants collect and maintain a username, password, and birthday (day, month, and year). They have also collected several types of persistent identifiers from Kids Mode users without notifying parents or obtaining their consent, including IP address and unique device identifiers.
53. The COPPA Rule permits operators to collect a persistent identifier from children under certain circumstances without first obtaining verifiable parental consent, but only if no other personal information is collected and the identifier is used for the sole purpose of providing support for the online service’s internal operations. See 16 C.F.R. § 312.4(c)(7). Defendants’ collection and use of persistent identifiers from Kids Mode users do not comply with this provision.
54. Defendants additionally collect dozens of other types of information concerning child users with Kids Mode accounts—including app activity data, device information, mobile carrier information, and app information—which they combine with persistent identifiers and use to amass profiles on children.
55. Defendants did not need to collect all of the persistent identifiers they have collected from users in Kids Mode to operate the TikTok platform.
56. Until at least mid-2020, Defendants shared information they collected from children in Kids Mode with third parties for reasons other than support for internal operations. Defendants did not notify parents of that practice.
57. For example, Defendants shared this information with Facebook and AppsFlyer, a marketing analytics firm, in part to encourage existing Kids Mode users whose use had declined or ceased to use Kids Mode more frequently. Defendants called this process “retargeting less active users.” This practice used children’s personal information for reasons beyond support for the internal operations of Kids Mode and thus was not permitted by the COPPA Rule.
58. Separately, users in Kids Mode can send feedback to TikTok using an in-app “Report a Problem” function. When a child does so, Defendants require the child to enter the child’s email address.
59. Between February 2019 and July 2022, for example, Defendants collected over 300,000 problem reports from users in Kids Mode that included children’s email addresses.
60. Defendants did not delete these children’s email addresses after processing the reports, and thus retained these email addresses longer than reasonably necessary to fulfill the purpose for which the information was collected, in violation of the Rule. See 16 C.F.R. § 312.10. Defendants did not notify parents of this ongoing practice.
II. Defendants Have Obstructed and Failed to Honor Parents’ Requests to Delete Their Children’s Accounts and Data.
61. Since 2019, Defendants have allowed millions of children to create general TikTok accounts—i.e., accounts outside of Kids Mode.
62. Many children create and use a general TikTok account without their parents’ knowledge. Frequently, however, a parent becomes aware that their child has a general TikTok account and seeks to have it and its associated data deleted.
63. The COPPA Rule and the 2019 Permanent Injunction require Defendants to delete personal information collected from children at their parents’ request. Nevertheless, in many instances Defendants have obstructed parents’ ability to make such requests and have failed to comply with these requests.
A. Defendants Maintained an Unreasonable Process for Parents to Request Deletion of their Children’s Data
64. Defendants failed to create a simple process for parents to submit a deletion request. For example, the word “delete” does not appear in many of Defendants’ online parental guidance materials, such as TikTok’s “Guardian’s Guide,” the “Privacy and Security on TikTok” page, TikTok’s “New User Guide,” and other materials on tiktok.com such as the “Parental Controls Guide” and “The Parent’s Guide to TikTok.”
65. Parents must navigate a convoluted process to figure out how to request deletion of their child’s account and information. For example, as recently as 2023, a parent visiting tiktok.com to request deletion of their child’s TikTok account and information had to scroll through multiple webpages to find and click on a series of links and menu options that gave no clear indication they apply to such a request. Parents then had to explain in a text box that they are a parent who wanted their child’s account and data to be deleted.
66. At times, Defendants also directed parents to send their requests to delete their children’s accounts and personal information to an email address. As detailed below, in many cases Defendants failed to respond in a timely manner to these requests, or simply failed to respond to them at all.
67. Even if a parent succeeded in submitting a request to delete their child’s account and information, Defendants often did not honor that request. In response to each request, Defendants’ staff would review the account for “objective indicators” that the account holder was under 13, or “underage,” based on the user’s handle, biography or “bio,” [redacted]. Under Defendants’ policy, an account would be identified as an underage account and deleted only if the reviewed elements contained an explicit admission that the user was under 13—for example, “I am in first grade” or “I am 9 years old.” To determine whether a child was younger than 13, Defendants instructed reviewers to use [redacted].
68. If the account failed to meet Defendants’ rigid criteria, Defendants’ policy until recently was to respond to the underage account deletion request by asking the parent to complete and sign a form confirming their relationship to the child and the nature of the request. The parent had to certify under penalty of perjury that they were the parent or guardian of the account user. Defendants required parents to complete the form regardless of whether the parent had already provided Defendants with all of the information the form requested.
69. If a parent or guardian did not submit the secondary form, Defendants would not delete the child’s regular TikTok account, which remained active.
70. Defendants’ policies and practices subverted parents’ efforts to delete their children’s accounts and resulted in Defendants retaining children’s accounts—and personal information—even though their parents identified them as children and asked TikTok to delete their accounts.
71. Defendants were well aware this was occurring. For example, in a 2018 exchange, a high-level employee of Defendants explicitly acknowledged that Defendants had “actual knowledge” of children on TikTok upon receiving the first parental request, and yet did not delete children’s accounts upon receiving the request. In the exchange, the former CEO of TikTok Inc. communicated about underage users on TikTok with the executive responsible for child safety issues in the United States. The employee in charge of child safety issues questioned why parents had to fill out a second form after they already provided the necessary information, noting: “Why we reply with this template everytime [sic] when we already have all the info that’s needed? [I]n this case, we already have the username, the name of the reporter, and the age, yet we still reply with the template.” He added that if the person reporting the account “doesn’t reply then we have actual knowledge of underage user and took no action!”
72. Despite this awareness that they were failing to respect parents’ deletion requests, Defendants continued using this flawed process through 2023.
B. Defendants Failed to Delete Children’s Data upon Parental Request and Cease Collecting Children’s Personal Information
73. In addition to using what they knew to be a flawed process to address parents’ deletion requests, Defendants in many cases did not respond to parents’ requests at all. As of late December 2020, Defendants had a backlog of thousands of emails dating back months requesting that TikTok delete individual children’s accounts.
74. Defendants’ inadequate policies and inaction led to numerous children continuing to maintain regular TikTok accounts even though their parents had asked Defendants to delete those accounts. In a sample of approximately 1,700 children’s TikTok accounts about which Defendants received complaints and deletion requests between March 21, 2019, and December 14, 2020, approximately 500 (30%) remained active as of November 1, 2021. Several hundred of these accounts were still active in March 2023. This sample of children’s accounts is likely a small fraction of the thousands of deletion requests Defendants received and failed to act on.
75. Many parents made multiple requests for Defendants to remove their children’s account and personal information. On at least some occasions, even when a parent or guardian completed Defendants’ secondary form, Defendants still failed to delete their children’s accounts and information.
76. Compounding these problems, even when Defendants did delete a child’s account and personal information at their parent’s request, at least until recently, Defendants did nothing to prevent the same child from re-creating their account with the same device, persistent identifiers, and email address or phone number as before. This means that a child whose account has been removed could simply create a new account.
III. Defendants Have Failed to Delete Children’s Accounts and Information Identified by Their Own Systems and Employees.
77. Defendants purport to use technology, user reports, and human moderation to identify children’s TikTok accounts so that those accounts and the information collected from them can be deleted. But Defendants know their processes and policies are deficient, and they fail to delete accounts and information that even their own employees and systems identify as belonging to children.
A. Defendants’ “Keyword Matching” Process
78. Since approximately 2020, Defendants have used “keyword matching” purportedly to identify children’s accounts for deletion. Defendants’ keyword matching process searches users’ profiles for terms deemed likely to correspond to child accounts—for example, “4th grade” and “9 years old”—and submits accounts that include those terms for review and potential removal. Defendants’ keyword matching practices have proven woefully deficient.
79. Defendants’ human content moderators review accounts flagged as potentially belonging to children by the keyword matching process or by other methods. Similar to Defendants’ restrictive approach to parental deletion requests, the content moderators who review accounts may delete them as belonging to children only if rigid criteria are satisfied. For example, under the policy, an account can be marked as underage and deleted only if either there is an explicit admission of an age under 13 or [redacted].
80. Earlier versions of the policy were even more restrictive. For example, to mark and delete an account as underage, the policy between the spring of 2020 and early 2021 required an explicit admission of age, regardless of what videos the account had posted. The pre-April 2020 version of the policy required both (i) an explicit admission of age and (ii) that [redacted].
81. Defendants’ content moderators are not told why an account was flagged as possibly underage and cannot access any videos posted by the user beyond [redacted], even though the account may have dozens or hundreds of videos revealing that the user is a child. The moderators cannot view other information about the accounts they are reviewing either, including the videos watched by the user or the accounts the user follows. If the policy’s rigid criteria are not met, content moderators have no discretion to designate an account as underage; they must allow any such account to remain on the platform even if they know the account holder is in fact a child.
82. Defendants have also failed to allow content moderators sufficient time to conduct even the limited review they permit. At times since entry of the 2019 Permanent Injunction, TikTok has had tens of millions of monthly active users in the United States. Meanwhile, TikTok Inc.’s content moderation team included fewer than two dozen full-time human moderators responsible for identifying and removing material that violated all of its content-related policies, including identifying and deleting accounts of unauthorized users under age 13.
83. During at least some periods since 2019, TikTok Inc.’s human moderators spent an average of only five to seven seconds reviewing each account flagged by a keyword to determine if it belonged to a child.
84. The deficiency of Defendants’ policies is shown by the fact that regular TikTok accounts belonging to children can be easily found by searching for the same basic terms and variations used by Defendants’ keyword matching algorithm. Some of these accounts have existed for long periods—able to garner hundreds of followers and hundreds or even thousands of “likes,” a sign of approval by other TikTok users.
85. By adhering to these deficient policies, Defendants actively avoid deleting the accounts of users they know to be children. Instead, Defendants continue collecting these children’s personal information, showing them videos not intended for children, serving them ads and generating revenue from such ads, and allowing adults to directly communicate with them through TikTok.
B. Accounts Referred from Video Moderation Queues
86. Many accounts that belong to children come to Defendants’ attention when one user reports another user’s video as violating one of Defendants’ policies. Those videos are then added to “video queues” and reviewed by human content moderators who determine whether the videos comply with Defendants’ policies. If those content moderators encounter a video that depicts a child under 13, they can apply labels to designate suspected child users, such as “Content Depicting Under the Age of Admission” or “Suspected Underaged User.” These moderators can remove a specific video from TikTok, but they lack authority to delete or remove the account even if it is clearly the account of a child. Instead, by applying the labels, they refer the video to the separate content moderation team that assesses whether accounts belong to underage users (the “underage queue”).
87. Until at least October 2022, however, this process did not work. Accordingly, when Defendants’ moderators tagged specific videos as depicting a child under 13, the associated accounts were not actually referred to the team authorized to delete the associated account. Instead, those accounts remained live, and Defendants continued to collect and retain those children’s personal information and to show them videos and messages from regular TikTok users. Due to Defendants’ recordkeeping deficiencies, detailed below, they cannot identify the number of accounts affected by this issue. The limited records Defendants do have, however, make clear that millions of accounts were involved.
C. Accounts Identified in Quality Assurance Reviews
88. Defendants conduct quality assurance reviews of the content moderation processes described above. The quality assurance reviews require content moderators to re-review a subset of previously reviewed accounts or videos. This process aims to identify instances in which TikTok content moderators incorrectly applied company policies to those accounts or videos.
89. Until at least September 2022, however, when Defendants’ quality assurance analysts identified a specific account that a moderator incorrectly failed to flag for deletion as belonging to a child, Defendants did not then go back and delete the account. Instead, the account remained live. Accordingly, Defendants failed to delete numerous children’s accounts that their own quality assurance team specifically identified as belonging to children.
D. Accounts That Moderators Have Marked “Ban as Underage”
90. Even where accounts satisfied Defendants’ rigid criteria, were identified as belonging to children, and were marked for deletion, Defendants failed to delete many of the accounts.
91. Internal communications reveal that Defendants’ employees were aware of this issue. In a September 2021 online chat, for example, employees discussed the fact that accounts were being marked as banned for underage but were not being deleted, and suggested this had been occurring since mid-July 2020. One employee noted that she was seeing this “a lot” and “I run across usually like 3-4 accounts [like that] a day,” while another noted “[t]hat shouldn’t be happening at all or we can get in trouble … because of COPPA.”
92. Even though Defendants were aware of this problem, and the 2019 Permanent Injunction required them to maintain records regarding their COPPA compliance or lack thereof, they failed to retain records documenting this issue and the accounts affected. The extremely limited records Defendants have produced to the government reveal that even for small segments of the time period at issue, at least several hundred accounts were affected.
E. Data Collected From Purportedly Deleted Accounts
93. Defendants retain children’s personal information long after they identify an account as belonging to a child and determine they should delete information related to the account. For example, Defendants retain app activity log data related to children for 18 months.
94. Moreover, Defendants have retained children’s information in numerous database locations long after purportedly deleting their accounts. Defendants have not documented what information collected from users is saved in what locations or why, and they have been unable to explain how or why the information was in those locations, or why it was not deleted.
95. Defendants have also failed to delete information children posted to TikTok that was later incorporated into other users’ videos, even when Defendants possessed identifiers linking the information to an account that they deleted because it belonged to a child. For example, until at least 2022, Defendants retained sound recordings of numerous children from accounts Defendants had determined belonged to children, and those sound recordings continued to appear in other users’ videos.
96. Similarly, Defendants retained profile photographs of users that Defendants knew to be children. For example, TikTok allows users to include in their videos another user’s comment, which is displayed alongside the commenter’s photograph and username. When Defendants did “delete” the account of a child, that child’s comments remained in other users’ posts, along with their photograph and username. These images had unique identifiers that tied each child’s photograph, username, and comment to an account that Defendants knew had been deleted because it belonged to a child.
IV. Defendants’ Violations Have Occurred on a Massive Scale.
A. Defendants’ Policies Result in Millions of Children Using TikTok
97. As discussed above, Defendants adopted and implemented inadequate and ineffective policies to stop children from creating general TikTok accounts and to remove those accounts when they were discovered. As a result, for years millions of American children under 13 have been using TikTok and Defendants have been collecting and retaining children’s personal information.
98. Defendants’ internal analyses show that millions of TikTok’s U.S. users are children under the age of 13. For example, the number of U.S. TikTok users that Defendants classified as age 14 or younger in 2020 was millions higher than the U.S. Census Bureau’s estimate of the total number of 13- and 14-year-olds in the United States, suggesting that many of those users were children younger than 13.
99. Third-party studies shared with TikTok Inc. similarly show that in the United States and other countries, child usage of TikTok is common and large numbers of children have regular TikTok accounts. In fact, regulators in other countries, including the Netherlands, Ireland, and the United Kingdom, have fined Defendants for impermissibly collecting data from children.
100. Defendants and their employees have long known that children misrepresent their ages to pass through TikTok’s age gate, and that despite other measures purportedly designed to remove children from the platform, children are ubiquitous.
101. In January 2020, for example, a TikTok moderator recognized that Defendants maintain accounts of children despite the “fact that we know the user is U13,” i.e., under age 13, so long as the child’s profile does not admit that fact explicitly. Another employee admitted that TikTok moderators were required to ignore any “external information” indicating that a user under review is a child.
102. As another example, in a July 2020 chat, one of Defendants’ employees circulated the profiles of numerous underage users he had identified “literally through one minute of scanning,” noting “[t]his is incredibly concerning and needs to be addressed immediately.”
103. Defendants have other methods to identify and remove children’s accounts from the general TikTok platform but do not use them for that purpose. For example, TikTok has its own age-determining technology—“grade level,” the algorithm for which is based on users’ behavior and other metrics—for purposes such as advertising. Unlike TikTok’s age gate, this method is based on observable behaviors and not solely users’ self-reported age. Defendants have not used it to attempt to identify children on the platform so that their accounts can be removed.
104. In a November 2019 message, a company employee told TikTok Inc.’s then-head of content partnerships, who led its relationships with major brands, that “we have two age level . . . one is age gate and one is grade level.” He continued that the age gate is “filled in by users themselves” and “many of them will fill in false information,” while “grade level [is] calculated by algorithm . . . through user’s behavior or other metrics, which are more accurate.” He went on that, for purposes of a search, “I used grade level so we will see many users under 13.”
105. Not only do Defendants not use their grade level technology to identify and remove children from the TikTok general platform, but they appear to have programmed grade level to avoid gaining knowledge that users were under 13. In 2020, Defendants’ lowest age group band was for ages under 15, meaning that it would not identify users as under 13 specifically. Defendants later revised this age cutoff so that the lowest age segment was under 16.
B. Defendants Failed to Keep Records Required by the 2019 Permanent Injunction
106. The 2019 Permanent Injunction required TikTok Inc. and TikTok Ltd. to create and maintain all records necessary to demonstrate full compliance with the 2019 Permanent Injunction, including records to show full compliance with COPPA and the COPPA Rule. Defendants have failed to create and maintain all such records.
107. First, when Defendants identified issues concerning their COPPA compliance, they frequently failed to maintain records that would be needed to show how many accounts were affected, which accounts were affected, and what, if anything, was done to remedy the issues. For example, as noted above, Defendants did not maintain records regarding accounts that were referred to the underage queue from the video queue but not actually reviewed, or regarding their failure to delete children’s accounts that had been designated as underage.
108. Further, Defendants have failed to create or maintain records sufficient to document their moderators’ review of regular accounts identified as potentially belonging to children and the actions taken as a result. When asked by the United States for documentation of certain specific accounts of children, Defendants initially produced no records and claimed their account records were “not intended to be reviewed in the ordinary course of business.” The records Defendants subsequently produced do not make it possible to systematically determine what action has been taken on specific accounts and why.
109. Additionally, Defendants’ employees use Feishu (sometimes referred to as Lark), a ByteDance Ltd. corporate messaging and office collaboration platform, to communicate with each other. Defendants enabled features in Feishu, such as one called “recall,” that allow employees to easily erase internal communications, leaving no record of the communication. Employees used the feature to delete messages permanently, including, potentially, messages relevant to compliance with the 2019 Permanent Injunction and COPPA. Defendants did not change this practice until at least May 2023.
110. Defendants enabled another feature in Feishu that allows employees to choose when their communications will be deleted.
111. A late 2021 risk assessment for Defendant ByteDance Ltd. found that the company was incapable of extracting accurate and usable records about and from internal Lark messages. The risk assessment found that because they used Feishu, Defendants lacked a reliable way to memorialize the vast majority of employees’ business communications and could not assure preservation in compliance with government investigations and litigation subpoenas.
C. TikTok Inc. Misrepresented Its Remedial Conduct to the FTC
112. On June 12, 2020, TikTok Inc. stated to the FTC that “[o]n May 11, 2019 . . . [it] took offline all US accounts that did not go through [its then-recently imposed] age gate. These accounts . . . were not accessible to the Company. TikTok did not use or disclose the information for any purpose.” TikTok Inc. also stated that it “completed on May 24, 2020” the deletion of children’s data as required by the 2019 Permanent Injunction. V Pappas, as “GM of TikTok,” certified on TikTok Inc.’s behalf under penalty of perjury that the prior statement was true and correct.
113. After follow-up inquiry by the FTC, TikTok Inc. acknowledged that its June 12, 2020, claims had been false. In fact, TikTok Inc. had retained and been using data that it previously represented it “did not use,” was “not accessible” to it, and was “delet[ed].” That data included personal information and other data of child, teen, and adult users, including IP addresses, device IDs, device models, and advertising IDs.
* * *
114. Based on the facts and violations of law alleged in this Complaint, the United States has reason to believe that Defendants are violating or are about to violate COPPA, the COPPA Rule, and the FTC Act.
VIOLATIONS OF COPPA, THE COPPA RULE, AND THE FTC ACT
115. Paragraphs 1 through 114 are incorporated as if set forth herein.
116. Defendants are “operators” under 16 C.F.R. § 312.2 and thus are subject to the COPPA Rule.
117. Defendants collect personal information from children through the TikTok app and website, which are both online services or websites directed to children. Defendants have actual knowledge that they are collecting personal information from children.
118. In numerous instances, in connection with the acts and practices described above, Defendants collected, used, and disclosed personal information from children in violation of COPPA and the COPPA Rule, including by:
a) Failing to provide notice on their website or online service of what information they collect from children, how they use such information, their disclosure practices, and other content required by the Rule, in violation of Sections 312.3(a) and 312.4(d) of the Rule, 16 C.F.R. §§ 312.3(a), 312.4(d);
b) Failing to make reasonable efforts to provide direct notice to parents of what information they collect online from children, how they use such information, their disclosure practices for such information, and other content required by the Rule, in violation of Sections 312.4(b) and 312.4(c) of the Rule, 16 C.F.R. §§ 312.4(b)–(c);
c) Failing to obtain consent from parents before any collection, use, or disclosure of personal information from children, in violation of Sections 312.3(b) and 312.5(a)(1) of the Rule, 16 C.F.R. §§ 312.3(b), 312.5(a)(1);
d) Failing to provide a reasonable means for a parent to refuse to permit the further use or maintenance of any personal information collected from a child, in violation of Sections 312.3(c) and 312.6(a)(2)-(3) of the Rule, 16 C.F.R. §§ 312.3(c), 312.6(a)(2)-(3);
e) Failing to provide parents the opportunity at any time to direct Defendants to delete personal information collected from children, in violation of Section 312.6(a)(2) of the Rule, 16 C.F.R. § 312.6(a)(2);
f) Failing to delete, at the request of parents, personal information collected from children, in violation of Section 312.6(a)(2) of the Rule, 16 C.F.R. § 312.6(a)(2);
g) Retaining personal information collected online from children for longer than reasonably necessary to fulfill the purpose for which the information was collected, in violation of Section 312.10 of the Rule, 16 C.F.R. § 312.10;
h) Failing to timely delete personal information collected from children in order to respond on a one-time basis to a specific request, in violation of Section 312.5(c)(3) of the Rule, 16 C.F.R. § 312.5(c)(3);
i) Failing to limit their collection of children’s personal information for which they lacked verifiable parental consent to only the limited information permitted by the Rule’s exceptions to prior parental consent requirements, in violation of Section 312.5(c) of the Rule, 16 C.F.R. § 312.5(c);
j) Failing to limit use of children’s personal information for which they lacked verifiable parental consent to solely the purposes permitted by the Rule (such as the use of a persistent identifier for the sole purpose of providing support for the internal operations of their website or online service, permitted by Section 312.5(c)(7) of the Rule), in violation of Section 312.5(c) of the Rule, 16 C.F.R. § 312.5(c); and
k) Conditioning children’s participation in the online service on the disclosure of more personal information than is reasonably necessary to participate, in violation of Section 312.7 of the Rule, 16 C.F.R. § 312.7.
119. Pursuant to Section 1303(c) of COPPA, 15 U.S.C. § 6502(c), and Section 18(d)(3) of the FTC Act, 15 U.S.C. § 57a(d)(3), a violation of the Rule constitutes an unfair or deceptive act or practice in or affecting commerce, in violation of Section 5(a) of the FTC Act, 15 U.S.C. § 45(a).
120. Defendants violated the Rule as described above with the knowledge required by Section 5(m)(1)(A) of the FTC Act, 15 U.S.C. § 45(m)(1)(A).
121. Each collection, use, or disclosure of a child’s personal information in which Defendants violated the Rule in any of the ways described above constitutes a separate violation for which Plaintiff seeks monetary civil penalties. 15 U.S.C. § 45(m)(1)(A).
122. Each day Defendants maintained data collected in violation of the Rule, or otherwise continued to collect such data, is a continuing failure to comply with the Rule and constitutes a separate violation under 15 U.S.C. § 45(m)(1)(C).
123. Section 5(m)(1)(A) of the FTC Act, 15 U.S.C. § 45(m)(1)(A), as modified by Section 4 of the Federal Civil Penalties Inflation Adjustment Act of 1990 and Section 701 of the Federal Civil Penalties Inflation Adjustment Act Improvements Act of 2015, 28 U.S.C. § 2461, and Section 1.98(d) of the FTC’s Rules of Practice, 16 C.F.R. § 1.98(d), authorizes this Court to award monetary civil penalties of not more than $51,744 for each violation of the Rule assessed after January 10, 2024.
CONSUMER INJURY
124. Consumers are suffering, have suffered, and will continue to suffer substantial injury as a result of Defendants’ violations of the COPPA Rule. Absent injunctive relief by this Court, Defendants are likely to continue to injure consumers and harm the public interest.
PRAYER FOR RELIEF
125. Wherefore, Plaintiff requests that the Court:
A. Enter a permanent injunction to prevent future violations of the COPPA Rule by Defendants;
B. Impose civil penalties on each Defendant for every violation of the COPPA Rule; and
C. Award any additional relief as the Court determines to be just and proper.