Children’s Privacy on the Frontline: Legal Battles and Regulatory Updates Across the U.S.

Background

In the past month, legal and regulatory updates regarding children’s privacy have surged across the United States, aligning with predictions made earlier this year on our PrivacyCafé podcast. From California’s ongoing amendments to the California Consumer Privacy Act aimed at enhancing protections for minors to recent court rulings and proposed regulations in other states, the landscape of children’s online privacy is rapidly evolving. This article provides a detailed recap of the most significant developments, including the California Age-Appropriate Design Code Act of 2022, recent actions by the U.S. Department of Justice and Federal Trade Commission against TikTok for alleged COPPA violations, and new child privacy rulemaking initiatives in New York. As these updates unfold, they underscore the ongoing tension between protecting children’s privacy and safeguarding other constitutional rights, such as free speech.

California Age-Appropriate Design Code Act of 2022

The California Age-Appropriate Design Code Act of 2022 (AB 2273, the “AADC”) was designed to protect children’s privacy and safety online. One of its key provisions requires companies to conduct Data Protection Impact Assessments (DPIAs) for online services likely to be accessed by children.

On August 16, 2024, the Ninth Circuit delivered its opinion in NetChoice v. Bonta, concerning the constitutionality of the AADC. In short, the court affirmed the lower court’s preliminary injunction with respect to the AADC’s DPIA requirements, finding that NetChoice is likely to succeed in showing that those provisions violate the First Amendment.

The Ninth Circuit’s decision has significant implications, particularly concerning the balance between protecting children’s privacy online and upholding First Amendment rights. Here’s a breakdown of the impact and analysis of the decision:

  1. Affirmation of First Amendment Protections: The decision underscores the strong protections afforded to free speech under the First Amendment, even in the context of regulations aimed at protecting children. The ruling emphasizes that laws targeting speech-related activities, such as data collection and processing, must be narrowly tailored to serve a compelling state interest without unduly restricting free expression.
  2. Chilling Effect on Free Speech: The court found that the data protection impact assessment (DPIA) requirements of the AADC could have a chilling effect on speech. The DPIAs were seen as compelled speech that required companies to engage in an extensive analysis and potentially alter their speech-related activities based on vague and broad criteria, leading to self-censorship.
  3. Delay in Enforcement of AADC: The preliminary injunction means that certain provisions of the AADC, particularly the DPIA requirements, cannot be enforced until the case is fully resolved. This delays the implementation of these protections for children’s online privacy, highlighting the tension between privacy and free speech rights.
Analysis: How the AADC’s DPIA Provisions Likely Violate the First Amendment
  1. Compelled Speech: The AADC’s DPIA provisions require businesses to evaluate and document the risks their online services pose to children and to assess how these risks can be mitigated. This requirement effectively compels businesses to speak in a particular way and to engage in a form of expression that is dictated by the government. The Ninth Circuit found this to be a form of compelled speech that is not permissible under the First Amendment.
  2. Vagueness and Overbreadth: The court noted that the DPIA requirements were overly broad and vague, making it difficult for companies to know what is required of them. Such vagueness can lead to self-censorship, as companies might overly restrict their own speech to avoid potential penalties, thus chilling lawful speech in the process.
  3. Lack of Narrow Tailoring: For a law that affects free speech to withstand constitutional scrutiny, it must be narrowly tailored to achieve a compelling state interest. The Ninth Circuit found that the DPIA provisions were not narrowly tailored because they imposed broad obligations on all companies, regardless of whether their products were specifically directed at children or posed any actual risk to them. This overreach goes beyond what is necessary to protect children and therefore violates the First Amendment.
  4. Balancing Privacy and Free Speech: While the state has a compelling interest in protecting children’s privacy online, the court found that the AADC’s DPIA provisions did not strike the right balance between this interest and the constitutional rights of businesses. The imposition of DPIAs was seen as an excessive burden that did not adequately justify the intrusion into First Amendment rights.

Overall, the Ninth Circuit’s ruling highlights the complexities involved in regulating the digital space, particularly when such regulations intersect with fundamental constitutional rights. It serves as a reminder that even well-intentioned laws aimed at protecting vulnerable populations must be carefully crafted to avoid unintended infringements on free speech.

U.S. DOJ and FTC Sue TikTok for COPPA Violations

On August 2, 2024, the United States Department of Justice and the Federal Trade Commission sued the popular video-sharing app TikTok and its parent company ByteDance, alleging that the platform violated the law by collecting and maintaining data about children under 13.

In particular, the government’s lawsuit says that TikTok allowed kids to create accounts without a parent’s permission, retained personal information about them, and failed to respond to caregivers’ requests to delete the data, all in contravention of the Children’s Online Privacy Protection Act (“COPPA”). The suit also alleges that, even in the app’s “restricted mode,” which is intended for children, TikTok illegally collected and stored information about underage users, and that restricted mode misled parents and users about TikTok’s data collection practices by failing to provide adequate notice of what data was being collected and how it was used.

The lawsuit comes after TikTok settled a previous legal dispute with the Federal Trade Commission in 2019 after the agency accused the social video app of violating the Children’s Online Privacy Protection Act. TikTok paid a civil penalty of $5.7 million. Since then, the Department of Justice said, TikTok has been under a court order to remain compliant with the act.

In September 2023, the Irish Data Protection Commission (DPC) fined TikTok €345 million (about $368 million) for violating the privacy of children between the ages of 13 and 17 in its processing of their data, in breach of multiple articles of the European Union’s General Data Protection Regulation (GDPR). The DPC also found that the company employed “dark patterns” during registration and when posting videos, subtly guiding users toward options that compromised their privacy. In January 2023, France’s data protection authority (CNIL) fined TikTok €5 million (about $5.4 million) for insufficiently informing users about how it uses cookies and for making it difficult to opt out.

Utah Department of Commerce Issues Draft Regulations for the Minor Protection in Social Media Act

The Utah Department of Commerce has issued draft regulations to implement the Minor Protection in Social Media Act (SB 194 and HB 464). The regulations are designed to clarify how social media companies must identify users under 18 years of age with at least 95% accuracy, and how they must secure verifiable parental consent before allowing minors to create accounts.

To achieve the required accuracy for age verification, the draft rules mandate that social media companies or their agents randomly review at least 1,400 unique age assurance attempts from Utah users to evaluate the system’s effectiveness, although specific guidance on this review process is not provided. For obtaining parental consent, the regulations align with federal COPPA guidelines and additionally require a “written attestation” from the minor’s parent or guardian. Moreover, the regulations stipulate that any data used for age verification must be processed within the United States.
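
The draft rules do not spell out how the 1,400-attempt review should translate into a pass/fail determination. One plausible reading, offered here purely as an illustration rather than drawn from the regulations, is a statistical audit: sample the attempts, count how many were resolved correctly, and test whether the system’s accuracy clears the 95% bar with statistical confidence. Below is a minimal TypeScript sketch under those assumptions; the function and the audit counts are hypothetical.

```typescript
// Lower bound of the Wilson score confidence interval for a binomial
// proportion -- a standard way to estimate accuracy from a random sample.
function wilsonLowerBound(successes: number, n: number, z = 1.96): number {
  const pHat = successes / n;
  const center = pHat + (z * z) / (2 * n);
  const margin =
    z * Math.sqrt((pHat * (1 - pHat)) / n + (z * z) / (4 * n * n));
  return (center - margin) / (1 + (z * z) / n);
}

// Hypothetical audit: of 1,400 sampled age-assurance attempts, suppose
// 1,355 were judged correct. (Both figures are illustrative only.)
const sampled = 1400;
const correct = 1355;
const lower = wilsonLowerBound(correct, sampled);

console.log(`Observed accuracy: ${((correct / sampled) * 100).toFixed(2)}%`);
console.log(`95% lower confidence bound: ${(lower * 100).toFixed(2)}%`);
console.log(lower >= 0.95 ? "Clears the 95% threshold" : "Inconclusive at 95%");
```

For what it is worth, a sample of 1,400 yields a margin of error of roughly one percentage point when the true accuracy is near 95%, which would make it a sensible audit size, although the Department has not explained its choice of that figure.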

The Department projects that the cost of age verification will range from 5 to 45 cents per attempt, and companies will incur an annual fee of at least $2,000 for geolocation services to comply with the Act. A public hearing on these draft regulations was held on August 29, with public comments due by September 16. The regulations are expected to take effect on October 10, 2024, although ongoing litigation in the NetChoice v. Reyes case, which challenges the Act on First Amendment grounds, may impact this timeline.

California Consumer Privacy Act (CCPA) Progresses Toward Significant Amendments Enhancing Privacy Protections for Minors

As of the latest update, California Assembly Bill 1949 (AB 1949) has progressed significantly through the legislative process. The bill would amend various sections of the California Consumer Privacy Act (CCPA) to enhance privacy protections for minors. Specifically, it prohibits businesses from collecting, selling, sharing, using, or disclosing the personal information of consumers under 18 years of age without proper consent: from the consumer if they are 13 to 17 years old, or from a parent or guardian if they are under 13. The bill also provides that a business that willfully disregards a consumer’s age is deemed to have actual knowledge of it.

The bill was ordered to a third reading in the Senate on August 19, 2024. It had previously passed the Assembly and was referred to various Senate committees, including Appropriations. The Senate Appropriations Committee placed the bill on the suspense file but later moved it forward with a “Do Pass” recommendation.

New York Initiates Child Privacy Rulemaking

On August 1, 2024, the Office of the New York Attorney General issued two Advance Notices of Proposed Rulemaking (ANPRMs) for the state’s new child online privacy and safety laws: the Stop Addictive Feeds Exploitation for Kids Act (SAFE for Kids) and the New York Child Data Protection Act (NYCDPA).

The SAFE for Kids Act mandates that social media platforms verify the age of users and obtain verifiable parental consent before algorithmically curating content for users under 18. The NYCDPA focuses on data minimization, limiting the processing of minors’ personal information to what is “strictly necessary” for purposes such as providing requested services, conducting internal business operations, fulfilling legal obligations, or preventing security threats; informed consent may also permit data processing. The NYCDPA also provides for the creation of a novel class of “age-flag” device signals intended to communicate whether a user is a covered minor.

The ANPRMs invite public input on various issues, including best practices for age determination, standards for obtaining verifiable parental consent, the definition of an “addictive social media platform,” and criteria for identifying child-directed services. Although the concept of “age-flags” presents unique technical and policy challenges, the ANPRMs seek high-level input on relevant factors and standards rather than offering specific guidance.
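
Because no age-flag format has been specified, any implementation detail at this stage is speculation. Purely as a thought experiment, such a signal could resemble existing browser signals like Global Privacy Control, which is transmitted as the Sec-GPC: 1 request header. The TypeScript sketch below imagines an analogous header; the Sec-Age-Flag name is invented for illustration and appears nowhere in the ANPRMs.

```typescript
import { createServer } from "node:http";

// Hypothetical only: New York has not defined an age-flag signal. This
// sketch imagines one sent as a request header, by analogy to the real
// "Sec-GPC: 1" header used by Global Privacy Control.
createServer((req, res) => {
  const ageFlag = req.headers["sec-age-flag"]; // invented header name
  const coveredMinor = ageFlag === "1";

  if (coveredMinor) {
    // Under the NYCDPA's data-minimization rule, a site would limit
    // processing here to what is strictly necessary for the service.
    res.setHeader("X-Processing-Mode", "strictly-necessary"); // illustrative
  }

  res.end(coveredMinor ? "minor-protective experience" : "default experience");
}).listen(8080);
```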

Public comments are due by September 30, and while these questions are not part of the formal rulemaking required by the NYS Administrative Procedure Act, they will contribute to the official record. The feedback collected will inform a future Notice of Proposed Rulemaking (NPRM), which will also have its own comment period.

Separately, on July 15 the New York Attorney General’s Office published detailed cookie banner guidance following an enforcement sweep. As New York does not have a comprehensive consumer privacy law, the AG ties these actions to the state’s authority to regulate deceptive practices. Part of the guidance appears below, followed by a short illustrative sketch of the equal-weight principle:

Do:

  • use plain, clear language
  • label buttons to clearly convey what they do
  • make the interface accessible. For example, a visitor should be able to use their keyboard to tab to the privacy controls
  • give equivalent options equal weight. For example, if consumers can agree to tracking with a single click, let them decline with a single click. An opt-in model could have a tracking pop-up that provides “Accept” and “Decline” buttons that are equal in size, color, and emphasis

Do not:

  • use large blocks of text that consumers are unlikely to read
  • use ambiguous buttons. For example, consumers may think clicking “X” in the corner of a cookie banner means they are rejecting cookies
  • use complicated language, including legal or technical jargon
  • use confusing interfaces
  • de-emphasize options to decline tracking
  • make it more difficult for a visitor to decline tracking than to allow it, such as by requiring more steps
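
To make the “equal weight” point concrete, here is a minimal, hypothetical TypeScript sketch of a banner that follows the guidance: both choices are single-click, identically styled, and keyboard-accessible. The function name and styling are our own, not the Attorney General’s.

```typescript
// A minimal banner in which "Accept" and "Decline" carry equal weight:
// one click each, with identical size, color, and emphasis.
function renderCookieBanner(onChoice: (accepted: boolean) => void): void {
  const banner = document.createElement("div");
  banner.setAttribute("role", "dialog");
  banner.setAttribute("aria-label", "Cookie preferences");

  const text = document.createElement("p");
  // Plain, clear language rather than legal or technical jargon.
  text.textContent = "We use cookies for analytics. Choose whether to allow them.";
  banner.appendChild(text);

  for (const [label, accepted] of [
    ["Accept", true],
    ["Decline", false],
  ] as const) {
    const button = document.createElement("button");
    button.textContent = label; // labels say exactly what the buttons do
    // Identical styling, so neither option is visually de-emphasized.
    button.style.cssText =
      "margin:4px;padding:8px 16px;font:inherit;background:#1a5276;color:#fff;border:none;";
    // <button> elements are natively keyboard-focusable, satisfying the
    // guidance that visitors can tab to the privacy controls.
    button.addEventListener("click", () => {
      onChoice(accepted);
      banner.remove();
    });
    banner.appendChild(button);
  }

  document.body.appendChild(banner);
}

// Usage: renderCookieBanner((accepted) => { /* enable or skip trackers */ });
```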

As legal and regulatory measures aimed at protecting children’s privacy continue to evolve, the landscape remains complex and dynamic. The recent developments highlight the challenges in balancing children’s online safety with constitutional rights, particularly regarding free speech. The Ninth Circuit’s decision in NetChoice v. Bonta underscores the importance of narrowly tailoring privacy regulations to avoid infringing upon First Amendment rights, a theme that is increasingly relevant as states introduce their own child protection laws.

Meanwhile, enforcement actions against companies like TikTok demonstrate the ongoing commitment by U.S. regulators to uphold existing laws, such as COPPA, while international bodies also hold firms accountable under broader privacy frameworks like the GDPR. These actions reflect a growing global consensus on the need for stringent protections of children’s data, even as the specifics of compliance and enforcement vary by jurisdiction.

Overall, these updates underscore the urgency for businesses to stay ahead of evolving privacy standards and the importance of robust compliance strategies that respect both privacy and free speech. As regulations continue to develop, companies must remain vigilant, adaptable, and prepared to meet the dual demands of safeguarding children’s privacy while navigating the intricate legal landscape that governs data protection.

Disclaimer

This material is provided for informational purposes only. It is not intended to constitute legal advice nor does it create a client-lawyer relationship between Hall Booth Smith, P.C. and any recipient. Recipients should consult with counsel before taking any actions based on the information contained within this material. This material may be considered attorney advertising in some jurisdictions. Prior results do not guarantee a similar outcome.

About the Author

Jade Davis

Partner | Tampa Office

T: 813.329.3890
E: jdavis@hallboothsmith.com

Jade Davis focuses her practice on data privacy, cyber security, and construction matters. Jade provides strategic privacy and cyber-preparedness compliance advice and defends, counsels, and represents companies on privacy, global data security compliance, data breaches, and investigations.
