Demonstrating that online privacy for teens and children continues to be an area of bipartisan concern at the federal and state levels, Sen. Richard Blumenthal (D-CT) and Sen. Marsha Blackburn (R-TN) introduced the federal Kids Online Safety Act on February 16, 2022, and California Assemblymembers Buffy Wicks (D) and Jordan Cunningham (R) introduced the California Age-Appropriate Design Code Act. Both proposed bills draw on the United Kingdom’s Age Appropriate Design Code, which requires tech companies to prioritize the best interests of children when designing their products and services.
Kids Online Safety Act
The federal Kids Online Safety Act introduces product requirements, a duty of care, annual audits, new disclosure and transparency requirements, and independent research provisions for covered businesses.
The proposed bill would govern “covered platforms,” broadly defined as “a commercial software application or electronic service that connects to the internet and that is used, or reasonably likely to be used, by a minor.” Notably, the bill broadens applicability beyond the current legal standard under the Children’s Online Privacy Protection Act (COPPA) in two ways. First, the bill regulates platforms that are “reasonably likely” to be used by minors, while COPPA is triggered only when an online service is directed toward children or the service has “actual knowledge” that it is collecting personal information from a child. Second, the bill defines a minor as a person under 16 years old, whereas COPPA applies to the collection of personal information from individuals under 13 years old.
If passed, the Kids Online Safety Act would require, among other things, the following:
SAFEGUARDS
Covered platforms must provide safeguards to parents and minors, including:
- The ability to disable product features, particularly those that are “addictive;”
- An opt-out of algorithmic recommendations (defined in the bill as a “fully or partially automated system used to suggest, promote, or rank information”);
- Privacy settings that default to the strongest option;
- Parental tools to supervise minors’ use of the platform, including safety settings, time tracking, and purchase limits; and
- A dedicated channel for parents to report harms to minors.
DUTY OF CARE
Covered platforms must “act in the best interests” of minors using their platforms by preventing and mitigating the risks of specific harms, including self-harm, suicide, eating disorders, substance abuse, sexual exploitation, and access to products that are unlawful for minors (such as gambling and alcohol).
DISCLOSURES, AUDITS, AND RESEARCH
Covered platforms must provide:
- New disclosures related to privacy policies, algorithmic recommendation systems, and advertisements;
- An annual, public, independent audit identifying risks to minors and describing the steps the platform is taking to prevent and mitigate those harms; and
- Access for academic researchers and non-profit organizations to data collected by the platform to conduct public interest research regarding harms to the safety and well-being of minors.
California Age-Appropriate Design Code Act
The California Age-Appropriate Design Code Act would require companies offering goods, services, or product features likely to be accessed by a child to, among other things, (i) conduct data protection impact assessments, (ii) provide privacy information and other materials in language suited to children, and (iii) default to the strongest privacy settings. The bill also prohibits collecting precise geolocation data unless the business can demonstrate a compelling reason that collection is in the best interests of the child, and it prohibits using dark patterns to lead or encourage users to provide personal information beyond what is necessary to provide the good or service, or to forgo privacy protections.