
THE CALIFORNIA AGE-APPROPRIATE DESIGN CODE


Update: On September 15, 2022, Governor Newsom signed AB 2273, establishing the California Age-Appropriate Design Code Act.

Who It Covers, What It Requires & How It Compares to the UK

Effective July 1, 2024, the California Age-Appropriate Design Code imposes obligations on businesses[1] that provide an “online service, product, or feature” that is “likely to be accessed by children.”[2] Children are defined as California residents[3] “who are under 18 years of age.”[4] The law provides factors for whether an online service, product, or feature (S/P/F) is “likely to be accessed” by California residents under the age of 18:[5]

  • It is directed to children as defined by COPPA.[6]
  • It is determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of children, or it is substantially similar to an online S/P/F that meets this factor.
  • It displays advertisements marketed to children.
  • It has design elements known to be of interest to children, including games, cartoons, music, and celebrities who appeal to children.
  • Based on internal research, a significant amount of the audience is children.

An online S/P/F is defined by what it is not, and the definition notably exempts the “delivery or use of a physical product.”[7] This exemption diverges from the UK version of the law, which covers “connected toys and devices.”[8]

Compared to the UK’s Common-Sense Approach

The US version of the law provides no guidance on what it means for a “significant number of children” to “routinely access[]” the online S/P/F. However, the law makes clear in its legislative findings that covered businesses may look to guidance and innovation developed in response to the UK version when building online S/P/Fs covered by the US law.[9]

ICO states that the term “likely to be accessed by” is purposefully broad, covering “services that children [are] using in reality,” not just those services specifically targeting children.[10] However, ICO recognizes that the term is not so broad as to “cover all services that children could possibly access.”[11] The key difference is whether it is “more probable than not” that an online S/P/F will be accessed by children, and businesses should take a “common sense approach to this question.”[12]

To illustrate this point:

  • If an online S/P/F is the kind “you would not want children to use in any case,” then the business should focus on preventing children from accessing the online S/P/F, rather than making it child friendly.[13]
  • If a business’s common-sense analysis reveals that children make up a “substantive and identifiable user group” routinely accessing the online S/P/F, then the “likely to be accessed” definition will apply.[14]
  • If that analysis does not reveal such a group yet causes the business to “think that children will want to use it,” then the business “should conform to the [law’s] standards.”[15]
  • If a business decides that the online S/P/F is not likely to be accessed by children, the business should “document and support” the reasons for such a determination, incorporating such evidence as “market research, current evidence on user behaviour, the user base of similar or existing service[s],” and more.[16]

While the ICO guidance does not specify a threshold for what constitutes a “significant number of children,” it does demonstrate ICO’s view on the breadth of the law’s application.

In sum, businesses should make a common-sense determination — based on actual evidence (e.g., internal or market) — as to whether it is more probable than not for a substantive and identifiable user group of children to either routinely access or want to access the online S/P/F.

Top 3 Pain Points for Businesses

If a business makes such a determination and finds that its online S/P/F is covered by the law, the business must take several steps to ensure compliance. We identified the following as among the more onerous steps that must be taken.

  1. Data Protection Impact Assessments & Risk Mitigation Plans

Before offering any new online S/P/F likely to be accessed by children, the business must complete a Data Protection Impact Assessment (DPIA) for it and maintain DPIA documentation for as long as the online S/P/F is likely to be accessed by children.[17] Businesses must biennially review all DPIAs. Businesses must further document any risk of material detriment to children that arises from data management practices identified in the DPIA and create a timed plan to mitigate or eliminate the risk before the online S/P/F is accessed by children.[18]

  2. Estimate Age of Child Users or Treat All Consumers as Children

Covered businesses must estimate the age of child users with a reasonable level of certainty appropriate to the risks that arise from the data management practices of the business or apply the privacy and data protections afforded to children to all consumers.[19] The law provides no further guidance on how one makes such an estimation, but ICO published guidance for the UK version.[20]

  3. High Privacy & Tracking Signals as Default Settings for Children

Covered businesses must configure all default privacy settings provided to children by the online S/P/F to settings that offer a “high level of privacy,” unless the business can demonstrate a compelling reason that a different setting is in the best interests of children.[21] If the online S/P/F allows a parent, guardian, or other consumer to track the child’s location, it must also provide an “obvious signal” to the child when the child is being tracked or monitored.[22]


[1] The law applies to “businesses” as defined by the California Consumer Privacy Act (CCPA), 1798.140(c).

[2] 1798.99.31(a).

[3] The law incorporates the CCPA’s definition for “consumer,” 1798.140(g).

[4] 1798.99.30(b)(1).

[5] 1798.99.30(b)(4).

[6] Which means:

  • A commercial website or online service that is targeted to children; or
  • That portion of a commercial website or online service that is targeted to children. 15 U.S.C. § 6501(10)(A).

[7] 1798.99.30(b)(5), which also exempts broadband internet access service and telecommunications service.

[8] According to ICO, connected toys and devices are “children’s toys and other devices which are connected to the internet. They are physical products which are supported by functionality provided through an internet connection.” https://ico.org.uk/for-organisations/guide-to-data-protection/ico-codes-of-practice/age-appropriate-design-a-code-of-practice-for-online-services/14-connected-toys-and-devices.

[9] AB 2273, Sec. 1(d).

[10] https://ico.org.uk/media/for-organisations/guide-to-data-protection/key-data-protection-themes/age-appropriate-design-a-code-of-practice-for-online-services-2-1.pdf, at 17.

[11] Id.

[12] Id. at 17-18.

[13] Id. at 18.

[14] Id.

[15] Id.

[16] Id.

[17] The eight DPIA requirements can be found at 1798.99.31(a)(1)(B).

[18] 1798.99.31(a)(2).

[19] 1798.99.31(a)(5).

[20] These methods include the user self-declaring their age, AI algorithms establishing a user’s age, third-party verification services, confirmation from a known adult account holder, hard identifiers (e.g., passports or similar documents), or some form of technical measures. https://ico.org.uk/for-organisations/guide-to-data-protection/ico-codes-of-practice/age-appropriate-design-a-code-of-practice-for-online-services/3-age-appropriate-application.

[21] 1798.99.31(a)(6).

[22] 1798.99.31(a)(8).