No Child’s Play: States expand child protection online (Part II)

Last week, we discussed action taken by three states (Texas, California, and Ohio) to enhance protection of children's data online. In this second installment, we shift our attention to the 2023 legislative efforts of three additional states: Utah, Arkansas, and Connecticut.

Utah Social Media Regulation Act

Utah became the first state to pass laws limiting how children can use social media. On March 23, Gov. Spencer Cox (R) signed into law Senate Bill 152 and House Bill 311, enacting the Utah Social Media Regulation Act.

The USMR Act applies to "social media companies." A "social media company" is defined as an entity providing a social media platform with at least 5 million account holders worldwide that is an "interactive computer service." "Interactive computer service" includes web services, web systems, websites, web applications, and web portals.

Once the USMR Act takes effect on March 1, 2024, social media companies will be required to do the following:

  • Obtain consent from a parent or legal guardian for Utah users under the age of 18.
  • Allow parents or guardians full access to their children’s social media accounts.
  • Protect minors’ accounts from unapproved direct messaging.
  • Verify the age of a Utah individual seeking to maintain or open a social media account.
  • Block minors’ accounts from search results.
  • Create a default curfew setting that blocks access to minors’ accounts during the hours of 10:30 p.m. to 6:30 a.m., which parents or guardians can adjust.

Furthermore, social media companies are not permitted to (1) collect a minor’s data, (2) target a minor’s social media account for advertising, or (3) target a minor’s social media account with addictive designs or features.

Beginning on March 1, 2024, the Division of Consumer Protection will be authorized to enforce the USMR Act.

Arkansas Social Media Safety Act

On April 7, shortly after enactment of the USMR Act, Gov. Sarah Huckabee Sanders (R) signed into law Senate Bill 396, now known as the Social Media Safety Act. The SMSA does not allow Arkansas minors to have social media platform accounts without the express consent of their parents or legal guardians. As a result, similar to Texas' Securing Children Online through Parental Empowerment Act, the Arkansas law requires social media companies to verify account holder ages and, if the account holder is a minor, obtain the express consent of the minor's parent or legal guardian.

For age verification, social media companies must engage a third-party vendor to perform reasonable age verification methods, which include the following:

  • A digital identification card, including a digital copy of a driver’s license.
  • A government-issued identification.
  • Any commercially reasonable age verification method.

The SMSA applies to "social media companies." A "social media company" is defined as "an online forum that a company [which has at least $100,000,000 in annual gross revenue] makes available for an account holder to: (i) Create a public profile, establish an account, or register as a user for the primary purpose of interacting socially with other profiles and accounts; (ii) Upload or create posts or content; (iii) View posts or content of other account holders; and (iv) Interact with other account holders or users, including without limitation establishing mutual connections through request and acceptance."

Social media companies that violate the SMSA may be subject to penalties of $2,500 per violation. The SMSA grants the Arkansas Attorney General the authority to initiate enforcement actions for violations of this law.

The SMSA was scheduled to take effect on September 1, but federal District Court Judge Timothy L. Brooks issued a preliminary injunction on August 31 blocking the law from taking effect. The preliminary injunction was granted in NetChoice, LLC v. Tim Griffin. NetChoice, LLC is a technology industry trade group whose members include TikTok, Meta, and X (formerly known as Twitter). NetChoice asserted that (1) the SMSA violated individuals' First Amendment rights, (2) the age-verification requirements did not provide a constitutional way to address the dangers to minors online, and (3) the SMSA did not adequately define which social media companies or platforms would be subject to its regulations.

We will keep you posted on the status of this litigation.

Connecticut Act Concerning Online Privacy, Data and Safety Protections

On June 26, Gov. Ned Lamont (D) signed into law An Act Concerning Online Privacy, Data and Safety Protections. Among its objectives, the law, known as "the CT Act," establishes additional requirements regarding minors' personal data and social media accounts. As of July 1, controllers under the CT Act cannot process sensitive data concerning a known child without processing that data in compliance with the federal Children's Online Privacy Protection Act. In addition, the CT Act codifies the existence and prescribed duties of the Connecticut Internet Crimes Against Children Task Force. This provision took effect July 1.

Beginning July 1, 2024, the CT Act will require social media platforms to do the following:

  • Unpublish a minor’s social media account within 15 business days of receiving a request from the minor or, if the minor is under the age of 16, the minor’s parent or legal guardian.
  • Upon receiving a request to delete a minor’s social media account, delete the account and halt any processing of that minor’s personal data within 45 business days, unless preservation is otherwise permitted or required by applicable laws.

“Social media platform” is defined as an internet-based service or application that

  • Is used by a Connecticut consumer,
  • Is primarily intended to allow users to socially interact within the service, and
  • Enables users to (1) construct a public or semi-public profile for using the service, (2) populate a public list of other users with whom they share a social connection, and (3) create or post content in a manner that is viewable by other users.

Violations of the CT Act constitute violations of the Connecticut Unfair Trade Practices Act but are solely enforced by the Connecticut Attorney General. The CT Act explicitly states that the private right of action and class action provisions of the Unfair Trade Practices Act do not apply to violations of the CT Act.

The Constangy Cybersecurity & Data Privacy team assists businesses of all sizes and industries with implementing necessary updates to their privacy and compliance programs to address these complex and evolving regulatory requirements. If you would like additional information on how to prepare your organization, contact us directly:

  • Julie Hess

    Julie Hess is a partner in the Philadelphia office of the Constangy Cyber Team. She assists clients in responding to all types of data security incidents by conducting initial assessments of the issues and helping to facilitate the ...

  • Rebecca Pollack

    Rebecca is a member of the Constangy Cyber Team, focusing her practice on advising clients regarding data privacy and cybersecurity matters. She leverages her business background and education in technology and privacy law to aid ...

The Constangy Cyber Advisor posts regular updates on legislative developments, data privacy, and information security trends. Our blog posts are informed through the Constangy Cyber Team's experience managing thousands of data breaches, providing robust compliance advisory services, and consultation on complex data privacy and security litigation. 
