Georgia Joins the List of States with Age Verification Requirements for Online Speech: Navigating the Crossroads of Child Protection and Individual Rights

In an era where children are increasingly immersed in the digital realm, concerns about their safety and privacy online have become a focal point for policymakers on both sides of the aisle and parents alike. Several states have responded to these concerns by introducing legislation aimed at enhancing child protection on social media platforms. The implementation of such laws, however, is fraught with complexity, raising profound questions about children's autonomy, rights to free expression, and constitutional validity. While well-intentioned, these laws also place a considerable burden on companies that operate social media platforms or provide digital services to minors. We examine the most recent social media law and its significant implications below.

Georgia’s Legislative Mandate

On April 23, 2024, the Governor of Georgia signed S.B. 351, known as the Protecting Georgia's Children on Social Media Act of 2024, into law. Under S.B. 351, in-scope social media companies will be required to make "commercially reasonable efforts" to verify account holders' ages and to prohibit individuals under 16 years of age from using their services unless a parent or guardian provides "express consent," which may be given by written form, toll-free call, videoconference, or verified email. Social media companies will also be required to describe their content moderation features and allow parents to alter or disable those features. Other provisions of the law include:

  • Targeted Advertising Prohibited; Some Contextual Advertising Allowed: For minors with parental consent, social media platforms are prohibited from displaying advertisements based on personal information, except for age and location. Furthermore, the collection or use of minors’ personal data is limited to what is necessary.
  • Penalties for Non-Compliance: Violations of the law can result in enforcement actions and damages of up to $2,500 per incident. Companies will have a 90-day period to cure alleged violations before the Attorney General initiates enforcement actions. In practice, however, rather than risk fines or undertake the burden of verifying ages and parental relationships, social media companies may simply withdraw from Georgia or offer Georgia users a different platform experience.
  • School-Directed Services Beware: The law requires local school boards and charter school governing bodies to adopt policies that restrict social media access on school devices, with certain exceptions for educational purposes under supervision. These policies must be approved by the Georgia Department of Education.
  • Parental Controls on School-Issued Devices: The law also mandates the establishment of technology protection measures that allow parents to manage internet access on school-issued devices when off school property.


S.B. 351 is set to go into effect on July 1, 2025.
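To make the statute's gating logic concrete, the following is a minimal, hypothetical sketch in Python. All names (`SignupRequest`, `may_create_account`, and the signup flow itself) are our own illustration, not part of the statute or any platform's actual implementation; it simply models the rule that account holders with a verified age under 16 may proceed only with a parent or guardian's express consent given through one of the enumerated methods.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class ConsentMethod(Enum):
    """Express-consent methods enumerated by S.B. 351."""
    WRITTEN_FORM = "written form"
    TOLL_FREE_CALL = "toll-free call"
    VIDEOCONFERENCE = "videoconference"
    VERIFIED_EMAIL = "verified email"

MIN_AGE_WITHOUT_CONSENT = 16  # under 16 requires parental consent

@dataclass
class SignupRequest:
    # Age as established by the platform's "commercially reasonable"
    # verification effort (the statute does not prescribe a method).
    verified_age: int
    parental_consent: Optional[ConsentMethod] = None

def may_create_account(req: SignupRequest) -> bool:
    """Return True if the request clears the age/consent gate."""
    if req.verified_age >= MIN_AGE_WITHOUT_CONSENT:
        return True
    # Under 16: express parental consent via an approved method is required.
    return req.parental_consent is not None
```

For example, `may_create_account(SignupRequest(15))` is rejected, while the same request with `parental_consent=ConsentMethod.VERIFIED_EMAIL` passes. A real implementation would also have to retain records of that consent, which raises the retention tensions with COPPA discussed later in this article.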

Similar laws in other states, including Ohio, Arkansas, and California, have already faced legal challenges (including injunctions). For example, a federal judge blocked a nearly identical law in Ohio that would have required children under 16 to obtain parental consent before creating social media accounts. That law was originally scheduled to go into effect on Jan. 15, 2024.

Legal Risks

At the heart of the debate surrounding these laws lie constitutional questions about the balance between child protection and individual rights. The intention is to shield children from harmful or inappropriate content and from allegedly addictive apps. By compelling platforms to filter or censor content deemed inappropriate for minors, however, these laws risk encroaching on individuals' rights to access information and engage in online discourse freely, implicating First Amendment principles. In other words, requiring identification to access information and channels of communication bluntly restricts speech in response to whatever harms to teens might arise from social media use.

Moreover, giving parents too much control over what content children and teens access creates risks, especially for teens in marginalized groups. For example, an LGBT teen with disapproving parents might be prevented from accessing content that interests them and could feel isolated. Other parents might stop teens from using social media to express political beliefs or support for social causes, running afoul of teens' long-established First Amendment rights in political speech.

State-level social media regulation also presents compliance headaches and constitutional risks. Navigating the regulatory landscape becomes increasingly complex when considering existing federal regulations, such as the Children's Online Privacy Protection Act (COPPA). Companies who provide digital services to minors must meticulously balance compliance with COPPA's provisions while addressing the demands of state laws, which may diverge on key issues related to data retention, age verification requirements, and parental consent.

A particularly contentious aspect arises from COPPA's stipulation regarding the retention of children's personal information. While COPPA mandates limits on data retention to safeguard minors' privacy, state laws may necessitate prolonged retention of parental consent records and related data. This misalignment poses significant challenges for platforms as they reconcile competing retention requirements while upholding COPPA's privacy safeguards. Such misalignment could also expose age-restriction laws to dormant commerce clause challenges by substantially burdening social media providers' ability to offer services nationally.

Furthermore, the requirement for verifiable parental consent introduces additional complexity, necessitating the collection of sensitive personal information and, in some cases, requiring users to produce government-issued documents. While this data is needed for age verification and parental consent purposes, its collection runs counter to the data-minimization principles that recent privacy laws have pushed online services to adopt. Companies that offer digital services to minors would thereby incur heightened data governance and cybersecurity obligations for the sole purpose of complying with such a law.

In navigating this intricate regulatory landscape, social media platforms find themselves at a crossroads between fulfilling their obligations to protect children and respecting individuals' rights to access information and engage in online discourse freely. As legislative and technological developments continue to shape the regulatory environment, stakeholders must collaborate to strike a delicate balance that safeguards both minors and fundamental rights in the digital age. Failure to do so risks undermining the very principles these regulations aim to uphold, and a state-by-state patchwork of such laws could create an unworkable compliance environment.
