
Global Efforts to Make a Safer and Better Internet for Children

Source: In Compliance Magazine

The Advancement of Age-Appropriate Design and Age Verification Standards

Children now make up roughly one-third of all internet users worldwide. Greater connectivity has enabled young people to benefit from educational content, communication tools, and entertainment, but it has also exposed them to harmful content, exploitation, data privacy risks, and potential addiction problems. The United Nations has confirmed that its Convention on the Rights of the Child [1] also applies to the digital world.

One area gaining strong focus is age verification. For policymakers, age verification is a critical tool to enforce child protection laws and uphold digital rights. For digital service providers, it’s a strategic imperative for ensuring compliance, building trust, and protecting brand integrity.

The IEEE Standards Association has been at the forefront of age-appropriate design and age verification standards and certification. As technology advances, our commitment to safer and more responsible online spaces must grow too; accessing such spaces in an age-appropriate manner isn't just a legal requirement but a social responsibility. IEEE standards provide a practical and implementable foundation for this.

Age Verification is Not a New Calling

The concept of verifying age to protect children began long before the advent of the internet. In the United States, for example, age verification was used as a tool to enforce child labor laws such as New York's 1903 labor law, which eventually led to federal birth certificate standards. Since then, other federal laws have been enacted that set age requirements for purchasing tobacco products, e-cigarettes, medical marijuana, and pharmaceuticals, as well as for watching certain movies in theaters and gambling online.

Since the internet became widely available to the public in the early 1990s, society has gained incredible benefits but also inherited significant risks. Over the past two-plus decades, we have witnessed and perhaps experienced first-hand the rise of all types of cybercrimes against governments, businesses, and individuals, as well as online addiction and behavioral risks.

Unfortunately, these risks also apply to our most vulnerable, our children. Risks to minors include online grooming, exposure to inappropriate content, privacy violations, cyberbullying, manipulative advertising, and online addiction. Some laws have addressed these issues to some degree; one significant U.S. federal law is the Children's Online Privacy Protection Act (COPPA) [2]. COPPA, effective since 2000, establishes foundational rules for how online services must handle personal data from children under age 13. On April 22, 2025, the Federal Trade Commission (FTC) issued new amendments to COPPA, updating its requirements to reflect evolving digital practices, which took effect on June 23, 2025, and require full compliance by April 22, 2026.
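The core COPPA rule described above can be illustrated with a minimal sketch. This is an illustrative gate only, assuming hypothetical function and parameter names that are not drawn from the statute or the FTC's rule:

```python
# Illustrative sketch of a COPPA-style data-collection gate.
# Names are hypothetical; this is not legal guidance or the FTC's rule text.
COPPA_AGE_THRESHOLD = 13  # COPPA covers children under age 13


def may_collect_personal_data(user_age: int,
                              has_verifiable_parental_consent: bool) -> bool:
    """Users under 13 require verifiable parental consent before any
    personal data collection; users 13 and over fall outside COPPA's scope
    (though other laws may still apply)."""
    if user_age >= COPPA_AGE_THRESHOLD:
        return True
    return has_verifiable_parental_consent
```

In practice, a real service would also have to honor COPPA's notice, data minimization, and deletion obligations; the sketch captures only the consent gate.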

At the federal level in the U.S., proposed legislation attempting to control minors' access to online content has repeatedly failed. The roadblocks are tied to a complex mix of constitutional concerns, privacy risks, technical challenges, and political fragmentation. Defeats include the Communications Decency Act (1996), struck down by the U.S. Supreme Court in 1997, and the Child Online Protection Act (1998), which never went into effect due to ongoing legal challenges; more recent efforts, such as the Kids Online Safety Act (2023), have also failed to pass.

In practice, U.S. efforts to enact online age verification mandates have fallen to the states, resulting in a patchwork of state-level laws requiring age verification for access to adult content or social media platforms. Currently, about half of U.S. states have passed laws that are either in effect or await effective dates. These laws vary widely in scope and enforcement. For example, states such as Louisiana, Texas, Utah, and Arkansas require websites with sexually explicit content to verify that users are 18 or older using government-issued ID or third-party services.

These state laws have met legal challenges. Notably, in June 2025, the U.S. Supreme Court upheld a Texas law [3] requiring age verification for access to websites hosting sexually explicit material. The Court ruled 6-3 that the law, House Bill 1181, does not violate the First Amendment, finding that the state's interest in protecting children from harmful content outweighs the burden on adults seeking access to such material. The ruling is significant because it marks the first time the Supreme Court has allowed age verification requirements on adult consumers to protect minors online.

Some states have passed laws focused on social media restrictions, imposing age verification and/or parental consent and other age-appropriate design features such as curfews, time limits, and privacy settings for children of various ages.

Age Verification is a Global Issue

In the European Union, policymakers have taken a more aggressive approach and made significant progress toward implementing harmonized age verification laws across member states. Like COPPA in the U.S. and the Online Safety Act (OSA) in the United Kingdom [4], the Digital Services Act in the European Union [5] officially recognizes minors as a distinct risk group that requires stronger safeguards.

In other parts of the world, countries are rapidly adopting online age verification laws. For example, Australia’s eSafety Commissioner conducted age verification trials in 2023. A proposed ban on social media use by children under 16 is scheduled to take effect in December 2025, and industry age assurance codes to support enforcement are being drafted.

China requires real-name registration and uses facial recognition checks for youth gaming, with minors limited to three hours of play per week. While pornography is banned, age verification is also applied to gaming and social platforms.

Japan uses the “My Number” system for age checks across games, manga, and video platforms. It also enforces limits on gacha microtransactions to protect minors. There is no blanket ban on social media use, but platforms engage in voluntary self-regulation.

However, enforcement and infrastructure vary widely. In many developing regions, such as parts of Africa and Southeast Asia, age verification laws are either absent or inconsistently enforced. This highlights the need for international cooperation and capacity building to ensure that children everywhere benefit from the same level of protection.

A Standard for Age Verification: IEEE 2089.1-2024

Published in May 2024, IEEE 2089.1™, Standard for Online Age Verification [6], was developed by industry experts from around the world. It is part of the IEEE 2089 series of standards, which was initiated by and based on the 5Rights Principles for children, focusing on age-appropriate design of digital services and helping to build a better digital world for children.

The IEEE 2089.1 standard was developed to provide a set of processes for digital services to verify or estimate a user’s age or age range with a proportionate degree of accuracy and certainty in determining a child’s age. This allows organizations to manage access to their products and services based on the suitability of age, keeping the rights and needs of children in mind.
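A minimal sketch of what such a proportionate age-assurance decision might look like in practice. The class, function, and threshold names here are illustrative assumptions for this article, not terminology or values taken from IEEE 2089.1 itself:

```python
# Hypothetical sketch of a proportionate age-assurance decision.
# Names and thresholds are illustrative, not drawn from IEEE 2089.1.
from dataclasses import dataclass


@dataclass
class AgeAssuranceResult:
    estimated_min_age: int  # lower bound of the verified/estimated age range
    confidence: float       # 0.0-1.0, reported by the assurance method


def grant_access(result: AgeAssuranceResult,
                 required_age: int,
                 required_confidence: float) -> bool:
    """Grant access only if the lower bound of the estimated age range meets
    the service's age requirement at the required confidence level."""
    return (result.estimated_min_age >= required_age
            and result.confidence >= required_confidence)


# A high-risk service (e.g., adult content) might demand strict verification,
# while a moderate-risk one might accept age estimation at lower confidence.
strict = grant_access(AgeAssuranceResult(19, 0.99),
                      required_age=18, required_confidence=0.95)   # True
lenient = grant_access(AgeAssuranceResult(14, 0.80),
                       required_age=13, required_confidence=0.75)  # True
```

The point of the proportionality principle is visible in the two calls: the same decision logic applies, but the required confidence scales with the risk of the service being accessed.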

Age verification can help digital services providers, policymakers, and regulators address requirements from various jurisdictions around the world, including:

  • Age-Appropriate Design Codes in the UK, California, and more;
  • Europe's Digital Services Act, General Data Protection Regulation (GDPR), and Audiovisual Media Services Directive (AVMSD);
  • US's Children's Online Privacy Protection Act (COPPA); and
  • India's Digital Personal Data Protection Act (DPDPA).

These processes are essential to creating a digital environment that supports, by design and delivery, children's safety, privacy, autonomy, agency, and health. The standard specifically provides a set of guidelines and best practices that offer a level of validation for age assurance decisions, whether those decisions are required by law or voluntarily implemented for business or social reasons.

A Step Further: Age-Appropriate Design

Countries around the world are increasingly adopting age-appropriate design code policies inspired by pioneering frameworks such as the United Kingdom’s Age-Appropriate Design Code and multiple similar frameworks based upon it. These policies aim to safeguard children’s rights and well-being in digital environments by mandating privacy-by-design principles for online services likely to be accessed by minors. The UK’s code, enacted in 2020 and enforceable since September 2021, set a global precedent with its 15 standards rooted in the UN Convention on the Rights of the Child, influencing similar initiatives in Ireland, Sweden, Indonesia, and Australia.

No federal policy exists in the United States, so it is up to the states to consider similar legislation, with California, Maryland, and Vermont having modeled legislation after the UK's Age Appropriate Design Code. These frameworks emphasize high privacy settings by default, age assurance, data protection impact assessments, and transparency tailored to children's understanding. As digital platforms increasingly shape young people's lives, these codes represent a growing global consensus on the need for robust, child-centric data governance.

The IEEE SA's focus on human-centric design and the need to protect children online has resulted in the development of advanced frameworks for the age-appropriate design of internet platforms [7] to support this growing demand. This community is helping organizations design and develop digital services that are age-appropriate and deliver the cultural and legal protections children are entitled to in digital environments.

Informed by the United Nations Convention on the Rights of the Child (CRC) and built upon the principles developed by the 5Rights Foundation [8], IEEE 2089-2021, Standard for Age Appropriate Digital Services Framework [9], establishes a recommended set of processes that help enable organizations to make their products and services age appropriate, including consideration of risk mitigation and management through the life cycle of development, delivery, and distribution.

IEEE SA recently launched the Technology Policy Collaborative to further leverage IEEE's neutral, trusted expertise to help governments address complex technical and societal challenges in strategic areas, such as digital governance. As a model for collaborative policy development, IEEE SA announced the culmination of its collaboration with policymakers in Indonesia [10] in May 2025 on the recently passed Indonesian Government Regulation, Governance of Electronic Systems in Child Protection [11].

Seal of Assurance: IEEE Age Verification Certification Program

The IEEE SA offers a globally recognized, industry-respected Age Verification Certification Program [12], which assesses the design, specification, evaluation, and deployment of age verification systems against the framework identified in IEEE 2089.1, Standard for Online Age Verification. It is part of the IEEE 2089 series of standards deliverables, focusing on age-appropriate design of digital services and helping to build a better digital world for children.

The certification program provides technology organizations with the means to demonstrate to regulators and consumers that their age assurance systems work as intended. Governing bodies can also leverage the certification and standard when establishing age verification mandates. Online users and those responsible for the welfare of children can more easily identify brands and systems that implement age-appropriate measures compliant with the program's established framework.

Looking Ahead

The advancement of age-appropriate design and age verification standards by IEEE SA is helping stakeholders, including digital services providers, policymakers, and regulators, to address requirements from various jurisdictions around the world. 

While many digital service providers implement age-appropriate design and age verification measures only to comply with governmental mandates, the standards and certification process are also available to those willing to act proactively and show their commitment to helping create an online environment that is not only safer for children but also respects their rights.

Endnotes

  1. “Convention on the Rights of the Child,” from the UNICEF website.
  2. “Children’s Online Privacy Protection Act,” posted to the website of the Attorney General of Texas.
  3. Ruling by the Supreme Court of the United States, “Free Speech Coalition, Inc., et al. v. Paxton, Attorney General of Texas,” June 27, 2025.
  4. “Online Safety Act: explainer,” posted to the website of Gov.UK, April 24, 2025.
  5. “The Digital Services Act,” posted to the website of the Commission of the European Union.
  6. “IEEE 2089.1-2024, IEEE Standard for Online Age Verification,” posted to the website of the IEEE Standards Association.
  7. See “Enabling Trustworthy Digital Experiences for Children,” posted to the website of the IEEE Standards Association.
  8. “Building the digital world that young people deserve,” posted to the website of the 5Rights Foundation.
  9. “IEEE Standard for an Age Appropriate Digital Services Framework Based on the 5Rights Principles for Children,” posted to the website of the IEEE Standards Association.
  10. “IEEE Provides Strategic Expertise as Indonesia Adopts First Age-Appropriate Design Regulations in Asia,” posted to the website of the IEEE Standards Association, May 14, 2025.
  11. “Governance of Electronic System Implementation in Child Protection,” translation of Indonesia’s Government Regulation (PP) Number 17 of 2025 on the Implementation of Electronic Systems in Child Protection.
  12. “IEEE Online Age Verification Certification Program,” posted to the website of the IEEE Standards Association.