How do KOSA, OSA & COPPA Mandate Children’s Online Safety through Age Verification?

Author: admin | 06 Jun 2024

According to a report, more than 300 million children fall victim to online sexual abuse every year. It is a grim reality that roughly 1 in 8 children with an online presence faces some form of non-consensual sexual image sharing or other manipulation tactics by abusers. The same study reveals that millions of children are coerced into online sexual activity with adults and other youth. The facts don’t stop there: thousands of children go missing, and a national watchdog received nearly 179,000 reports of sexual images being shared or hosted online in the UK alone. Jurisdictions around the world are taking critical steps to combat this menace, which threatens the future of every child with internet access.

With this context in mind, let’s look at how several privacy protection and online safety acts aim to keep children safe online.

Insights into the Kids Online Safety Act (KOSA)

KOSA, a bill intended to protect children’s safety in the digital world, was introduced in the US Senate on 2 May 2023 by Senators Richard Blumenthal and Marsha Blackburn and is currently under consideration by the Senate Commerce Committee. The bill intends to bind social media platforms to take concrete steps in the interest of internet users under 17 years of age.

KOSA is a proposed law mandating social media platforms to make the internet a safer place for kids. If enacted, it would require that

  • Social media platforms design their websites in such a way that online threats and potential risks to kids are significantly reduced. 
  • Platforms let users turn off recommendation systems that automatically suggest content based on algorithms, such as what a user has previously viewed or liked, to restrict kids’ access to inappropriate content.

If social media platforms fail to comply with these requirements, they face serious legal action and consequences.

This infographic shows the key provisions of KOSA including age range, duty of care, research & guidance, parental tools, and regulatory authority responsible for enforcement.

Is it True that KOSA Could Do More Harm than Good? A Closer Look 

The act is positioned as a safeguard that protects minors against harmful content affecting their mental health; however, decision-makers also emphasize the challenges associated with its implementation. KOSA mandates covered platforms, ranging from social media and streaming services to video games, to act in the best interest of minors without knowing whether a given user is a minor or not. In effect, it simply requires these platforms to publish content that is safe for minors.

To avoid legal consequences, many platforms may soon adopt, and some have already implemented, robust age verification technology to confirm that services are granted to appropriate individuals and that minors cannot access inappropriate content.

Because age estimation requires users to provide personal information to confirm their authenticity, and often biometric or other sensitive data to produce an accurate estimate, parents might discourage their children from signing up for a lesser-known platform. Parents’ reluctance to share kids’ sensitive information could, in turn, limit kids’ access to beneficial or value-driven content.

Note: KOSA has yet to be passed into law.

The UK’s Online Safety Act

The UK’s counterpart to the KOSA bill is the Online Safety Act, which also addresses concerns related to online harm to children and adults. It aims to make social media platforms act responsibly to protect their users from potential harm. The act passed into law on October 26, 2023, and places a range of responsibilities on social media platforms and online service providers to make the internet safe for minors.

  • The Online Safety Act protects kids by ensuring that social media platforms restrict them from accessing age-inappropriate and harmful content or services. The act also mandates that platforms make it effortless for children or parents to report any problem or issue. 
  • The act safeguards adults by ensuring that major platforms are transparent about the types of content they host and by giving users more control over what they watch, enhancing overall internet safety.

The Office of Communications (Ofcom), an independent regulatory authority in the UK, ensures that social media platforms and search services comply with the law and keep their platforms secure and reliable.

This infographic shows the key provisions of OSA including scope, duty of care, age verification measures, penalties for non-compliance, and regulatory authority.

Which Entities are Required to Comply with the Act?

Any business or platform providing services online, whether through websites, apps, or other digital channels, is mandated to comply with the law. Broadly, the entities covered under the act include

  • User-to-user services (social media, video gaming, or chat apps)
  • Search services 
  • Online adult content producers

From small businesses to large ones, and even individuals operating online services, all come under the umbrella of complying with the act.

Under the act, all service providers are mandated to implement robust age verification or age estimation to make sure that minors are restricted from accessing adult or explicit content harmful to youth. Service providers employing any age assurance technology or technique must ensure that the tools are advanced enough to verify users’ ages accurately and that the technology they deploy is reliable and fair.

Gain More Insight: Find out how age verification laws affect businesses, meet legal requirements, and guarantee safe, secure access to services and material intended only for certain ages.

Looking into the Children’s Online Privacy Protection Act (COPPA)

COPPA is a United States federal law, effective April 21, 2000, that outlines specific guidelines for websites and online content producers collecting personal information from kids under 13. The US Congress passed the law in 1998, and the Federal Trade Commission (FTC) is responsible for enforcing compliance.

This infographic shows the key provisions of COPPA including age range, data collection, privacy policy, data security, parental rights, regulator, and penalties for non-compliance.

COPPA imposes certain requirements on online content producers and websites to

  • Ask parents or guardians for consent before collecting personal information from users under 13. 
  • Develop a privacy policy outlining when and how consent is acquired from parents or guardians before granting access to youngsters.
  • Declare what protocols they employ to safeguard children’s sensitive information and what measures they take to make the online world secure for youth. 
  • Restrict marketing targeted at minors, particularly those under the age of 13.

Non-compliance with the law may lead to websites or online content providers facing severe penalties or heavy fines; for each violation, the fine can reach $50,120.

Note: An updated version of COPPA, informally known as COPPA 2.0, was introduced in the 118th Congress in 2023 and proposes raising the threshold age from 13 to 16.

Fortifying Age Assurance: How Can Facial ID Proofing Solutions Facilitate Compliance with These Acts?

The internet, a powerful medium for learning, education, communication, earning, and entertainment, can also have far-reaching consequences for the mental health of minors who access inappropriate content or fall victim to cyberbullying and online abusers. Traditional methods of verifying a user’s age, such as asking for a name, age, or date of birth, are effortlessly evaded by tech-savvy youth. This circumvention calls for advanced verification tools, such as facial age assurance, that accurately identify minors, enhancing accuracy and elevating online safety.

  • To effectively comply with these acts, social media platforms, online businesses, websites, and online content producers must deploy facial age verification, leveraging AI and machine learning models to estimate a user’s age from their facial attributes (a minimal sketch of this flow follows the list). 
  • Adopting self-sovereign identity (SSI), a decentralized approach to identity, gives users more control over their data, promotes seamless ID verification, and helps businesses comply with privacy regulations. SSI enables users to share only selected information, reducing the risk of data breaches and identity theft and easing the privacy concerns often raised when providing sensitive information for authentication.
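
To make this concrete, here is a minimal sketch of how a platform might gate age-restricted content behind a facial age estimate. The function estimate_age_from_image, the 18-year limit, and the safety buffer are illustrative assumptions, not requirements of KOSA, the OSA, or COPPA, and not any particular vendor’s API. The key point is that an estimate is probabilistic, so borderline results should escalate to a stronger check such as document-based verification or verified parental consent.

```python
# Illustrative age-gating sketch (hypothetical names, no specific vendor API).
# A facial age ESTIMATE is probabilistic, so this sketch applies a safety
# buffer and escalates borderline cases to a stronger verification step.

from dataclasses import dataclass

ADULT_AGE = 18        # assumed age limit for the restricted content
SAFETY_BUFFER = 3     # assumed margin to absorb estimation error

@dataclass
class AgeCheckResult:
    decision: str     # "allow", "deny", or "escalate"
    estimated_age: float

def estimate_age_from_image(selfie_bytes: bytes) -> float:
    """Placeholder for an ML age-estimation model or third-party service."""
    raise NotImplementedError("Plug in an age-estimation provider here.")

def check_access(selfie_bytes: bytes) -> AgeCheckResult:
    est = estimate_age_from_image(selfie_bytes)
    if est >= ADULT_AGE + SAFETY_BUFFER:
        return AgeCheckResult("allow", est)    # clearly above the limit
    if est < ADULT_AGE - SAFETY_BUFFER:
        return AgeCheckResult("deny", est)     # clearly a minor
    # Borderline estimate: escalate to document-based verification or
    # verified parental consent (as COPPA requires for users under 13).
    return AgeCheckResult("escalate", est)
```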

Businesses must confirm the accuracy and reliability of facial age estimation solutions before implementation to ensure effective compliance with these laws and make the internet safer for kids.

Empower compliance with kids’ privacy protection laws and regulations, and enhance online safety, by deploying Facia, an AI-powered facial recognition tool integrated with sophisticated age verification software that safeguards minors from looming threats. Facia verifies a user’s age by analyzing facial features and matching them against existing databases for accurate verification. It does not retain sensitive information longer than necessary, mitigating the risk of data breaches or violations of privacy rights.

Check Out More: Enhancing Texas alcohol laws through digital age verification, ensuring compliance and safety. Discover how technology is transforming regulatory practices in the alcohol industry.

Frequently Asked Questions

Why is age verification important for children’s online safety?

Accurate age verification is crucial to ensure that minors are restricted from accessing inappropriate content or purchasing age-restricted products. It also safeguards minors from the menace of online abusers seeking to exploit children, protects minors from targeted marketing, and ensures compliance with regulatory standards.

What is KOSA?

KOSA (Kids Online Safety Act) is a bill introduced by US senators to ensure the safety of children in the online world. The bill has yet to be passed into law; if enacted, it would require social media platforms to reduce online risks for minors by changing their website design and by letting users opt out of algorithm-based recommendation systems.

Who introduced the KOSA bill?

KOSA was introduced by Senators Richard Blumenthal and Marsha Blackburn in the US Senate on 2 May 2023.

What is OSA?

The UK’s Online Safety Act, which passed into law on October 26, 2023, puts a range of responsibilities on online service providers and social media platforms to protect kids and adults from the potential harms of inappropriate content and to make the internet a safe place for minors.

What is COPPA?

COPPA (Children’s Online Privacy Protection Act) is a US federal law, effective April 21, 2000, that requires websites and online content providers to obtain parental consent before collecting personal information from minors under 13, develop a transparent privacy policy, and protect children from targeted marketing.

How does age verification work under these laws?

These laws require online content producers, search services, and social media platforms to establish transparent privacy policies and to confirm the age of users with robust age verification solutions before granting them access to age-restricted services.