Don’t Judge a Bill by its Title: How the Kids Online Safety Act Fails to Keep Kids Safe Online
One of the oldest tricks in the book of politics is to name a bill something so noble that anyone who opposes it feels embarrassed. Case in point: the Kids Online Safety Act (KOSA) has been championed by its authors as the solution to the dangers that come with children using the internet, while the reality of the bill is much less hopeful.
Originally proposed in 2022, KOSA has been repeatedly rejected and amended over the years. This is because, despite its name, violations of privacy and rights are guaranteed, while the bill’s ability to actually protect children online is not. In an effort to move the bill forward, KOSA was offered as an (unrelated) amendment to the Federal Aviation Administration (FAA) Reauthorization Act of 2024. Although the Kids Online Safety Act was not included in the final version of that legislation, it’s worth examining the bill behind the name that just won’t go away.
Under KOSA, companies are implicitly obligated to verify users’ ages, a process that requires large amounts of personal data not just from kids, but from everyone, who must prove they are not minors. The wording of the most recent version of KOSA is careful not to explicitly mandate age verification; instead, it states that companies are liable for any harm minors experience as a result of content accessed on their platforms.
To avoid liability, companies will need to collect private information about their users—an extremely costly process that is invasive for all who use their services. Smaller online companies will be priced out of compliance, making large platforms that demand age verification the defining online experience. And it doesn’t stop with age verification: to maintain compliance, companies will have to continue collecting data on the activity of the users they have identified as minors.
Once users have been sorted into their respective age groups, KOSA compliance dictates that platforms follow a vague and broad “duty of care” standard wherein content deemed inappropriate for kids is blocked. The bill requires that a covered platform “shall take reasonable measures in the design and operation of any product, service, or feature that the covered platform knows is used by minors to prevent and mitigate the following harms to minors,” and then lists six areas:
- mental health disorders;
- addiction;
- physical violence, online bullying, and harassment;
- sexual exploitation and abuse;
- promotion and marketing of drugs, tobacco, gambling, and alcohol; and
- financial harms.
But who determines what does and does not fall into these broad categories, and what measures are “reasonable”? The legislation authorizes the Federal Trade Commission (FTC) to issue “guidance” to “assist” covered platforms in complying with the requirements of the bill, should it become law. How far will the FTC’s assistance go? If recent history is any indication, companies are in for more government help than they find helpful.
When the government asserts itself as the authority on what speech is appropriate, disarray follows. In its letter to Congress opposing KOSA, the Taxpayers Protection Alliance supplied one such example: “debates on the cause and treatment for complex issues such as eating disorders, substance abuse, or depression could be censored, resulting in those suffering and searching for assistance unable to research.” The “duty of care” is indeed a highly subjective standard that has the potential to stifle speech that is both constitutionally protected and helpful, including to minors.
KOSA promises big things, but it can’t live up to its promise to keep kids safe online. It’s nearly impossible to enforce, infringes upon the privacy of all users, and fails to strike a balance between censorship and age-appropriate content moderation. The bill’s recent failure as an amendment is an invitation for lawmakers concerned about children’s online safety to step back and evaluate what effective resources are available to those who actually can act to keep kids safe: their families.
A good start is encouraging parents to use the ever-growing selection of children’s online safety tools. These tools can empower parents and other caregivers, protect kids, and preserve the privacy and rights of platform users. Lawmakers, meanwhile, can prioritize legislation that combats the bad actors who use social media to prey on children. Families deserve tools and laws that live up to their names, not increased regulation that doesn’t address the core issues.