Social Media Regulations Will Not Help Parents
As COVID-19 recedes and we can interact physically again, many are concerned about the amount of time young people are spending online. These concerns are especially pronounced for young people spending time on social media. As a result, lawmakers and policy professionals have suggested a variety of proposals that would do everything from banning algorithms that recommend content to banning young people from social media. Not only are these proposals impractical, but they will fail to address the issues that concern parents and lawmakers.

Let’s begin with the most radical proposal: banning young people from social media. This immediately raises the question of what counts as social media. A recent California bill defines a social media platform as an online space where users can interact socially, create a profile, and create content that is viewable by others. It provides exemptions for services such as email, online video games, and companies that generate less than $100 million in revenue per year. So, while services like TikTok, Instagram, and Discord would be caught up in this legislation, Fortnite, message boards, and video streaming websites without profiles, where plenty of communication occurs, would not.

These arbitrary distinctions for social media platforms are unhelpful in addressing concerns about young people online. A recent survey from Common Sense Media found that video gaming is the second most popular media activity for tweens and teens online, behind watching TV or videos online. If the concern is that young people are online too much, ignoring these products seems unhelpful.

Likewise, simply banning some methods that young people use to communicate online will only lead to young people choosing other methods. Products to communicate online rise and fall faster than any legislature could regulate.

Less radical proposals, like the Minnesota law banning social media platforms from using algorithms to target content at users under 18 years old, also have their problems. Putting aside the practical failures of enforcement, as highlighted in the previous paragraphs, this proposal would do little to address concerns with online content and would just serve to make the platforms less family-friendly and less useful to users.

Algorithms are not some scary technology that needs to be banned. In fact, they are what makes the modern internet work. When searching for content online, algorithms help people find the content they are looking for. This is how general-purpose search engines like Google or DuckDuckGo work, as well as searches within individual websites or apps. The same recommendation systems also help us discover other content we might want to view online.

Without these tools, finding content you want to view online becomes impractical incredibly quickly. Algorithms also help young people view age-appropriate content. Those searching video platforms for their favorite children’s show are unlikely to get recommended a news story about the horrors of the war in Ukraine.

There is a reasonable concern that young people are spending too much time online rather than interacting with their friends and classmates in the real world. Indeed, such concerns date back generations, echoing earlier worries about radio, comic books, and television.

No doubt these concerns were exacerbated by the COVID-19 pandemic as young people were forced to interact online as their schools closed down. However, these concerns should not be used as an excuse to enact unworkable and unhelpful policies.

Young people will need digital skills to compete in the marketplace once they enter the workforce and make decisions about the kind of content they want or need. Learning what kind of online content is worthwhile, whether it’s reading an article, watching a video, or listening to a podcast, is an important 21st-century skill.

The government shouldn’t be creating prohibitions or changing how certain online platforms function. Instead, technology should empower parents to ensure that young people interact with age-appropriate content for an agreed-upon amount of time. Just as movie rating systems tell parents what is in a film, parental control tools should let parents see which apps their children are using on their smart devices. Thankfully, such tools are becoming more widely available and easier to use.

Education and innovation are the way forward to ensure a safe experience for young people online, not unworkable government bans.
