State Laws, Social Media Bans, and Youth: What Are We Doing?

There has been a flurry of activity related to new legislation intended to make social media and gaming platforms safer and more accountable for upholding expected standards of trust, security, transparency, and privacy. These laws are being proposed because of continued concern about the possible ill effects of popular platforms on the well-being of young people. While an objective look at the research base provides a complex picture of mixed findings related to the positives and negatives of social media use, many legislators realize that this is a topic of great concern to families across the United States and accordingly want to do something about it.

In late May 2023, U.S. Surgeon General Dr. Vivek Murthy issued an urgent call for action by all stakeholders to deeply understand the impact of social media on youth mental health. I think that is incredibly necessary because the possibilities of harm are myriad, and it is likely that they have a compounding effect. And that is our wheelhouse – exactly where we do most of our research, advocacy, and training in equipping schools, NGOs, and corporations to protect minors and build healthy online communities.

However, when I survey the landscape of the laws that are being proposed or passed around the nation, I am concerned that deep understanding has not taken place. I am concerned that politicians are not interfacing with online safety experts from a multitude of disciplines to gain a nuanced picture of the issues at hand. I am concerned that an antagonistic approach towards platforms will cause progress to sputter, and that what is needed is a cooperative partnership where goals can be achieved in as mutual a manner as possible. I am concerned that most lawmakers have a very shallow and incomplete appreciation not only of what the research base says (even Dr. Murthy acknowledges much uncertainty in the extant research findings), but also of the feasibility of what they suggest platforms should do.

Let’s talk about some state legislation that is built upon the cornerstone of age restrictions. For example, Utah would make social media platforms off-limits to children ages 15 and younger. Similarly, a bill introduced in Texas would ban anyone under the age of 18 from using social media. In Louisiana, those under 18 apparently cannot have access to any sort of social media platform without express parental approval. How exactly is this going to happen? How will this be enforced on a practical level? Shouldn’t there be conversation about the rights of a young person to exercise their freedom of speech and expression online? Is it possible that depriving them of access is a human rights violation, as has been articulated by a prestigious international committee organized by the United Nations?

Without collecting a great deal of personally identifiable information that is ripe for exploitation, I don’t understand how these limits will be enforced. Currently, the major platforms rely on the honor system and trust that the age a user inputs upon signup is truly their age. Even if each started to require photo or video selfies, the uploading of a government ID, or cooperation with a commercial age verification system, there exist methods to circumvent or bypass the gateway. Indeed, industry has tried for years to figure out age-verification solutions with the least amount of friction and the most user-friendliness, so that they actually catch on and do not deter use. Biometric factors such as using specific regions of human speech bandwidth can be easily circumvented using a recording, while face recognition requirements can be bypassed by using a photo of an adult. Fingerprint or iris verification requires specialized hardware. The privacy concerns associated with the collection of all of these biometric markers are also significant.
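To make the honor-system point concrete, here is a minimal sketch (in Python, with hypothetical names and a hypothetical minimum age – not any platform’s actual code) of what self-reported age gating amounts to today: a single arithmetic check on whatever birthdate the user types in.

```python
from datetime import date

MINIMUM_AGE = 13  # hypothetical threshold; several state bills would raise this to 16 or 18

def passes_age_gate(self_reported_birthdate: date) -> bool:
    """Return True if the *self-reported* birthdate meets the minimum age.

    Nothing here verifies that the birthdate is real: a child who simply
    types an earlier year clears the gate instantly, which is the core
    weakness of honor-system age checks.
    """
    today = date.today()
    age = today.year - self_reported_birthdate.year - (
        (today.month, today.day)
        < (self_reported_birthdate.month, self_reported_birthdate.day)
    )
    return age >= MINIMUM_AGE

# A 12-year-old who claims a birth year of 2000 sails right through:
print(passes_age_gate(date(2000, 1, 1)))  # True
```

Any law that hinges on age restrictions has to replace this check with something stronger, and every stronger option described above carries its own circumvention path or privacy cost.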

Additionally, some laws (like in Arkansas) do not apply equally to all platforms. Much of this seems arbitrary or even sinister. For instance, companies that are mostly about gaming are exempt (those that “exclusively offers interactive gaming, virtual gaming, or an online service, that allows the creation and uploading of content for the purpose of interactive gaming”). So are those that make less than 25% of their revenue from social media, and those that provide cloud storage (What? Why? So random). Notably, a co-sponsor of this bill specifically stated that the goal of the legislation is “to empower parents and protect kids from social media platforms, like Facebook, Instagram, TikTok, and Snapchat.” What is curious is that an amendment to the law was filed recently that excludes any “social media company that allows a user to generate short video clips of dancing, voice overs, or other acts of entertainment.” Wait a second: Facebook, Instagram, TikTok, and Snapchat each allow and encourage that exact type of content to be created. This doesn’t make sense. What – and whose – interests are being served here?

Another law from Utah seeks to prohibit a social media company from using a design or feature that “causes a minor to have an addiction to the company’s social media platform.” What does this even mean? How do we define “addiction”? My four-year-old keeps coming back to his Legos. Does not every toy manufacturer design products that induce a pleasurable neurobiological reaction in a child’s brain? What is the role of the user – even if they are a teenager – in developing personal agency and self-control, and in taking advantage of the screentime restrictions available within the app or even on their device to support that self-control? What is the role of parents and guardians in meaningfully shepherding and restricting (over)use much as they would restrict anything else? Should we then not also ban Netflix from contributing to binge-watching? Lay’s Potato Chips from betting that we cannot eat just one? Nike from making so many new versions of Air Jordans for the sneakerheads among us? What is the culpability of hardware manufacturers who sell wearables that keep us tethered to technology? Why aren’t these questions being asked when something as major as a law that affects tens of millions of people is being proposed?

Legislation from Utah also attempts to impose a social media curfew that blocks children’s online access from 10:30pm to 6:30am unless their parents adjust that range. Why do we need legislation for this? Can we not just ask parents to be parents? This is also not enforceable because of how easy it is to use proxies and VPNs, switch time zones within devices, and connect through networks outside of those being monitored. Furthermore, parents have a hard time even using the safety features and controls that device manufacturers and platforms already provide to safeguard their children, and now we are asking them to do yet one more thing? They are going to get tired of hearing their teen tell them he is not yet done with his homework at 10:30pm, and simply give up on this restriction.
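As a rough illustration (again, a hypothetical sketch rather than any platform’s actual implementation), a curfew check that trusts whatever time zone or clock the device reports is sidestepped simply by changing that setting:

```python
from datetime import datetime, time
from zoneinfo import ZoneInfo

# Hypothetical curfew window mirroring the Utah proposal (10:30pm to 6:30am)
CURFEW_START = time(22, 30)
CURFEW_END = time(6, 30)

def blocked_by_curfew(reported_timezone: str) -> bool:
    """Check the curfew against the time zone the device *claims* to be in.

    The window crosses midnight, so we test both sides of it. The flaw is
    the input: a teen who switches the device's time zone (or routes traffic
    through a VPN endpoint elsewhere) moves the window along with it.
    """
    now = datetime.now(ZoneInfo(reported_timezone)).time()
    return now >= CURFEW_START or now <= CURFEW_END

# At 11:00pm in New York, the same moment reads as only 8:00pm on a device set to Los Angeles:
print(blocked_by_curfew("America/New_York"))
print(blocked_by_curfew("America/Los_Angeles"))
```

The same weakness applies to any control that keys off a device’s local clock or an IP-derived location.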

While I know this is not patently true, it seems like many legislators got together one morning with coffee and donuts, rallied around some alarmist sentiments they heard somewhere, engaged in a good amount of gnashing of teeth and pearl clutching, decided they must demonstrate action to remain relevant to their constituency, and came up with some feel-good, one-size-fits-all solutions before the breakfast food ran out. Unfortunately, a careful understanding of the complex issues at hand – and the feasibility of application and enforcement of their proposals – remains glaringly missing.

One of my biggest concerns is as follows: What is done with the identification data once it is used to verify identity? The dustbin of history is littered with examples of privacy violations where major platforms have mishandled personal data that they have been entrusted with. This is to say nothing of violations by third-party entities, or of intentional hacks and other forms of inappropriate data tracking and harvesting. Some of these laws attempt to punish companies that collect information from their users that does not pertain to age verification of the account. Sanctions range from $100 per violation in Wisconsin to $5,000 per violation in Utah. How is this going to be proven and enforced? Oh, by the way, Utah’s bills also give parents full access to their children’s online accounts – including their private messages. If the goal is healthier families, better parent-child relationships, and thriving teenagers, I’m not sure we’ve put sufficient thought into this idea.

At this point, I’d like to draw your attention to a neat resource over at Tech Policy Press from Tim Bernard. In the spring of 2023, he created a spreadsheet that lists 144 different bills introduced across 43 states focused on protecting children from Internet-based harms. Many (but not all) seem like knee-jerk responses based on a poor understanding of numerous interrelated factors that must be considered when identifying and proposing solutions to the problems at hand. Of course, I am in favor of those that call for increased education as part of the curriculum, or which require anonymous reporting systems, or which champion the importance of building positive school climates and cultivating soft skills (like social and emotional learning approaches, restorative practices, digital citizenship, and media literacy).

As I close, what is the point of all of this legislation? Is it to encourage further conversation and partnerships among the major stakeholders to put their heads down and develop responsibilities and strategies for the various sectors they represent (including at home and in schools)? Is it lip service to tickle the ears of a morally panicked citizen base whose primary perspectives about most issues are sourced from sensationalistic news stories and clickbait articles? Is it to bring the hammer down on the 800-pound gorillas of industry because they are easy to scapegoat, and because everyone can agree they should do “more” but can’t present realistic solutions? Is it because we are not willing to look at ourselves in the mirror when it comes to the behaviors we ourselves model, the social environments we create offline, the level and quality of involvement we have in the lives of our youth, and the amount of effort it truly takes to support a kid these days? Finally, can anyone point to any research that demonstrates that these types of laws actually make a measurable difference in enhancing youth safety and well-being? Anywhere? In the world? Or are we just throwing gummy worms at the wall, hoping that at least one of them will stick?

Let me be clear: Platforms have to do more. Much more. We continue to pound the table for novel policies, programming, in-app safety features, educational initiatives, messaging campaigns, content moderation methods, and reporting protocols from social media and gaming companies. What is more, they are specifically requesting, heeding, and implementing some of our data-driven insights. But most of these pieces of governmental legislation are not helpful.  Indeed, very, very few are passing and becoming formally codified. Why is that? Largely, I think it is because many lack thoughtfulness and creativity, are lazily constructed (and involve a lot of copying and pasting of language from other proposed laws), and are not based on clear, consistent findings from empirical research. Many legislators are sadly wasting everyone’s time, the government’s resources, and our tax dollars. But most critically, they are failing to truly and meaningfully help the situation as our nation’s youth continue to struggle.

Featured Image: https://tinyurl.com/3va2dk9m
