Today’s children are truly the digital generation. Kids and teens do a lot online, from completing homework to playing games to socializing with friends. Sites and apps have become a big part of kids’ lives. What’s missing, however, are rules ensuring the web is a safe and healthy place for our youngest citizens.
California currently has the chance to make a real impact on kids’ experience of the web. By passing the Age Appropriate Design Code Act (Asm. Buffy Wicks), California can require tech companies to consider the impact their online platforms will have on kids before kids log on, and to design their sites and apps with kids in mind.
Why we need the Age Appropriate Design Code Act
We have passed many rules and laws to keep kids safe in the offline world, from protections against health threats like lead paint and unsafe toys to safety standards for products like car seats. Yet despite the rising prominence of the web in kids’ lives for both education and recreation, we’ve failed for 20 years to update the rules that protect them in the digital spaces they inhabit.
The Internet has done a lot of good for the world and the lives of kids, but it hasn’t been without its harms – and sometimes very serious ones.
Many online platforms are designed to induce kids to log on more often and spend more time online. Features like autoplay and notifications prompting users to check platforms can contribute to longer screen time, leading to physical risks like eye problems or lost sleep during developmental periods when getting enough sleep is crucial.
Other online platforms have default settings that make kids share more information and data about themselves than is necessary for the service they’re getting. For instance, some share kids’ geolocation by default – information that can be dangerous in the wrong hands.
All of the above features – those that keep users on platforms longer, and nudge them to share more data – exist to support the Internet’s primary business model: hold our attention as long as possible, extract as much data about us as possible, sell it to as many companies as possible, and show us as many targeted ads as possible.
For kids, more time spent online means more exposure to ads, and advertising itself can be harmful. Young children take ad messages to be the unbiased truth, as they have yet to develop the critical thinking skills to understand that advertising doesn’t discuss the downsides of products. This means ads can shape a child’s current desires and future behavior. For instance, research has tied the targeting of kids with ads for junk food to the rising obesity crisis. Bombarding children with ads can also instill early on the idea that happiness and fulfillment are obtained only through material consumption, restricting kids’ emotional development.
Even apps designed explicitly for children are often configured to collect personal data sold to companies and used to create highly targeted ads. A study from earlier this year found that almost 90% of online educational tools are designed to collect and send information to advertising technology companies that could predict students’ interests and what they might want to buy.
The Internet’s business model and the tools it uses to shape our and our children’s behavior contribute to mental health difficulties, too. Rising anxiety and depression have been linked to social media use. Leaked reports of Facebook’s internal research on its app Instagram concluded it was making body image issues worse for 1 in 3 teen girls. In Britain, 13% of teenage girls struggling with mental health drew a direct line between Instagram and a desire to kill themselves.
We need to take these problems seriously and address them head on.
How the Age Appropriate Design Code Act would make the Internet a better place for kids
If passed, the bill would require companies to consider the impact online platforms may have on kids before rolling them out to the public. It would rein in the use of features that facilitate device addiction, and require privacy settings to be strong by default. The bill also includes a data minimization standard – prohibiting the collection, sale, retention or use of any personal information beyond what is needed to deliver the product or service kids and their parents are expecting to get.
Kids deserve to be safe, healthy and supported both offline and online. It’s beyond time to put the health and wellbeing of our children at the center of how the Internet is shaped.
Director, Don't Sell My Data Campaign, PIRG; Policy Analyst, Frontier Group
R.J. focuses on data privacy issues and the commercialization of personal data in the digital age. Her work ranges from consumer harms like scams and data breaches, to manipulative targeted advertising, to keeping kids safe online. In her work at Frontier Group, she has authored research reports on government transparency, predatory auto lending and consumer debt. Her work has appeared in WIRED magazine, CBS Mornings and USA Today, among other outlets. When she’s not protecting the public interest, she is an avid reader, fiction writer and birder.
State Director, CALPIRG
Jenn directs CALPIRG’s advocacy efforts, and is a leading voice in Sacramento and across the state on protecting public health, strengthening consumer protections and defending our democracy. Jenn served on the CALPIRG board for two years before stepping into her current role. Most recently, as the deputy national director for the Student PIRGs, she helped run our national effort to mobilize hundreds of thousands of students to vote. She led CALPIRG’s organizing team for years and managed our citizen outreach offices across the state, running campaigns to ban single-use plastic bags, stop the overuse of antibiotics, and move to 100% renewable energy. Jenn lives in Los Angeles, where she enjoys spending time at the beach and visiting the many amazing restaurants in her city.