From interacting with friends to sharing news in real time, social media has many benefits. But it also poses serious harms for young people, including learning disruption, mental health decline, addiction, bullying, sexual exploitation, disordered eating, and privacy concerns. We sat down with leaders of Design It For Us, a youth-led coalition advocating for safer social media platforms, to learn how we can protect young people’s privacy, safety, and well-being.
Arielle Geismar, co-chair of Design It For Us, is a digital wellness advocate and organizer in mental health, technology ethics, and LGBTQIA+ rights. She was formerly president of George Washington University’s Student Association and has traveled the country advocating for laws that protect young people. Zamaan Qureshi, co-chair of Design It For Us, is an activist and advocate for safer social media for young people with a focus on congressional policy reforms and advancing mental health care access. Frances Haugen, advisory board member of Design It For Us, is an advocate for social media accountability and a former Facebook product manager. Alarmed by Facebook’s prioritization of profit over public safety, in 2021 Frances blew the whistle and exposed the company’s harmful practices, testifying before Congress and sparking a global conversation about social media accountability.
–EDITORS
EDITORS: What problems do you see with social media for youth?
ARIELLE GEISMAR: In adolescence, young people enter a culture of comparison through social media. The amount of social data collection they have been taught to do on themselves and each other—and the way that they’re encouraged to see metrics such as likes, follower count, or engagement as reflections of their worth—has really impacted their growth and mental health. It’s very difficult for young people to know that social media is not real life when that’s all they’re exposed to.
I got Instagram for the first time when I was about 11, and it introduced me to a whole world of pro-anorexia and eating disorder content that I don’t know if I would’ve been familiar with otherwise. The more time I spent with that content, which I didn’t understand, the more of it I was shown. It started to make its way into my psyche and had a significant impact on me. This points to a much larger problem: social media companies design products based on what makes them money. These metrics make people more addicted and more anxious. You’re comparing yourself more, so you want to spend more time on the app.
ZAMAAN QURESHI: I was 12 when I started using social media, and there were few to no guardrails and minimal understanding of the platforms we were using, who represented them, or what decisions were being made about the content we were served and the design features built into the apps. It’s only been recently that many in my generation have been able to reflect critically on the harms of social media. I had a conversation with a friend about addictive algorithms and design features; initially, this person said they didn’t have a story, but later they shared, “I did experience an eating disorder, and actually, in retrospect, it was kind of driven by social media.”
That was really illuminating because it got me to consider how people think about the harms that they’ve experienced and what perpetuated them. I know young people who have experienced exploitation on social media. The number of them who have been sent unsolicited intimate images or been contacted by a stranger is quite stark. Yet many never considered this a “harm”—they just accepted these experiences as the status quo because they’ve grown up with social media, and it’s the only thing they know. To me, that’s scary.
ARIELLE: If kids were routinely exposed on a playground to the same things they are exposed to on their phones, there would be national outrage. Indecent exposure in the presence of a child at, say, a park is a crime. Yet it’s become the norm for a young person to receive unsolicited contact from a stranger on their phone through social media. It’s not OK.
FRANCES HAUGEN: Troublingly, these harms are increasingly affecting younger and younger children. Approximately a third of the 12- and 13-year-olds I’ve spoken with began using social media at ages 8 or 9, which is significantly earlier than many people realize. This alarming trend aligns with data showing one-third of kids ages 7–9 nationwide are using social media apps.1 While this undeniably alters their experience of middle school or high school, we don’t yet know the long-term consequences of young children actively trying to go viral on social media and seeking validation through social media metrics.
When I was growing up 30 years ago, most kids did not worry about their personal brands the way they do now. We should be questioning the ways that we’re implicitly changing kids’ childhoods by exposing them to social media at younger and younger ages and asking them to externalize their source of identity and self-worth.
Zamaan and Arielle both mentioned the risks and harm of intimate image exposure or misuse, which has become far more pervasive and is exacerbated by deepfakes and AI. These incidents don’t just harm the victim; they can even traumatize students in otherwise healthy relationships by eroding trust. The consequences can be far-reaching. Consider cancel culture: a young person can mess up at any moment, without even knowing they were being filmed, and the ramifications can follow them for years. When I talk to parents about these problems, some of them say, “Well, I’m fortunate that my kids are doing OK.” But with so many kids having access to smartphones and the pervasive nature of social media, can parents really guarantee the safety and well-being of their children with confidence?
ARIELLE: In the fall of 2024, Design It For Us did a campaign around explicit AI-generated content. These nonconsensual images depict someone’s likeness in a variety of scenarios, often sexually explicit or otherwise inappropriate, all doctored by AI. All a perpetrator needs to create these images or videos is someone’s face and the free technology that exists. It’s so important that we’re having this conversation because, as we get older, there are going to be so many more instances of nonconsensual explicit digital content sharing—deepfake or not. These have the potential to haunt young people beginning their professional careers. I don’t think our culture is prepared for the ramifications of what happens when we can’t discern the difference between an AI-generated image and a real image. Some think it’s relatively harmless because it’s just AI; however, studies show that these images have very similar impacts on victims.2
EDITORS: Why isn’t it reasonable to mitigate these harms by making parents and guardians responsible for their children’s social media usage or having young people put down their phones?
ARIELLE: One problem is that algorithms not only determine the content you see on social media but also push it to you, impacting how we view ourselves and each other. The algorithm that pushed pro-anorexia content to my Instagram feed didn’t care whether I was viewing it because I was engaged or because I didn’t like it and was trying to understand it. Algorithms are designed to make a profit by keeping users on the app (and therefore seeing advertisements) for as long as possible and driving more traffic to the platform, no matter whether the content is positive or damaging for the user. Even if children put their phones down, it matters what’s on their screens when they pick them back up.
It’s completely backwards that tech companies often try to shift responsibility for their harms onto parents and guardians. They are using parents as shields to avoid accountability for the problem they created. Who has more time, knowledge, and capacity to remodel a harmful product—the parent of a young user or the creators of the product themselves? Furthermore, which of the two should be responsible for a harmful consumer product? I’ll give you a hint: it’s not the parent.
ZAMAAN: Absent facing any real accountability, tech companies have offered largely ineffective or incomplete solutions. For instance, Meta knows that more than two million users are on its platform who don’t meet the minimum age requirement, but they don’t meaningfully enforce the restriction—as outlined in the 2023 complaint against Meta brought by a multistate coalition of attorneys general.3 And while Instagram has touted its parental control tools, less than 10 percent of parents use them because there are significant barriers.4
FRANCES: In addition, confiscating phones or simply telling young people to just “put them down” is an unrealistic and potentially harmful approach. Young people can get in a loop where they self-soothe on content that makes them anxious and more likely to need to self-soothe. These devices have such power over kids that confiscating them can be a trigger for suicide.* It’s ridiculous and unfair to blame parents or to assume that kids lack the self-control to limit their screen time. The only safe, reasonable approach is for companies to stop pushing harmful content to youth.
EDITORS: Considering all the harms, what benefits of social media do you see?
ZAMAAN: Although you can’t ignore the harms and complete lack of guardrails in this space, social media is not all bad. One of the great benefits is that I met Design It For Us co-founder Emma Lembke on Twitter, and that was an excellent place to connect around our shared interest in responsible technology. Many young people have experienced opportunities for connection and community around common interests through social media. It has created new ways of engaging and is one way that we do our activism even today.
ARIELLE: My background is in student social justice organizing. When I was in high school, I used social media to have my peers walk out of high school to protest inaction on gun violence prevention policies. I have also organized on the platform GenZ Girl Gang, a community of Gen Z women and femmes who are sharing professional information about how to show up in this world unapologetically. I’ve used social media for influencer campaigns on anti-vaping, climate change, and mental health. A lot of the reach that we’ve achieved as a coalition is because of social media. It’s one way that we are able to get young people to not only care and post about issues online but also convert online caring into in-person action. Social media has helped us advocate for legislation, create coalitions, and engage in meaningful conversations online.
FRANCES: Yes, there are certainly those benefits. And yet we need to always be taking a step back and asking what we are trying to accomplish by having any given form of social media. Is it that we want to make new friends? Discover or learn more about a topic? Have spaces for expression? For a lot of these needs, there’s no intrinsic conflict between meeting them and keeping young people safe.
If you design systems proactively on the idea that children are not just small adults, you can make spaces that fulfill those needs while respecting the dignity and autonomy of kids. The problem is that these platforms are only accountable to their shareholders (and in the case of Instagram, one shareholder). As long as we live in a system with so little outside regulation, we cannot expect these companies to voluntarily implement safer practices, even when they know how to do so.
EDITORS: What can we do to end the harms of social media and ensure it promotes youth well-being?
FRANCES: If we want to see a genuine course correction and meaningful change to social media, we need to raise awareness of just how serious and significant its detrimental impacts are on young people today. It’s easy to trivialize these concerns when you don’t have a teenager at home or don’t interact with young people regularly. I tell people that almost no one working at Facebook on the engineering, product, or design teams—the people who actually touch the product every day—has a child over the age of 11. Not even Mark Zuckerberg. So their children are not being impacted by its harms (yet). If they visited schools and talked to teachers, they would be shocked to learn that one of the top disciplinary issues is social media. And if more adults in general would talk to teenagers, they’d be shocked to learn how bad things are—and then they would act.
ZAMAAN: We also need to build awareness for young people who are impacted by these harms and may not even realize it. Young people haven’t been given the vocabulary to talk about these problems in ways that help us understand our experiences and move forward with solutions. Arielle and I often find that people don’t realize the harms until their late teenage or early adult years because they never had someone with an outside perspective explain it to them. I think that’s something the movement to end social media’s harms can and should address. The AFT could be a great partner in this by helping to equip education professionals to introduce this vocabulary earlier, so that young people are more aware of the harms they could be experiencing, can talk about them productively, and can recognize what they’ve experienced and offer solutions within their individual and/or collective contexts.
FRANCES: As a society, we rarely discuss the problems or harms of social media openly. Without a counternarrative, young people may not even recognize that they need help. If all your peers are putting up with it and not talking about it, what’s wrong with you that you can’t deal with the way things are? I think it’s damaging to localize responsibility for that trauma on the victim.
ARIELLE: Schools can be great places to start conversations about social media harms and get help, provided that they have resources in place to support students. The week I graduated high school, I lost one of my best friends to suicide. It’s powerful when teachers can reduce some of the shame or stigma of asking for help by having these conversations and making themselves available to us as a trusted person to talk to. They can also provide resources so students know where to go when they need support.† Working with groups like Design It For Us gives young people scalable options to get involved in issues they’re passionate about.
FRANCES: There are a lot of different ways to help young people talk about their experiences online, and storytelling is such a powerful tool for driving change. Our stories have consequences that ripple out into the world, and the advertisers, litigators, and state legislators who can put pressure on these companies need to understand why the need for action is so great.
ZAMAAN: At Design It For Us, we have a dedicated space where people can share their stories in the way that’s best for them.‡ And, as Arielle said, educators have tremendous opportunities to gain young people’s trust and let them know that it’s OK to share experiences—whether it’s a personal story or the experience of a family member or a friend—because stories drive this movement forward.
We are using these stories to fight for policy changes that will hold tech companies and social media platforms accountable. Our work shows young people that they can be the agents of change and push for things they believe in—and it shows that we have a lot of power when we work together. We helped pass the Maryland Age-Appropriate Design Code5 in 2024 because of the groundswell of support from parents, educators, and young people who showed up and told their stories to lawmakers again and again. We beat the tech lobby, which tried hard to water down the bill and stop its passage. Once that bill takes full effect in 2026, it will provide safeguards for all teenagers and young people across Maryland. We’re working to replicate that model in all 50 states and develop national standards that technology companies and social media platforms will be required to comply with.
FRANCES: To create meaningful change, school communities need to make intentional choices about the social media platforms they use, demanding transparency and accountability. One of the big things I’m pushing for is mandatory scorecards for social media platforms; they would have 100 core metrics covering the 20 worst problems with social media. Getting access to just 20 metrics—or even 10—about these platforms would allow people to make more informed choices about which platforms align with their values. We could know, for instance, which platforms allow kids to be active on them between the hours of 9 p.m. and 6 a.m. or what percentage of teenage girls received an unwanted sexual communication in the last seven days (on Instagram, that number is one in eight6). Tech companies and platforms could publish these data if they wanted to—and if we came together to demand it. This simple intervention would have a huge impact. It would influence how companies build their products, how advertisers spend their dollars, and how people choose to spend their time.
We also need to advocate for a set of consumer rights for our digital age. We have some sense of the rights we need: to control personal data, to know when your camera is turned on, and to influence or even reset your algorithms so that you don’t continue to see harmful content. I’ve talked to therapists who say they have kids who are trying to do the right thing to manage an eating disorder, but when they go on Instagram, that content follows them. Should those kids have to choose between the friends they’ve interacted with on the app and the memories they’ve posted, and an algorithm that wants to pull them further down into physical, mental, and emotional distress? Why should they have to choose when they could instead reset the algorithm and go back to a baseline of innocuous content?
Imagine a world where kids can be on social media and only receive content that they explicitly ask for. When people aren’t presented with an alternative vision of what social media could be, they don’t view their current situation as unacceptable, and they don’t demand other options.
Another strategy to try is warning labels, which could be quite effective in changing the societal perception that social media is innocuous. They can also help change the conversation for parents about what is behind the significant mental health declines and other impacts of social media. A lot of parents are frustrated. They know they’re seeing changes in their kids’ well-being, but they don’t know who or what is to blame. Warning labels can be effective in communicating that these devices and platforms are meaningfully dangerous.
ZAMAAN: Accountability is also crucial. Instagram recently announced it was rolling out a feature to turn off notifications at night for young people.7 So if they’re going to propose these changes—and they’ve proposed similar changes in the past—we need to be able to see that they’re actually implementing them, and we need access to the data that show whether these changes truly improve the lives of young people.
ARIELLE: The good news is that young people are leading the way on these and other solutions. Historically, young people have been champions of social justice issues and have made incredible progress. Design It For Us is one example of that, but there are so many young people who are passionate. I encourage young people to get involved. I also encourage parents, guardians, and teachers to empower young people to think critically about these issues and to uplift their voices when young people come to them not only with problems, but also with solutions.
ZAMAAN: I think there’s a tendency to look at this issue and think about how bleak it is because of the iron grip these powerful companies have maintained on the status quo. But people have tremendous power when they organize, mobilize, and speak out about what they believe in. That’s how we’ve been able to make inroads in a short amount of time. Last year was the onset of state policy in this area. And more and more people are understanding the harms and deciding to do something about it, so we’re having more conversations about safer social media. We’ve come so far already, and together I think we will continue to see progress.
*If you or anyone you know needs support, call or text 988 for the Suicide and Crisis Lifeline. (return to article)
†Again, those in crisis can call or text 988 for the Suicide and Crisis Lifeline. Those trying to help others can also find guidance and support at 988lifeline.org/help-someone-else. (return to article)
‡To share a story of an experience with social media through Design It For Us, visit designitforus.org/stories. (return to article)
Endnotes
1. C.S. Mott Children’s Hospital, University of Michigan Health, “Mott Poll Report: Sharing Too Soon? Children and Social Media Apps,” National Poll on Children’s Health 39, no. 4 (October 18, 2021), mottpoll.org/reports/sharing-too-soon-children-and-social-media-apps.
2. P. Verma, “AI Fake Nudes Are Booming. It’s Ruining Real Teens’ Lives,” Washington Post, November 5, 2023, washingtonpost.com/technology/2023/11/05/ai-deepfake-porn-teens-women-impact; S. Maddocks, “Image-Based Abuse: A Threat to Privacy, Safety, and Speech,” MediaWell, March 15, 2023, mediawell.ssrc.org/research-reviews/image-based-abuse-a-threat-to-privacy-safety-and-speech; and R. Umbach et al., “Non-Consensual Synthetic Intimate Imagery: Prevalence, Attitudes, and Knowledge in 10 Countries,” Proceedings of the CHI Conference on Human Factors in Computing Systems, Honolulu, HI, May 2024, dl.acm.org/doi/fullHtml/10.1145/3613904.3642382.
3. People of the State of California v. Meta Platforms Inc., US District Court for the Northern District of California, October 24, 2023, ag.ny.gov/sites/default/files/court-filings/meta-multistate-complaint.pdf.
4. N. Nix, “Meta Says Its Parental Controls Protect Kids. But Hardly Anyone Uses Them,” Washington Post, January 30, 2024, washingtonpost.com/technology/2024/01/30/parental-controls-tiktok-instagram-use.
5. Maryland House Bill 603, Ch. 461 (2024), mgaleg.maryland.gov/2024RS/Chapters_noln/CH_461_hb0603t.pdf.
6. K. Chan, “Instagram Begins Blurring Nudity in Messages to Protect Teens and Fight Sexual Extortion,” Associated Press, April 11, 2024, apnews.com/article/instagram-meta-nudity-sexual-extortion-social-7bea9b1244ea023fb85265672bcd6560.
7. M. Isaac and N. Singer, “Instagram, Facing Pressure Over Child Safety Online, Unveils Sweeping Changes,” New York Times, September 17, 2024, nytimes.com/2024/09/17/technology/instagram-teens-safety-privacy-changes.html.
[Illustrations by Stephanie Shafer]