Commentary: Amid Trump's war on LGBTQ+ teens, social media platforms must step up

Claudia-Santi F. Fernandes, The Fulcrum


With Trump’s war on inclusion, life has suddenly become even more dangerous for LGBTQ+ youth. The CDC has removed health information for LGBTQ+ people from its website—including information about creating safe, supportive spaces. Meanwhile, Trump’s executive order, couched in hateful and inaccurate language, has stopped gender-affirming care.

Sadly, Meta’s decision in January to end fact-checking threatens to make social media even less safe for vulnerable teens. To stop the spread of misinformation, Meta and other social media platforms must commit to protecting young users.

Just a few months ago, Meta appeared to be taking a step in the right direction, launching its Teen Accounts with promises of safer online spaces. But the company’s recent decision to end fact-checking on its platforms threatens to undo all that progress—especially for teens who are already vulnerable. Among the most at risk are LGBTQ+ young people, whose safety and well-being are further endangered when harmful misinformation goes unchecked.

Adolescence is a time of self-discovery, and for many young people, that means exploring questions about their sexual identity. Imagine a teen scrolling through their social media feed, curious to learn more about interpersonal relationships and sexual identity, looking for answers in a place they perceive as safer than their home or school. But that space is anything but safe when untrue statements like “LGBTQ+ is a mental illness” spread unchecked.

These scientifically debunked statements aren’t just factual errors easily correctable by other online users—they are direct assaults on teens’ sense of self, as well as their mental health and well-being. Studies show that victimization, including anti-LGBTQ+ harassment, strongly predicts self-harm and suicidal thoughts and behaviors among LGBTQ+ young people. Young people may internalize these harmful ideas, leading to confusion, shame, or mental health struggles like anxiety, depression, or suicidal ideation. This false narrative not only stigmatizes LGBTQ+ young people and harms their mental health but also creates an environment where they may feel compelled to hide their identities or seek harmful treatments unsupported by evidence. Adults, including those who run tech companies, are responsible for creating safe and positive online experiences for young people.

Experts are already working on this issue. For example, the American Academy of Pediatrics—our country’s leading group of children’s doctors—studies healthy social media use through its Center of Excellence on Social Media and Youth Mental Health. Its co-directors, Dr. Megan Moreno and Dr. Jenny Radesky, specifically recommend platform policies that prevent the spread of untrustworthy and hateful content and more user control over settings, which are often buried.

At first, Meta seemed to be listening, instituting Teen Accounts with built-in features such as a sleep mode and limits on sensitive content. Even better, it planned to improve these features and include young people in the process. However, removing fact-checking from its platforms undermines these efforts, increasing teens’ exposure to inaccurate, misleading, or harmful information. This contradiction sends a troubling message: while Meta claims to prioritize the safety and well-being of young users, it simultaneously dismantles one of the key mechanisms ensuring information integrity.

To be sure, Mark Zuckerberg framed his decision as a defense of “free expression” and a move away from “too much censorship.” On the surface, this sounds like something teens would wholeheartedly embrace. But in fact, the elimination of fact-checking and the dismantling of safeguards for young users directly contradict what teens themselves deserve and desire. Young people, among the most active users of social media, consistently express a desire for safer online spaces. According to the Pew Research Center, the majority of teens prioritize feeling safe over being able to speak their minds freely; they also want enhanced safety features and content moderation. Both freedom of expression and enhanced safety features are crucial, but ensuring a safe and supportive online environment is essential to protecting teens’ well-being while fostering open dialogue.

 

When even teens call for more safeguards, adults—including those who run social media companies—have a moral obligation to respond. If Zuckerberg decides to scrap fact-checking safeguards in favor of “Community Notes,” we must ensure those strategies are evidence-based, expert-informed, youth-centered, and community-driven. According to research, social media companies must prioritize the following three approaches to ensure young people’s safety online:

Partnering with LGBTQ+ and other advocacy groups from marginalized communities to ensure that the information shared is truthful, accurate, and rooted in those communities’ lived experiences. For example, GLAAD recently released a report detailing harmful content on Meta’s platforms, including violent language toward LGBTQ+ individuals and severe anti-trans slurs, among other abuses. The report prompted GLAAD to pen a letter with specific calls to action for addressing misinformation. The recommendations are there. Work with them.

Investing in youth-centered approaches. As an example, researchers at the MIT Media Lab launched Scratch, an online community that teaches children coding and computer science, in 2007. They implemented a governance strategy to moderate content both proactively and reactively. Through youth-centered Community Guidelines and adult moderators, they address hate speech and remove it immediately. Appropriately trained moderators serve as essential gatekeepers, ensuring that platforms remain spaces for healthy dialogue rather than havens for toxicity.

Linking young people to evidence-based, culturally informed mental health resources at every opportunity. Young people are eager for online support (e.g., online therapy, apps, and social media) to manage their mental health, and they deserve access to accurate, safe, and affirming information—free from misinformation, exploitation, and harmful bias. Ensuring LGBTQ+ young people have access to mental health resources, especially to intervene early, is critical.

Zuckerberg framed the end of fact-checking as protecting free speech. Instead, he’s protecting hate speech and misinformation at the cost of young people’s well-being—the very thing Teen Accounts were meant to safeguard. If Zuckerberg is sincere about improving Meta’s products for young people, then Teen Accounts must be accountable—to the truth.

____

Claudia-Santi F. Fernandes, Ed.D., is an assistant clinical professor at the Yale Child Study Center. She is a Public Voices Fellow of The OpEd Project.


©2025 The Fulcrum. Visit at thefulcrum.us. Distributed by Tribune Content Agency, LLC.

 
