A Parent’s Call for Accountability: The Push for Stronger AI and Social Media Regulations
As artificial intelligence (AI) and social media platforms become ever more ingrained in daily life, a growing chorus of voices is demanding urgent action to protect vulnerable populations—particularly children and teenagers—from the potentially devastating effects of unregulated online content. At the forefront of this advocacy is Jennie Deserio, a founding member of Parents for Safe Online Spaces (Parents SOS), whose personal tragedy has catalyzed a movement for stronger AI and social media safeguards. Her story, recently featured on the Fox News Rundown Extra, highlights not only the emotional toll of social media but also the complex, ongoing policy struggle between state and federal authorities.
The Policy Landscape: State vs. Federal Regulation
The legislative battle over AI regulation intensified during the recent congressional debate on President Trump’s expansive technology and commerce bill, colloquially dubbed the “Big Beautiful Bill.” The House version of the bill included a provision that would have placed a moratorium on state-level AI lawmaking, thereby consolidating regulatory power at the federal level. However, a Senate amendment ultimately stripped this section, allowing states to continue enacting and enforcing their own rules concerning AI and content moderation.
The removal of the federal moratorium was met with relief by advocacy groups like Parents SOS, which argue that state-led initiatives are crucial for experimenting with and refining online safety strategies—especially when federal action lags. According to data from the National Conference of State Legislatures, as of mid-2024 at least 35 U.S. states had introduced or passed bills addressing AI and digital safety issues, including age-appropriate design codes, increased accountability for tech companies, and stronger digital privacy protections.
A Parent’s Story: From Grief to Advocacy
Jennie Deserio’s advocacy is deeply personal. In a moving interview with Lisa Brady of Fox News Rundown, Deserio recounts the loss of her 16-year-old son, Mason, to suicide—an act she attributes in part to the toxic and harmful content he encountered on social media platforms powered by AI-driven algorithms. “These algorithms are designed to keep kids engaged, but too often that means bombarding them with content that is far from safe,” Deserio explains.
Multiple studies echo Deserio’s concerns. According to a 2023 Centers for Disease Control and Prevention (CDC) report, suicide remains the second leading cause of death among Americans aged 10-24, and mental health experts warn that algorithmically curated online content can exacerbate feelings of isolation, anxiety, and depression in teens. A recent Pew Research Center study revealed that nearly 70% of U.S. teens said social media platforms make them feel anxious or stressed.
The Growing Movement for Reform
Frustrated by what many perceive as congressional inaction, Parents SOS and similar organizations are doubling down on efforts to drive change at the state level. Several high-profile states, including California, Utah, and New Jersey, have recently enacted laws strengthening online safety for minors. California’s Age-Appropriate Design Code Act, for instance, sets stricter privacy and safety requirements for companies offering online services likely to be accessed by children, though its enforcement has been delayed by ongoing legal challenges from the tech industry.
Meanwhile, in Washington, bipartisan pressure is mounting. Lawmakers such as Senator Richard Blumenthal (D-CT) and Senator Josh Hawley (R-MO) have convened hearings and proposed bills that would require social media companies to curb addictive product features, increase transparency into their recommendation algorithms, and create clearer channels for parental oversight. However, technology lobbyists and some federal agencies warn of the risks of a fragmented patchwork of regulations that might stifle innovation or inadvertently create loopholes.
“If Congress won’t act, states will—and should,” says Deserio, emphasizing that children’s safety cannot wait while legislative debates drag on. Advocacy remains focused on pressuring both statehouses and Congress to institute national minimum standards, while allowing states the flexibility to go further if desired.
Tech Industry Response and Accountability
Facing increased scrutiny, major social media companies, including Meta (Facebook, Instagram), Snap, and TikTok, have announced new measures to protect minors—such as defaulting accounts for users under 18 to private, restricting direct messaging, and introducing AI filters for potentially harmful content. But critics argue these changes are often cosmetic and do not address fundamental algorithmic incentives that continue to prioritize engagement over safety.
AI developers and platforms are also coming under pressure to conduct and publish impact assessments on their products, particularly regarding how recommendation engines affect mental health. In Europe, the EU’s Digital Services Act, which became fully applicable in 2024, is seen as a possible model: it requires large tech companies to identify and mitigate systemic risks to users—including children.
A Call for Comprehensive Action
As digital technology continues to evolve, the case of Mason Deserio and many other young lives underscores the urgent need for robust regulation and oversight. Parents, educators, and policymakers increasingly call on Congress to enact comprehensive laws that balance innovation with ethical responsibility, ensuring AI and social media platforms serve the public good first and foremost.
Until national legislation is passed, expect more state-led initiatives and a patchwork of approaches as families demand that the tech industry and policymakers take greater responsibility for protecting children online.
“It’s about accountability and prevention,” Deserio insists. “No parent should have to experience what we’ve gone through—Congress, the states, and Silicon Valley all have a part to play.”
The future of AI and online safety will likely depend on ongoing public pressure and the courage of those willing to turn personal loss into determined advocacy.