In an increasingly rare moment of bipartisan agreement, a coalition of 62 senators and 280 House members has co-sponsored the Children's Online Safety and Privacy Act, legislation that would impose the strictest regulations ever enacted on how social media platforms interact with users under 16. The bill reflects growing alarm among parents, educators, and health professionals about the impact of social media on children's mental health.
The legislation requires social media companies to implement robust age verification systems, disable algorithmic recommendation engines for minor users by default, prohibit targeted advertising to children, provide parents with comprehensive dashboard controls, and conduct annual independent audits of their platforms' impact on child welfare. Platforms that fail to comply face fines of up to $50,000 per violation per affected child.
Mental Health Crisis
Momentum for the bill was galvanized by a series of congressional hearings featuring testimony from parents, mental health researchers, and former social media executives. The Surgeon General's advisory on social media and youth mental health, which characterized the current situation as a "public health emergency," provided scientific backing for legislative action.
"When the Surgeon General tells us that social media is driving an unprecedented mental health crisis among our children, and when tech executives admit their own children aren't allowed on these platforms, the case for action is irrefutable," said the bill's lead sponsor.
Technology companies have pushed back against several provisions, arguing that age verification requirements could compromise adult users' privacy and that disabling recommendation algorithms would fundamentally alter the user experience. Meta and TikTok have proposed alternative industry self-regulation frameworks, but lawmakers have dismissed these as insufficient. The bill is expected to receive a floor vote before the end of the session, with overwhelming passage considered likely in both chambers.