A City Council member is proposing legislation that would force social media giants to implement a one-hour daily time limit on kids’ access to their accounts, unless a parent or guardian explicitly allows more.
Althea Stevens, who represents part of the Bronx and chairs the committee on children and youth, said she’s responding to growing concerns and research about the mental health effects of social media on kids.
“We really have to start looking at what it looks like to regulate [social media] and make sure that we have proper studies to go over how it’s affecting our kids,” said Stevens, who sponsored the bill.
The legislation would apply to children and teens under 17 years old. The bill would also prohibit social media companies from targeting ads to youth.
The proposed legislation joins a wave of efforts to regulate tech companies and the way young people interact with their products. New York Gov. Kathy Hochul recently required warning labels about social media for kids, saying the platforms are fueling a mental health crisis among youth, and banned smartphones in schools, saying the devices have led to a crisis of distraction.
A handful of states have gone further, either banning kids from having social media accounts outright or requiring more parental oversight. In 2024, New York City joined other school districts in a lawsuit against social media companies, saying the platforms are fueling a mental health crisis among youth.
Courts are evaluating the legality of these government-imposed limits. Tech companies have fought back against the regulations, citing existing protections and threats to free speech. The U.S. Supreme Court has ruled against some government efforts to restrict or remove certain content, even content considered false or misleading.
Globally, Australia and Indonesia banned kids under 16 from making or keeping social media accounts.
Justin Harrison, senior policy counsel at the New York Civil Liberties Union, said the City Council bill would be both unconstitutional and hard to enforce. He said cutting off kids’ access to social media after an hour would “restrict far too much protected speech.”
“There is certainly controversial or offensive speech on social media, certainly things that we might not want our children to see, but there’s also a lot of useful content on social media,” he said.
Harrison said if the bill becomes law, adults in the city would also likely have to prove their age to avoid the one-hour limit, which could create more unconstitutional obstacles to speech, especially for vulnerable groups that may not have easy access to identifying documents.
“They may lose access to social media after an hour, even though they have a fully protected constitutional right to receive it,” he said.
Stevens also recently introduced a separate bill that would require city agencies to study the effects of social media on young people and make recommendations for limits on age and hours of usage.
The council will review the bills as part of a hearing on social media and youth on April 21.
The hearing comes after a California jury found social media companies harmed a child with addictive feeds in a landmark case experts say may mirror the fights against Big Tobacco decades ago. The companies, Meta and Google, have said they’ll appeal.
Faced with increasing efforts to set limits on their products, the tech companies have underscored the protections they’ve already put in place.
Representatives for TikTok pointed Gothamist to a slew of existing restrictions that require users to be at least 13, bar direct messaging for users under 16, and set a default limit of an hour a day, which can be overridden with a passcode.
Representatives from Meta, which owns Instagram, and Alphabet, which owns Google and YouTube, did not respond to requests for comment, but their websites outline some of their safeguards.
Instagram offers teen accounts with some increased privacy protections, restrictions around certain “sensitive content” like fighting and cosmetic procedures, and time limit “reminders.”
YouTube offers limited account options for children, teens and preteens, and timers for parents. The company has said it is also using AI to identify young users and tailor content.
Stevens said she has been getting feedback from the tech companies and students and is open to revising the legislation.