NEW YORK — New York Attorney General Letitia James recently released proposed rules on how social media companies should restrict addictive features on their platforms to protect children’s mental health, as required by the Stop Addictive Feeds Exploitation (SAFE) for Kids Act.
According to a press release from the NYAG office, the SAFE for Kids Act, sponsored by Senator Andrew Gounardes and Assemblymember Nily Rozic, and signed into law by Gov. Kathy Hochul, requires social media companies to restrict algorithmically personalized feeds, or addictive feeds, and nighttime notifications for users under the age of 18 unless parental consent is granted.
Addictive feeds and nighttime notifications are tied to depression, anxiety, eating and sleep disorders, and other mental health issues for children and teenagers. The proposed rules released today explain which companies must comply with the law and outline standards to determine users’ age and obtain parental consent. A public comment period on the proposed rules is open for 60 days.
“Children and teenagers are struggling with high rates of anxiety and depression because of addictive features on social media platforms,” James said in the release. “I am proud to have worked alongside Governor Hochul, Senator Gounardes, and Assemblymember Rozic to pass the nation’s strongest legislation to protect children from the dangers of social media.
“The proposed rules released by my office today will help us tackle the youth mental health crisis and make social media safer for kids and families. This is an issue that affects all of us, and I encourage parents, educators, young people, industry groups, and others to review the proposed rules and submit a comment during the public comment period.”
“I was proud to sign the nation’s leading legislation targeting addictive social media feeds, the SAFE for Kids Act, that protects New York’s young people from social media’s damaging effects,” Hochul added. “We know that kids are happier and healthier when they’re learning and growing, not clicking and scrolling. I thank Attorney General James and her team for their work on drafting the regulations for this critically important legislation.”
Algorithmically personalized feeds, or addictive feeds, recommend or personalize content for users in an endless stream based on data that the platform gathered about the user. They are a feature designed to encourage a user to continue to use and return to a platform. Content displayed in addictive feeds is often from accounts that a user does not follow and is often displayed out of chronological order.
Algorithmically personalized feeds are known to drive unhealthy levels of social media use in minors that can affect their mental health. Research shows that children as young as 10 to 14 years old experience addictive use of social media, and the more time children spend online, the more likely they are to experience negative mental health outcomes such as depression, anxiety, and eating and sleep disorders.
The SAFE for Kids Act addresses these mental health concerns for children by requiring social media companies to restrict addictive feeds for users under 18. Instead of the default algorithmically personalized feeds that keep young people on the platform, users under 18 will be shown content only from accounts they follow or otherwise select, presented in a set sequence such as chronological order, unless they get parental consent for an algorithmically personalized feed.
Users cannot be cut off from the platform simply because they don’t want or don’t have parental consent for an addictive feed. Instead, all users will still be able to access all of the same content they can access now.
The law also prohibits social media platforms from sending notifications to users under 18 from 12:00 a.m. to 6:00 a.m. without parental consent.
The SAFE for Kids Act authorizes the Office of the Attorney General (OAG) to promulgate rules on how companies should comply with the law before the statute goes into effect, including rules setting standards to determine a user’s age and obtain parental consent. Before drafting the proposed rules, OAG issued an advance notice of proposed rulemaking on August 1, 2024, and provided the public a 60-day period to submit comments.
The OAG reviewed all comments that were submitted and used the public’s input, industry research, and its significant experience to inform the proposed rules.
Age Assurance
• Before allowing a user to access algorithmic feeds and/or nighttime notifications, social media companies must confirm that the user is an adult. Companies may confirm a user’s age using a number of existing methods, as long as those methods are shown to be effective and protect users’ data. Options include:
— Requesting an uploaded image or video; or
— Verifying a user’s email address or phone number to cross-check other information that reflects a user’s age.
• Social media companies must offer at least one other alternative method for age assurance besides providing a government-issued ID.
• Any information used to determine age or obtain parental consent must not be used for any other purpose and must be deleted or de-identified immediately after its intended use.
• Young users who turn 18 must have an option to update their age status on the platform.
• Social media companies must choose an age assurance method with a high accuracy rate, conduct annual testing, and retain the results of the testing for a minimum of 10 years.
Parental Consent
• Social media companies must first receive a minor’s approval to request parental consent for algorithmic feeds and/or nighttime notifications. Once a minor approves, the platform may seek verifiable parental consent to allow a minor to access algorithmic feeds and/or nighttime notifications.
• The platform may not block the minor from generally accessing the platform or its content (through searches, for example) simply because the minor or their parent has declined to consent.
• The platform is not required to show parents the user’s search history or topics of interest to obtain parental consent.
• Parents and minors must also have the option to withdraw their consent at any time.
These proposed rules apply to companies that display user-generated content and have users who spend at least 20 percent of their time on the platform’s addictive feeds.
A public comment period on the proposed rules is open for 60 days, and the deadline to submit comments is Dec. 1. The OAG seeks comment on every aspect of the proposed rules, including personal experiences, research, technology standards, and industry information, together with examples, data, and analysis in support of any comment.
The OAG seeks comments from parents and other caretakers of children, young people, educators, members of academia, mental health professionals, consumer and child advocacy groups, privacy advocacy groups, industry participants, and other members of the public.
To submit a comment on the proposed rules, email ProtectNYKidsOnline@ag.ny.gov.
After the public comment period closes, OAG has one year to finalize the rules. Once the final rules are released, the SAFE for Kids Act goes into effect after 180 days.
For companies that violate the SAFE for Kids Act, the law authorizes OAG to bring an action to stop violations as well as to seek civil penalties of up to $5,000 per violation, among other remedies.
In October 2023, James, Hochul, Gounardes, and Rozic took action to protect children online by championing the SAFE for Kids Act and the New York Child Data Protection Act, which prohibits online platforms from collecting, using, sharing, or selling personal data of anyone under the age of 18, unless they receive informed consent or unless doing so is strictly necessary for the purpose of the online platform.
In June 2024, the SAFE for Kids Act and the New York Child Data Protection Act were signed into law. The Child Data Protection Act is in effect.