Privacy & Security

What Teachers Need to Know About Changes to Instagram Teen Accounts

By Lauraine Langreo — September 20, 2024

Instagram is launching “teen accounts,” its latest effort to make the platform safer for its younger users amid rising concerns about how social media affects youth mental health.

Anyone under 18 who signs up for Instagram or already has an account will be placed into a teen account, which will be private by default and have restrictions on what kinds of content users can view, according to Meta, the parent company of the social media app.

The changes, announced Sept. 17, come as Meta faces multiple lawsuits from states and school districts claiming that the company knowingly ignored the negative impact of its platforms on young people’s mental health.

The announcement also arrives eight months after Meta said it was making it harder for teenagers to view content on its platforms that was related to self-harm, suicide, nudity, or eating disorders, even if it’s posted by someone they follow. The company at the time said those changes would be implemented on Instagram and Facebook within months.

Instagram teen accounts will have the strictest settings by default

Along with making teens’ accounts private by default (meaning teens must approve who can follow them and see their posts), the new Instagram changes will allow teens to receive messages only from people they follow or are already connected to, according to Meta. And “sensitive content,” such as videos of people fighting or posts promoting cosmetic procedures, will be limited, Meta said.

Teens will get notifications if they’re on Instagram for more than 60 minutes each day, and a “sleep mode” will be enabled that turns off notifications and sends auto-replies to direct messages from 10 p.m. to 7 a.m., Meta said.

These settings will be turned on for all teens. Sixteen- and 17-year-olds will be able to turn them off, while kids under 16 will need their parents’ permission to do so.


Meta acknowledged that teenagers may lie about their age and said it will require them to verify their ages in more instances, like if they try to create a new account with an adult birthday. Meta said it uses several ways to verify age: Teens can upload their ID, record a video selfie, or ask mutual friends to verify their age. The company also said it is building technology that proactively finds accounts of teens who pretend to be adults and automatically places them into the restricted teen accounts.

These changes “can be really beneficial,” said Amelia Vance, the president of the Public Interest Privacy Center, which advocates for effective, ethical, and equitable privacy safeguards for all children and students. “I’ve heard a lot of parents and teens express that they do want a level of security or oversight.”

Meta tries to balance parental controls with teen autonomy

Anjali Verma, the president of the National Student Council and a senior at a charter school in West Chester, Pa., is on board with private accounts for teens under 16. In fact, she had a private account until she turned 17.

“It’s really important that Instagram and Meta are taking the steps to be proactive about protecting teens online,” Anjali said.

But she’s skeptical of how effective these actions really will be. There’s still work Meta could do to curb “the most addictive” parts of the app, such as endless scrolling and videos that pop up and immediately begin playing one after another, Anjali said.

Anjali said she’s also unsure about whether it’s a good idea to have parents tied to their teens’ accounts.

“I don’t think all teens necessarily have the best relationships with their parents,” she said. For some teens, social media is their outlet to express what they might not be comfortable sharing with their parents or guardians.

Balancing parental control with giving teens autonomy as they learn and grow is something social media companies and any regulations need to be careful about, Vance said.


Yvonne Johnson, the president of the National Parent Teacher Association, applauded the changes to Instagram in a statement in Meta’s press release, saying that these steps “empower parents and deliver safer, more age-appropriate experiences on the platform.”

Not all parents think Meta’s changes are enough, however.

“This is nowhere near sufficient to address the deep concerns parents and families have about social media,” said Keri Rodrigues, the president of the National Parents Union. “Teens will always find a way around these things. Kids have been doing this for a very long time, and parents do not trust that this is going to be sufficient.”

What Congress is doing about children’s online safety

It’s also not enough to trust that Meta and other social media companies will self-regulate to make sure people are safe on their platforms, Rodrigues said. There need to be laws in place that hold companies accountable for what’s happening on their platforms, she said.

Congress has been considering two bills related to children’s online safety: the Kids Online Safety Act (KOSA), which would require social media companies to take reasonable steps to prevent and mitigate harms to children, and the Children and Teens’ Online Privacy Protection Act (also known as COPPA 2.0), which would update online data-privacy rules.

The Senate in late July passed the Kids Online Safety and Privacy Act, which combines KOSA and COPPA 2.0. The House has yet to vote on its versions of KOSA and COPPA 2.0.

Even with these changes from Meta and possible social media regulations, experts say it’s still important to teach kids how to navigate the digital world and ensure that they have the skills they need to keep themselves safe.

“It’s the kids who work around [those changes] that I would be worried about,” said Beth Houf, the principal of Capital City High School in Jefferson City, Mo. “So how are we continuing to bring the education lens to this?”

